حنیف شیخ عبدالکریم-Hanif sheikhabdolkarim
      Information Technology and Management -- ICT (ICT - Management - High Tech)
Microsoft Technology for BPM by: Hanif Sheikhabdolkarim
On This Page

Introduction
A new paradigm
Microsoft Technology for BPM
Understanding BPEL
Business Rules in Business Processes
Conclusion

Introduction

Business processes are dependent and ordered activities that result in predictable and repeatable outcomes. Consisting of an organization’s operating procedures, institutional working knowledge, and information resources, business processes are designed to satisfy defined business objectives in an efficient and timely manner. In an efficient environment, the functional components of a process can be readily identified, adapted, and deployed to address ever-changing corporate requirements, a capability termed business agility. By definition, business agility is an organization’s systemic ability to fluidly marshal and reconfigure resources in response to business requirements and opportunities. Business Process Management (BPM) tools are designed to provide such agility by facilitating the creation and execution of highly transparent and modular process-oriented workflows that meet the operational performance standards IT organizations demand.

Automated business processes developed and executed within such an environment are characterized by the following attributes:

  • Visibility of end-to-end process activities

  • Process components and functionality that are exposed and self-describing

  • Ability to integrate disparate information sources and application functionality into a process

  • Information flow and event notification that can be automated and monitored throughout a process

  • Workflow participation that makes the most of desktop productivity and communication tools

  • Service level agreements that can be specified, monitored, and enforced for activities in a process

  • Ability to add, remove, or reconfigure any process activity or component, without disrupting the process

  • Processes that can be monitored in real time or near real time

  • Process designs that can accommodate any exception handling requirement

  • Processes that can be easily replicated, extended, and scaled

With the support of XML and Web Services, BPM systems are transforming the way in which IT organizations implement and execute workflow components. XML applies structure to information, freeing it from any functional dependency on the software that operates on it. Web Services, on the other hand, provide the framework for application-to-application messaging and invocation over an unbounded network. BPM tools provide the additional support infrastructure needed to harness these capabilities across the entire scope of workflow management, enterprise application integration (EAI), and trading partner integration (TPI).
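The claim that XML frees information from the software that produced it can be made concrete with a small sketch. The Python fragment below (Python is used here purely for illustration; the document layout and element names are invented) extracts data from a purchase order using only the structure the document itself carries:

```python
import xml.etree.ElementTree as ET

# A purchase order expressed as XML: structure and meaning travel
# with the data, independent of whatever software produced it.
order_xml = """
<PurchaseOrder>
  <OrderID>1001</OrderID>
  <Customer>Contoso Ltd.</Customer>
  <Item sku="A-42" quantity="3" unitPrice="19.95"/>
</PurchaseOrder>
"""

root = ET.fromstring(order_xml)
order_id = root.findtext("OrderID")
item = root.find("Item")
total = int(item.get("quantity")) * float(item.get("unitPrice"))
```

Any XML-aware consumer that can read this structure can recover the order number and compute the line total, with no dependency on the application that produced the file.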

This document examines how Microsoft’s tools for Business Process Management and supporting technologies facilitate the creation of processes that share the characteristics defined previously. The paper also describes how XML and Web Services are deployed within a BPM solution to achieve unprecedented modularity and extensibility. Finally, the document illuminates the gains in development and operational productivity that BPM technology engenders, which in turn enables real-time business agility.


A new paradigm

Just as standards-based Web servers and browsers facilitated the communication and distribution of information between people, BPM tools that employ standards-based XML and Web Services technologies will facilitate the wide-scale proliferation of automated and distributed business processes.

A defining characteristic of BPM technology is the elevation of design and development functions from the program layer to the information (document) and transport (messaging) layer. An application is no longer an opaque data-centric procedural construct; it is a messaging event or messaging agent capable of processing the exposed declarative properties of rich (XML) documents. Workflow processes, integration scenarios, or trading partner interactions consist of an orchestrated flow of messages that are routed, transformed, and processed according to message content, formatting requirements, and business rules.
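Routing messages according to their content can be sketched in a few lines. The following Python function is an illustrative stand-in, not BizTalk's actual mechanism; the destination names and the threshold rule are invented:

```python
import xml.etree.ElementTree as ET

def route(message_xml):
    """Choose a destination based on the message's exposed content,
    not on any knowledge baked into the sending application."""
    doc = ET.fromstring(message_xml)
    amount = float(doc.findtext("Amount", default="0"))
    if doc.tag == "Invoice":
        return "accounts-payable"
    # Illustrative business rule: large orders need manual approval.
    if doc.tag == "Order" and amount > 10000:
        return "approval-queue"
    return "fulfillment"
```

Because the routing decision reads only the exposed content of the message, the rule can be changed without touching any sender or receiver.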

Transparency and modularity are also defining characteristics of BPM technology. Not only are documents and messages exposed, self-describing, and extensible, but so are the communicating endpoint definitions, services, business rules, transformation maps, and process execution instructions that exchange and act upon the messages and documents. A process component can be “loosely coupled” with any other component so that a modification made to one activity or component in a process does not necessitate changes to other activities or components. Each component is functionally independent of any other component.


Microsoft Technology for BPM

Business Process Management technology represents a major conceptual reorientation of the methodologies of workflow development and deployment for business processes. As with any paradigm shift, it must result in significant benefits to be justified. Substantial evidence already indicates that users of BPM technology are achieving dramatic development efficiencies, accelerated returns on their investment, and most importantly, reduced resource requirements. Although XML, Web Services, and BPM platforms impose a new conceptual model on business process development and execution, the technologies required to do so are proven Microsoft® products that have been augmented to support this new paradigm. This section describes the core and supporting component technologies that constitute Microsoft’s process development and execution suite.

BizTalk® Server is Microsoft’s central platform for Enterprise Application Integration (EAI) and BPM and embodies the integration and automation capabilities of XML and Web Services technologies. BizTalk Server functions as a process execution engine and as a multitransport hub for messaging and document transformations.

Visual Studio® .NET is Microsoft’s flagship integrated development environment. The Orchestration Designer module found in previous versions of BizTalk Server is now an integral part of Visual Studio .NET with significantly more functionality. It is a visual development tool for building sophisticated workflows and processes that incorporate business rules, events, transactions, and exceptions and for linking these elements to implementation objects and messaging events. The assembled process generates an XML-based run-time script (BPEL) of the process that is executed in BizTalk Server.

SQL Server is tightly coupled with BizTalk Server and functions as its real-time data store for document tracking information and dehydrated instances of long-running processes.

Office 2003 redefines the functional concept and capabilities of Word and Excel by making XML the native file format. The applications can now behave like network clients, in the manner of a Web browser or e-mail client, but are capable of far more sophisticated and automated interactions with any source of XML information. Furthermore, in Office 2003, Microsoft also introduces InfoPath, an XML-based form application designed to address complex workflow documentation requirements. Because these applications can generate and decode XML documents with their respective schema definitions and processing instructions, they can directly engage in event-level interactions with BizTalk Server.

Active Directory® provides automated and federated authentication and authorization facilities as part of the process development and execution architecture. Its authentication and authorization capabilities facilitate sophisticated workflow processes that involve multiple participants, applications, documentation flows, and information sources. Active Directory also defines role-based participation attributes for workflow activities.

Host Integration Server and BizTalk Server Adapters facilitate integration with enterprise applications, legacy networking and transport protocols, and numerous data formats.

Microsoft Operations Manager and Application Center provide data center class system tools for building, monitoring, and scaling high-performance, mission-critical deployments of BizTalk Server and its supporting technologies.

Microsoft is at the forefront of XML, Web Services, and Business Process Management development and is committed to the implementation of these enabling technologies throughout its products. Nowhere are the potential capabilities of XML and Web Services more evident and maximized than within Microsoft’s integration, development, and productivity technologies. The core XML and Web Services capabilities found in the new releases of BizTalk Server, Visual Studio .NET, Visio, and Microsoft Office 2003 demonstrate a coherent vision for distributing EAI and BPA development and deployment activities, both along functional lines and among stakeholders. The details of this vision become apparent through an examination of the new features and functions of the component technologies described previously.

BizTalk Server 2004

BizTalk Server is the central product of the Microsoft Enterprise Application Integration (EAI) and Business Process Management (BPM) toolset and embodies the application integration and process automation capabilities of XML and Web Services technologies. BizTalk Server has two core functions:

  • A process execution engine that manages the steps, applies the business logic, and invokes the supporting applications of a complex process and/or transaction set.

  • A multitransport messaging hub for transforming and routing information to and from applications and process steps.

The release of BizTalk Server 2004 represents Microsoft’s third generation of BPM technology. The initial introduction of BizTalk Server 2000 demonstrated Microsoft’s early leadership in defining BPM functionality and supporting XML. BizTalk Server 2002 provided feature set refinements and performance enhancements. The new version of BizTalk Server is a major upgrade that incorporates the recommendations of thousands of users. BizTalk Server 2004 offers many new features and has been reengineered to provide substantial upgrades including improved performance and monitoring of process execution, robust Visual Studio .NET integration for programmatic control, and enhanced workflow modeling capabilities.

Central to the application integration and process automation capabilities of BizTalk Server and Microsoft Office is Microsoft's adoption of the W3C XML Schema Recommendation. XML Schema is a specification that formally defines an extensive array of data type primitives and structural components for creating XML documents; it serves as a dictionary of abstract elements, attribute entities, and organizational rules. Creating XML documents that conform to a schema delivers a significant advantage: the meaning, function, and application of the document’s content is comprehensible to, and operable by, any XML-enabled application that can access the document’s underlying schema.

Industry-specific initiatives to develop a common vocabulary and set of procedures for the exchange and processing of information are generally based on XML Schema. Because BizTalk Server uses XML Schema internally, it can import any XML schema definition natively, without translation. This significantly reduces the time and effort required to facilitate sophisticated, interactive business scenarios, particularly those that involve the exchange of multiple document types. For example, consider the emergence of generic business-to-business (B2B) framework specifications for standardizing document content and data exchange procedures and behavior. Each initiative specification typically identifies hundreds of document types and exchange scenarios, all of which are defined using XML Schema. By supporting XML Schema natively, the BizTalk Server messaging infrastructure and process execution engine can effectively support any such B2B framework initiative.
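What schema conformance buys can be illustrated with a toy validator. A real deployment would use a full XSD processor; in the Python sketch below, a simple mapping stands in for a schema, declaring which elements a conforming order must contain and what primitive type each must parse as (all names are hypothetical):

```python
import xml.etree.ElementTree as ET

# Toy stand-in for an XML Schema: each required element name maps to
# the primitive type its text content must satisfy.
ORDER_SCHEMA = {"OrderID": int, "Quantity": int, "UnitPrice": float}

def conforms(xml_text, schema):
    root = ET.fromstring(xml_text)
    for name, primitive in schema.items():
        node = root.find(name)
        if node is None:
            return False          # required element missing
        try:
            primitive(node.text)  # text must parse as the declared type
        except (TypeError, ValueError):
            return False
    return True
```

A consumer that checks conformance up front can process any document from any producer that honors the same schema, which is exactly the contract a shared XSD establishes.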

Office 2003

XML Schema also figures prominently in the release of Microsoft Office 2003 as Word and Excel adopt XML as a native document format using an XML schema definition file. Also in Office 2003, Microsoft introduces InfoPath™, an XML-based form tool designed to propagate automated workflow capabilities throughout an organization. InfoPath allows workflow participants who create new information or perform analytical or content collection functions to generate, interact with, and exchange structured information. Typically, these activities are paper-bound or use a digital representation of paper. A form created in a word processing or spreadsheet program can be filled out easily enough, but the information that is entered in the form cannot be understood or processed without programmatic or manual intervention. Generating, conveying, extracting, manipulating, and reorganizing unstructured information in and from these formats is extremely labor intensive, inefficient, and costly.

InfoPath addresses this problem by creating smart forms that generate structured XML information, including processing instruction metadata. Underlying an InfoPath form is a template that incorporates one or more XML schemas, XSLT style sheets, embedded controls, and business logic instruction sets. The template controls the behavior of a form in the following ways:

  • Generates multiple form views of the same information

  • Constrains data types and values that can be entered in a form

  • Defines and controls the dependencies for entering information

  • Generates automatic, derived, and computed values

  • Invokes events, prompts, and instructions based on dependencies

  • Provides access to remote information sources

  • Enables the incorporation of digital signatures

Form templates are created in a WYSIWYG design tool and require no procedural programming, predefined XML schemas, or XSLT style sheets. The schema and processing instructions are implicitly defined as the form template, which is represented in XML, is constructed from a palette of drag-and-drop controls, wizards, and dialog boxes. When an InfoPath form is populated, it generates an XML document that contains the entered and derived information, required metadata, processing instructions, and digital signatures of the participants who have accessed the form. The document also includes references to the template schemas and XSLT files required by other XML-enabled applications.
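The essential behavior of such a form, combining user-entered values with derived ones into a single structured document, can be sketched as follows. This Python fragment only mimics the idea; InfoPath's actual template format and schema machinery are far richer, and all names here are invented:

```python
import xml.etree.ElementTree as ET

def fill_expense_form(employee, amounts):
    """Mimic what a smart form does: merge the entered values with
    derived ones (here, a computed total) into one XML document."""
    form = ET.Element("ExpenseReport")
    ET.SubElement(form, "Employee").text = employee
    items = ET.SubElement(form, "Items")
    for amount in amounts:
        ET.SubElement(items, "Amount").text = str(amount)
    # Derived value: the form computes the total; the user never types it.
    ET.SubElement(form, "Total").text = str(sum(amounts))
    return ET.tostring(form, encoding="unicode")

doc = fill_expense_form("B. Smith", [12.5, 30.0])
```

The resulting document carries both the entered data and the derived total as structured XML, ready for any downstream XML-enabled consumer.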

Because Microsoft Office applications can automatically recognize and process the information and metadata contained in a document, they are capable of engaging in event-level interactions. These “content-responsive” applications have the ability to facilitate automated workflow processes that take place between applications, and between applications and participants. This is the same fundamental premise of Web Services, and it redefines the functional concept and capabilities of these applications. They now behave as network clients, in the manner of a Web browser or e-mail client, but are capable of engaging in sophisticated and automated interactions with any source of XML information. Participants using these tools still engage in their workflow functions, but much more efficiently because of the elimination of manual processing tasks that are irrelevant to the effective execution of those functions.

Exchanging Information

BizTalk Server provides extensive transport and messaging services such as receive and send locations, adapters, pipelines and publish and subscribe distribution capabilities through a messagebox data store—all of which support content-based routing and processing. These facilities combined with the BizTalk Server document tracking and process monitoring capabilities, provide an overlay infrastructure for managing workflow processes. With BizTalk Server as the messaging and process hub and Office 2003 applications as XML-processing clients, participant involvement in workflows can be orchestrated, monitored, and qualified for reliability and performance metrics. This has the potential to radically alter the dynamics and efficiencies of workflow processing on an enterprise-wide basis.
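The publish-and-subscribe pattern at the heart of such a messagebox can be reduced to a few lines. The sketch below is a conceptual Python analogue in which subscriptions are content predicates; it says nothing about how BizTalk actually stores or evaluates subscriptions:

```python
import xml.etree.ElementTree as ET

class MessageBox:
    """Minimal publish/subscribe hub: subscribers register a content
    predicate, and every published document is delivered to each
    subscriber whose predicate matches it."""
    def __init__(self):
        self.subscriptions = []   # (predicate, inbox) pairs

    def subscribe(self, predicate):
        inbox = []
        self.subscriptions.append((predicate, inbox))
        return inbox

    def publish(self, xml_text):
        doc = ET.fromstring(xml_text)
        for predicate, inbox in self.subscriptions:
            if predicate(doc):
                inbox.append(xml_text)

box = MessageBox()
invoices = box.subscribe(lambda d: d.tag == "Invoice")
big = box.subscribe(lambda d: float(d.findtext("Amount", "0")) > 1000)
box.publish("<Invoice><Amount>5000</Amount></Invoice>")
```

Publishers and subscribers never reference each other directly; the hub matches documents to interests, which is what makes adding or removing a participant a local change.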

The methodologies and technologies of process design, implementation, and execution are also undergoing a paradigm shift. As such, Microsoft is introducing key innovations in its BPM tools that will significantly improve process development and deployment. One of these innovations is the introduction of a set of high-level process-design and implementation tools that correspond to the roles of the participants involved in process development. These tools make it possible to graphically construct the business logic model of a process, link the steps in the model to actual implementation agents and components, and then generate an executable run-time instruction set of the finished process model in XML.

Roles of Business Analyst and Developer

Automating business processes is a collaborative activity that takes place between line-of-business professionals and programmers. Because each discipline has its own language and development issues, a communication and procedural divide separates their understanding of the development objectives. Consequently, software development is characterized by recursive revision cycles and ambiguity. Its methodology requires interpreting the apparent objective of a specification and translating it into a highly abstract form. While development notation systems such as UML provide business analysts with a structured approach for documenting use cases and specifications, programmers still have to interpret the documentation and translate its intent into a different language and format.

To take advantage of a business process paradigm that is exposed, loosely coupled, and document-driven, development tools and methodologies that incorporate these concepts are required. In creating these tools, Microsoft offers an alternative methodology for developing process-oriented applications. This methodology eliminates the inefficient interpretation and translation cycles that presently characterize application development.

Microsoft Visio remains the ideal tool for the business analyst. Designed for line-of-business professionals, Visio enables users to create diagrams of a business process by arranging, ordering, labeling, and linking various symbols that represent activities, events, decisions, flow, and transactions. When a process diagram is completed, Visio generates an XML representation of it in a Business Process Execution Language (BPEL) document. The BPEL representation of the designed process is then imported into Visual Studio .NET where each design object in the process will be implemented.

The corresponding product for developers remains Visual Studio .NET. Offering an extensive set of functionality, Visual Studio .NET contains a special project overlay template that provides programmers with a visual workspace. This workspace exposes the “binding” that exists between implementation objects, documents, and messaging infrastructure and the process steps created in the design stage. Though Visual Studio .NET is a programming environment, the method for implementing the process design does not resemble procedural programming application development. Instead, the visual workspace provides a collaborative environment where the programmer and process designer can work together on a diagrammatic object model of the process that both can understand.

Bringing the pieces together

In the workspace, Visual Studio .NET reconstitutes the diagrammatic representation of the business process. In addition to the primitive element design palette found in Visio, it also contains a palette of implementation objects: application integration components, COM objects, Web Services, XML documents, messaging and transport facilities, and transformations. These objects are visually “bound” to the design objects through messaging events or method calls using drag, drop, and connect activities. Furthermore, the implementation mechanisms for highly complex functions, such as transactions requiring two-phase commit, rollback, and so on, are built-in functions, thus eliminating the need to write complicated procedural code.

Because the primary dynamic of BPM exchanges is based on sending, receiving, inspecting, and transforming exposed XML documents, a BPM development tool should be capable of visually mapping the flow and exchange of information in a messaging event associated with a process step. This is facilitated through a special “Data” template that graphically represents the message flow in a process as well as the transformation of, and operations performed on, document elements through the process flow.
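A transformation map of this kind is, at bottom, data that wires source fields to destination fields. The following Python sketch (element names and conversion functions are invented for illustration) expresses such a map and applies it to a document:

```python
import xml.etree.ElementTree as ET

# The map itself is data: each target element is produced from a
# source element through a conversion function, the way a visual
# mapping tool wires source fields to destination fields.
ORDER_TO_SHIPMENT = {
    "Recipient": ("Customer", str.strip),
    "Sku":       ("Item",     str.upper),
}

def transform(source_xml, mapping, target_tag):
    src = ET.fromstring(source_xml)
    dst = ET.Element(target_tag)
    for target_name, (source_name, convert) in mapping.items():
        value = convert(src.findtext(source_name, ""))
        ET.SubElement(dst, target_name).text = value
    return ET.tostring(dst, encoding="unicode")

shipment = transform(
    "<Order><Customer> Contoso </Customer><Item>a-42</Item></Order>",
    ORDER_TO_SHIPMENT, "Shipment")
```

Because the map is declarative data rather than procedural code, changing a field mapping means editing one entry, not rewriting an integration module.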

This assembly approach to building process applications incorporates the modular, loosely coupled, and exposed paradigm for creating any type of interaction or event. Each implementation binding or messaging event is functionally independent of any other binding or messaging event. A change on the design side does not affect the operation, function, structure, or integrity of the objects on the implementation side. In conventional process development, a complex integration or process scenario is embodied in opaque programming code. That code incorporates the structure of the endpoint objects, the process flow logic, the conversion of data formats, business rules, and the bindings to transport infrastructure. If a modification is required to any one facet of the integration module, the integrity of the entire exchange implementation is compromised. The risk of introducing unexpected behavior when modifying code has always been a pitfall of software development and it accounts for the hesitancy to make ongoing process changes in response to business requirements.

It is no accident that the tools and methodologies for developing business processes exemplify the modular, loosely coupled, and exposed paradigm that characterizes the processes themselves. This paradigm is fundamental to every aspect of the Microsoft BPM environment. The ability to visualize and comprehend complex business logic and its implementation mechanisms makes process development and maintenance infinitely more efficient and manageable—enabling organizations to flexibly adapt to new business requirements and opportunities.


Understanding BPEL

Real-world business processes are complex and incorporate numerous integrity controls: ACID transaction support, stateful persistence of long-running interactions, nested and parallel operations, compensation and exception mechanisms, acknowledgements, and correlation capabilities. As process complexity increases, the value of being able to design, implement, and document highly dependent and sophisticated behavior using a visual assembly methodology becomes more apparent. This is especially the case when processes require modification or one process design serves as the basis for another.

The Business Process Execution Language for Web Services (BPEL4WS or BPEL) is another key standard supported by Microsoft in BizTalk Server 2004. Developed collaboratively by Microsoft, IBM, and BEA Systems, BPEL was designed to orchestrate and coordinate Web Services so they can be engaged in collaborative and transactional behavior. The BPEL specification has been submitted to the OASIS standards body for review and eventual designation as a protocol standard.

While Web Services provide the methodology for application-to-application messaging and method invocation over an unbounded network, by themselves they cannot satisfy the operational requirements of a business process. A business process is a set of dependent and ordered activities whose execution results in a predictable and repeatable outcome in a timely manner. BPEL enables Web Services to meet these requirements. BPEL formally defines basic and structured activities that are used to compose sophisticated business processes. Because a BPEL instruction set is an XML representation of a process with a precise language and grammar structure, it provides a readable and understandable instruction set for documenting a process. In fact, the object primitives used in BizTalk Server Orchestration Designer are direct representations of basic and structured BPEL elements such as receive, invoke, sequence, flow, switch, partners, role, link, and source.
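The flavor of these basic and structured activities can be conveyed with a toy interpreter. This is emphatically not BPEL syntax; it is a Python sketch in which a process definition is plain data, receive and invoke are basic activities, and sequence and switch compose them:

```python
# Toy interpreter for a BPEL-like process definition.
def run(activity, state, log):
    kind = activity[0]
    if kind == "receive":                      # take a message into state
        _, var, value = activity
        state[var] = value
        log.append(f"receive {var}")
    elif kind == "invoke":                     # call a service (stubbed)
        _, name, fn = activity
        state[name] = fn(state)
        log.append(f"invoke {name}")
    elif kind == "sequence":                   # run children in order
        for child in activity[1]:
            run(child, state, log)
    elif kind == "switch":                     # first branch whose guard holds
        for guard, child in activity[1]:
            if guard(state):
                run(child, state, log)
                break
    return state

process = ("sequence", [
    ("receive", "order_total", 15000),
    ("switch", [
        (lambda s: s["order_total"] > 10000,
         ("invoke", "approval", lambda s: "pending")),
        (lambda s: True,
         ("invoke", "fulfillment", lambda s: "shipped")),
    ]),
])

final = run(process, {}, [])
```

Because the process is data, it can be inspected, documented, and modified without touching the engine that executes it, which is the property the BPEL instruction-set format provides.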


Business Rules in Business Processes

Business processes are driven by business rules, and the majority of modifications to a business process life cycle pertain to changes in business rules (as opposed to technology-related modifications). However, because business rules in conventional applications are embodied in opaque programming code, they cannot be accessed or modified easily and without potential disruption to running processes. There’s no argument that isolating business rules from procedural code or any process implementation mechanisms dramatically improves the efficiencies of managing and adapting business processes in response to new requirements or business conditions.

It is therefore of particular significance that, in BizTalk Server 2004, Microsoft introduces the Business Rules Composer. The Business Rules Composer consists of a business rule editor and engine for creating and processing sophisticated rule sets using a forward-chaining inference model. A rule set (or “Policy”) that drives a specific activity or function is created with the Business Rules Composer and becomes a resource object that is referenced in a BizTalk Server orchestration. Transparency and loose coupling govern the creation and implementation of business rules. A rule set incorporated within a BizTalk Server orchestration can be viewed, modified, or replaced both at design and run time, without affecting any other operational aspect of a process or interrupting running instances of the affected process.
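Forward chaining itself is a simple idea: rules fire when their conditions are satisfied by the current facts, and each firing may assert new facts that enable further rules, until a fixed point is reached. The sketch below is a generic Python illustration, not the Business Rules Composer's actual model; the policy and its facts are invented:

```python
def forward_chain(facts, rules):
    """Apply rules until no rule can add a new fact (a fixed point).
    Each rule is (set of required facts, fact to assert)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for condition, consequence in rules:
            if condition <= facts and consequence not in facts:
                facts.add(consequence)
                changed = True
    return facts

# Illustrative policy: the second rule can only fire after the first.
policy = [
    ({"order>10k"}, "needs-approval"),
    ({"needs-approval", "manager-available"}, "route-to-manager"),
]

result = forward_chain({"order>10k", "manager-available"}, policy)
```

Because the policy is data separate from the engine, a rule can be edited or replaced without recompiling or redeploying the process that references it.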

Isolating, exposing, and publishing business rule sets as services that can be accessed by any application or process provides one of the most compelling value propositions for the Services Oriented Architecture paradigm. As such, the Business Rules Composer module is one of Microsoft’s most noteworthy products.

BPM and the Service Oriented Architecture

The next era of computing will be characterized by the detachment of information from applications, leading to a widely distributed Service Oriented Architecture.

  • The meaning, function, relationships, and presentation of information will be self-describing and embedded in the information itself, using schema vocabularies and style sheet references.

  • Information will be generated and published without knowledge of how it will be consumed or used.

  • Applications will be capable of consuming information and the methods of other applications, while being consumed themselves.

  • Processes will be self-configurable and self-modifying based on event-level interactions between rule sets and information.

  • Entirely new applications and business models will evolve from this paradigm.

Microsoft’s BPM tools and the XML technologies on which they are built introduce the next era of computing based on the Services Oriented Architecture paradigm. In examining the innovations being introduced in Microsoft’s BPM toolset, it becomes apparent how cumulative XML functionality accrues: XML Schema enables Web Services and InfoPath. XML technologies embedded in BizTalk Server enable an entirely new model for application integration and process management. Each manifestation by itself has significant value, but combined, they offer the potential to facilitate wholesale efficiencies and innovative solutions. It is an object lesson in the whole being greater than the sum of the parts.

Operational Support for BPM Technology

It is one matter to have a vision of how things can work; it is another to actually make that vision work. The innovations engendered by XML require operational support context. On their own, they cannot be simply inserted into an organization’s existing infrastructure and expected to provide functional efficiencies, or to conform to the operational performance standards to which IT organizations are accustomed. Their value is actualized when implemented within a framework of complementary and supporting technologies that facilitate their use within an embedded infrastructure. Just as InfoPath and BizTalk Server complement each other, Microsoft’s array of proven enterprise applications (SQL Server, Active Directory, Host Integration Server, Microsoft Operations Manager, and Application Center) complement and support the implementation of these applications in a real-world, mission-critical context.

SQL Server provides the transaction record data store for all BizTalk Server messaging events, as well as dehydrated instances of long-running processes. Because SQL Server directly supports XML, it can also maintain a record of the document instance associated with each message event. In this capacity, SQL Server provides the persistence capabilities of BizTalk Server.
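Dehydration amounts to serializing the state of a process instance that is waiting on an external event into a durable store, then reloading it when the event arrives. The Python sketch below uses a dict as a stand-in for the database; the instance identifier and state fields are invented:

```python
import json

store = {}   # stand-in for the persistence database

def dehydrate(instance_id, state):
    """Serialize a waiting process instance out of memory."""
    store[instance_id] = json.dumps(state)

def rehydrate(instance_id):
    """Reload the instance state when its awaited event arrives."""
    return json.loads(store.pop(instance_id))

dehydrate("po-1001", {"step": "awaiting-invoice", "total": 59.85})
restored = rehydrate("po-1001")
```

Persisting instances this way is what lets an engine host thousands of long-running processes without keeping each one resident in memory.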

The BizTalk Server application integration methodology is based on converting the native file formats of legacy and packaged applications into an internal XML representation, which is then transformed according to the requirements of the integration exchange. A prerequisite of this conversion function is knowledge of the native file formats. Microsoft’s Host Integration Server and other third-party adapters provide this information to BizTalk Server in a comprehensible and usable way. Furthermore, these products provide transport gateways between supported transport facilities in BizTalk Server and the transport protocols used by legacy systems.

Finally, there are the requirements for system performance, availability, and scalability. Any enterprise-class application must be engineered to accommodate performance metrics monitoring, operational redundancy, and scalability at a granular level. This fine-grained level of performance management, failure prevention, and scalability can be achieved with the support of Microsoft Operations Manager (MOM) and Application Center. These tools are tightly integrated with all Microsoft server products and provide the operational assurances necessary for enterprise-class information processing.


Conclusion

Over the last two years, based on thousands of successful deployments, BizTalk Server has clearly demonstrated that a platform based on XML technologies can dramatically improve the efficiencies and economics of application integration development. With the innovations introduced in BizTalk Server 2004, Office 2003, Visio, and Visual Studio .NET, XML technologies will demonstrate an even greater potential to deliver operational efficiencies and benefits, and will, no doubt, assume a prominent role in enterprise computing.

 

--------------------------------------------------------------------------------------

Business-Process Engineering BPE and Business-Process Management BPM

 

Miriam Grace and Sandi Jeffcoat

April 2007

Summary: Learn about Business-Process Management (BPM), and get ready for the future. What is right about IT is that we recognize the value of full collaboration with our business customers and, finally, we are developing tools that enable us to realize that partnership fully.

There is something wrong with IT—something dreadfully wrong.
–Smith & Fingar [2003]

Contents

Introduction
Business-Process Management (BPM): Coming Attractions
A Better Approach
Conclusion
Sources

Introduction

Bob is a new software architect about to embark on a software-development project in which the customer is re-engineering its business processes. He has a generous budget and the full support of his IT leadership, and his customer is bringing in experienced business subject-matter experts (SMEs) to define the new business-process architecture. He has talked at length with the business-process owners, who are focused on developing lean and efficient business operations: they are "starting from scratch" and documenting new, common approaches to getting their work done across multiple business divisions. The business leaders are committed to engaging collaboratively with Bob and his IT team to get the requirements right. So far, so good.

Bob is an expert programmer. He has been a Senior Developer for over a decade and took the opportunity to advance to the position of software architect when this project was advertised. But he has never engaged in the "front end" of the software-development life cycle, where the business processes are documented. This is new territory for him; although his IT skills are significant, he is neither used to thinking in business terms nor sure how to prepare himself and his IT team for the adventure ahead of them.

Business-Process Management (BPM): Coming Attractions

If Bob were more aware of the emerging field of Business-Process Management (BPM), he would know what to do with the intellectual assets that his customers are about to create. He would be aware of the "coming attractions" in the technology of BPM systems, and he would understand how to capture the process information and turn those process descriptions into digital data—the key deliverable of the next phase in the software-development life cycle. He would know that a business process is a sequence of activities that carries out a complete business goal [Debevoise, 2005, 3]. Bob would be aware that BPM is the identification, understanding, and management of business processes that link with people and systems in and across organizations.

BPM knowledge would have provided Bob the power to visualize his customer's business processes as a critical "information chain" that connects the step-by-step series of activities with the data entities and relationships that Bob requires for system development. But Bob missed the previews of the coming attractions in the world of BPM. So, he and his IT team are in for a bumpy ride, as well as an expensive but worthwhile learning adventure.

Over the next several months, Bob and his team of functional analysts participate in workshops that start early, go all day long, and often extend into the evenings. He watches his customers argue among themselves about how to perform key functions in a way that satisfies all of their conflicting requirements. There are focal points from the various divisions—each one trying to work toward a common way of doing things, while fighting the habits of years of doing it their own way. Conflicts arise and resolutions are negotiated. This is hard work, and agreements are a win.

Antipattern 1, Divide and Conquer: Separate Users from Process

Bob and his IT team are learning about the customers' business; but, they are not sure how to help, other than to try to facilitate the discussions. When the customers finally do reach agreement, the results are documented by a business-process modeler who captures the business processes in Microsoft Visio diagrams and pastes the diagrams into static Microsoft Office Word documents, where the vibrant business descriptions are sealed away.

Thus begins the separation of the business people from their processes, and thus begins what will become a widening gulf between Bob and his customers. This is an all-too-familiar scenario in IT projects and what made Smith and Fingar [2003] say, "Something is dreadfully wrong with IT."

Antipattern 2, Divide and Conquer: Separate Users from Data

But Bob does not seem to be aware of the communication gap that is growing between his customers and his team. He has begun to get nervous about all the time this process work is taking. He has brought in a data architect and data modeler, armed with a data modeling tool. He begins to move the subject of data from the background to the foreground. In his mind, data is what is important to the system design; this is familiar territory, this is what he knows. Words like "cardinality" and "facet" creep into his vocabulary. He holds sessions with the customers to educate them on the importance of the data model. The data architect shows the customers examples of how to think about their data, how to identify it. The customers have printed out paper representations of their business processes. Bob doesn't notice how they clutch them as if someone were going to steal them away. They look at their process flows and then at the data model; their brows furrow and they become confused.

Bob asks them to stop thinking about process now and turn their attention to the data. He tells them that the data is what is important now—that process definitions revealed the requirements—but he must understand the data to design the system. They try to bend, to follow his request. Both he and the data architect search to find a way to make the customers understand, but they do not seem to get it. Bob feels increasingly frustrated. They are not making any forward progress. He asks himself, "What more can I do to help them understand? It's their data. Don't they know what data they need to do their jobs?"

Now, reflect a moment. Think about what Bob is asking of his customers. They have just spent months working to describe their business operations. They have gone deep, describing what they do and trying to find lean, common approaches to old resource-intensive ways of operating. They have redesigned their processes to reduce cycle time and some have even sub-optimized their individual ways of doing business for the good of the whole. Finally, they reached a place of satisfaction with their creation. Now, he is asking them to climb another mountain, absorb a new language, understand new symbols, and create new representations of their reality.

They cannot bend that far without breaking. So, they fight it. Concepts like one-to-many and many-to-one relationships escape them. The breach widens. They experience a sense of loss. The pride that they had in their process definitions is diminished. Ownership has subtly transferred to Bob and the other IT professionals. The customer's knowledge about their business does not easily translate into information-systems design work. Although Bob is designing a system for them, they feel it move from their hands to his, and they worry about their needs being met. The breach is now a chasm.

The Business-IT Divide Is Alive and Well!

If Bob were more aware of the coming attractions in the world of BPM, right from the beginning he would have had a data architect and data modeler collaborating with the process modeler—constructing a data model and data dictionary from the maturing process definitions. Even more effectively, they would have required that the process models be captured in a digital design tool that enabled the transformation from business-process design to data model—where the inputs and outputs of the business become data entities, and where complete traceability from requirements to end-product delivery is birthed.

But, can we propose a scenario in which the problem never got started?

Yes!

A Better Approach

What if Bob and his business partners could concurrently perform the process-definition work right along with the system configuration? What if their focus was on "process processing" and not on "data processing?" What if Bob was looking toward the creation not of a "database," but of a "process base?" What if he could deal directly with the business process as an application, instead of data and application? What if his customers retained direct, physical ownership and could directly engage with him in the design and configuration of their system? What if their business processes were the design of the system?

Business-Process Management Systems

I understand that ideas as radical as eliminating the need for application development as we know it might make you raise your eyebrows and twist your lips into a look that speaks a "Yeah, not in this lifetime!" expression. And you would be right—almost. We are not there yet. But it is inevitable. Business-process management systems (BPMS) are here already, but they are still just a glint in the eye of the future that they foreshadow. If Bob had educated himself on the promise of BPM, separating what is possible now and what the five-year road map looks like, he would have been more sensitive to the value of the business processes for the long-term viability of his system design, and he would have understood what building blocks he should be putting in place now to support this profound change in IT development methods.

BPM has the potential to eliminate the business-IT divide, because it allows business and IT to do what they do best. When the business process is the application, the two become one, and modeling of the business process is the modeling of the application. After all, we have always said that software development is a process for building, maintaining, and perfecting business process. It's true now—more than ever.

BPM promises a focus on the "now"—the strategic goal that generated just-in-time inventory management and that stretches toward the ideal of the adaptable organization, in which time lags between innovation and execution are virtually eliminated. As Smith and Fingar [2003] have suggested, "Change is the primary design goal of business-process management systems, because in the world of BPM, the ability to change is far more prized than the ability to design in the first place."

BPM is a technology-enabled way of running a business—harnessing the universal connectivity of the Internet to stay one step ahead of the competition [Fingar, 2006]. BPM represents a change in the language, symbols, and tools that we use to describe and design information systems. These methods and tools have become more business-centered and less tied to technical details. BPM allows the business to model its operations in words and process diagrams. To update the process or associated rules, the words and diagrams are updated by the business—allowing true flexibility and instant adaptability to changing business circumstances. This is the promise of BPM.

The next "big thing" in business is operational innovation and business transformation. BPM is the technology that is keeping pace with that global change. If there is any chance to achieve the promise of business and technology evolving in tandem, software architects must begin putting in place now the building blocks that will facilitate the transformation.

Conclusion

So, preview the "coming attractions" of BPM. Learn about BPM, and get ready for the future. What is right about IT is that we recognize the value of full collaboration with our business customers and, finally, we are developing tools that enable us to realize that partnership fully.

Sources

  • [Davenport, 1993] Davenport, Thomas H. Process Innovation: Reengineering Work Through Information Technology. Boston, MA: Harvard Business School Press, 1993.
  • [Davenport et al., 2000] Davenport, Thomas H., and Laurence Prusak. Working Knowledge: How Organizations Manage What They Know. Boston, MA: Harvard Business School Press, 2000.
  • [Debevoise, 2005] Debevoise, Tom. Business Process Management with a Business Rules Approach. Roanoke, VA: Business Knowledge Architects, 2005.
  • [Fingar, 2006] Fingar, Peter. Extreme Competition: Innovation and the Great 21st Century Business Reformation. Tampa, FL: Meghan-Kiffer Press, 2006.
  • [Harrington et al., 1997] Harrington, H. James, Erik K. Esseling, and Harm van Nimwegen. Business Process Improvement Workbook: Documentation, Analysis, Design, and Management of Business Process Involvement. New York, NY: McGraw-Hill, 1997.
  • [Smith & Fingar, 2003] Smith, Howard, and Peter Fingar. Business Process Management: The Third Wave. Tampa, FL: Meghan-Kiffer Press, 2003.

 

About the authors

Miriam Grace is an Enterprise Systems Architect and a member of the Boeing Technical Excellence Fellowship. She has over 20 years of experience as a computing systems architect, with a specialty in business-process management systems. Miriam is a published author, holds a Master's degree in Systems Design, and is a Ph.D. Candidate in Leadership. Miriam designed a learning system that provided a curriculum of foundational skills for software architects at Boeing.

Sandra Jeffcoat is a member of Boeing's Technical Excellence Program, a 2005 National Women of Color in Technology and 2007 National Society of Black Engineers (NSBE) Golden Torch awards winner, and a Ph.D. candidate for the Antioch University Leadership and Change Program. She has more than 28 years of technical experience for such companies as the Boeing Company, Eastman Kodak, NCR, Mead Paper Corporation, AT&T, and various governmental agencies.

This article was published in Skyscrapr, an online resource provided by Microsoft.

datajs - JavaScript Library for data-centric web applications by: حنیف شیخ عبدالکریم-Hanif sheikhabdolkarim

Contents

  1. datajs Overview
  2. API Documentation
    1. read
    2. request
    3. defaultSuccess
    4. defaultError
    5. defaultHttpClient
  3. OData
    1. OData code snippets
    2. Metadata
    3. Security
    4. Cross-domain requests
    5. OData internals

datajs Overview

datajs is a cross-browser JavaScript library that supports data-centric web applications by leveraging modern protocols and browser features. It is designed to be small and fast, and to provide functionality that makes web applications first-class citizens of web data.

Currently the library offers functionality to communicate with OData services. If you're not familiar with OData, there are good resources for learning at http://www.odata.org/developers.

The library supports receiving data and submitting changes or invoking service operations. The API can be made aware of metadata in cases where it's required, and operations can be batched to minimize round-trips.

We plan to extend the level of functionality in this area, as well as to provide APIs for working with local data through browser features such as IndexedDB as they become available in popular browsers.

API Documentation

OData.read

Description: Reads data from the specified URL.

OData.read = function (url | request, [success(data, response)], [error(error)], [handler], [httpClient], [metadata])

Parameters

url - A string containing the URL to which the request is sent.

request -  An Object that represents the HTTP request to be sent.

success(data, response) - A callback function that is executed if the request succeeds, taking the processed data and the server response.

error(error) - A callback function that is executed if the request fails, taking an error object.

handler - Handler object for the response data.

httpClient - Object to use as an HTTP stack.

metadata - Object describing the structural metadata to use.

Details

This function is used to read data from an OData endpoint. In its simplest form, this function takes a URL string to which an HTTP GET request is sent.

For example, you can get all available Genres in the Netflix service by making the following call.

OData.read("http://odata.netflix.com/v1/Catalog/Genres");

As part of the URI you can also pass in parameters.

OData.read("http://odata.netflix.com/v1/Catalog/Genres?$top=3");

For full documentation on OData supported URIs refer to the OData documentation on the OData site.

The success parameter of the read operation takes a callback function, which is executed if the request succeeds. If it is not defined, the default success handler, OData.defaultSuccess, is used, which simply shows a dialog box with the results in string form. It is good practice to define your own success callback function.

OData.read("http://odata.netflix.com/v1/Catalog/Genres",
    function (data, response) {
        var html = "";
        for (var i = 0; i < data.length; i++) {
            html += "<div>" + data[i].Name + "</div>";
        }
        document.getElementById("target-element-id").innerHTML = html;
    });

The read function also takes an error parameter, which is a callback function. The error callback is executed if the request fails. If the function is not defined, the default error handler, OData.defaultError, is invoked, causing the error to be thrown as an exception.

OData.read("http://odata.netflix.com/v1/Catalog/Genres",
    function (data, response) {
        var html = "";
        for (var i = 0; i < data.length; i++) {
            html += "<div>" + data[i].Name + "</div>";
        }
        document.getElementById("target-element-id").innerHTML = html;
    },
    function (err) {
        alert("Error occurred " + err.message);
    });

A handler can optionally be passed to the OData.read function. By default, the MIME-based handler is used, which looks at the Content-Type header and selects the correct handler to parse the payload. For example, if the response's Content-Type header is application/json, the JSON handler is invoked to parse the data.
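The dispatch described above can be sketched as follows. This is an illustration of the idea only, not the library's actual handler code; pickParser is a hypothetical helper.

```javascript
// Sketch: choosing a parser from the response's Content-Type header,
// the way the MIME-based handler does. Illustrative only, not datajs internals.
function pickParser(contentType) {
    if (contentType.indexOf("application/json") === 0) {
        return "json"; // the JSON handler parses the payload
    }
    if (contentType.indexOf("application/atom+xml") === 0) {
        return "atom"; // the ATOM handler parses the payload
    }
    return "unknown";
}

pickParser("application/json;odata=verbose"); // → "json"
```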

The read function also takes an httpClient object. If it is not defined, the default OData.defaultHttpClient object is used, which is set to a component that takes an HTTP request representation and returns a response message for it.

More information on how to define a custom handler and httpClient is covered under handler and httpClient documentation respectively.

The metadata parameter is optional, and defaults to OData.defaultMetadata. Handlers make use of metadata to enhance how they process data.

To allow finer control over what is sent in the request this function also allows passing in a request object instead of a simple URI string.

The following example shows how to specify the accept type inside the request object, along with success and error callback functions.

OData.read({
    requestUri: "http://odata.netflix.com/v1/Catalog/Genres",
    headers: { Accept: "application/json" }
},
    function (data, response) {
        alert("Operation succeeded.");
    },
    function (err) {
        alert("Error occurred " + err.message);
    });

The read operation returns a request value, which supports an abort invocation to cancel the request while it is in progress.
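The cancellation contract can be sketched as follows. Because a real call needs a live service, a stub stands in for OData.read here; readStub and its isAborted helper are illustrative stand-ins, not datajs APIs.

```javascript
// Sketch of the cancellation contract on the value returned by a read.
// readStub and isAborted are illustrative stand-ins, not datajs APIs.
function readStub(url) {
    var aborted = false;
    return {
        abort: function () { aborted = true; },    // cancels the request
        isAborted: function () { return aborted; } // illustrative helper
    };
}

var pending = readStub("http://odata.netflix.com/v1/Catalog/Genres");
pending.abort(); // no callbacks will fire for a cancelled request
```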



 

OData.request

Description: Sends a request containing OData payload to the server.

OData.request = function (request, [success(data, response)], [error(error)], [handler], [httpClient], [metadata])

 Parameters

request - An Object that represents the HTTP request to be sent.

success(data, response) - A callback function that is executed if the request succeeds, taking the processed data and the server response.

error(error)  - A callback function that is executed if the request fails, taking an error object.

handler - Handler object for the request data.

httpClient - Object to use as an HTTP stack.

metadata - Object describing the structural metadata to use.

Details

OData.request is a low-level API that provides fine-grained control over the request. It offers optional parameters to set callback functions, a custom data handler, and the HTTP client to use.

This API takes as its first parameter a request object that contains the headers, the target URI, the HTTP verb (which determines the CRUD operation to perform), and the data on which the action takes place.

The request object should conform to the following signature:

request = {
    headers: object,    // object that contains HTTP headers as name/value pairs
    requestUri: string, // OData endpoint URI
    method: string,     // HTTP method (GET, POST, PUT, DELETE)
    data: object        // payload of the request (in intermediate format)
};

The headers property is an object containing key/value pairs that let the user control the semantics of the HTTP request. For example, the user can specify the 'Accept' type for the data or the version of the DataService. Whatever is defined in the headers object takes precedence over the defaults; if a value is missing in the headers, however, the defaults are used. For example, if the caller specifies a given DataServiceVersion, the handlers will not set the value for the DataServiceVersion header.

var request = { headers : { "DataServiceVersion": "2.0" } };
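The precedence rule can be sketched in plain JavaScript. This is an illustration of the merging behavior described above, not datajs source code; mergeHeaders is a hypothetical helper.

```javascript
// Sketch: caller-supplied headers win over library defaults;
// defaults fill in only the names the caller did not set.
function mergeHeaders(defaults, headers) {
    var merged = {};
    var name;
    for (name in defaults) { merged[name] = defaults[name]; }
    for (name in headers) { merged[name] = headers[name]; } // caller wins
    return merged;
}

var merged = mergeHeaders(
    { "DataServiceVersion": "1.0", "Accept": "application/json" },
    { "DataServiceVersion": "2.0" }
);
// merged is { "DataServiceVersion": "2.0", "Accept": "application/json" }
```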

The requestUri property is a string value that specifies the location on the OData endpoint at which the operation is targeted. For operations on existing items, the URI should resolve to a single entity or link. For an insert operation, the URI points to the resource set or collection into which the entity is inserted. For more information on OData operations, refer to: http://www.odata.org/developers/protocols/operations

Here are a few examples of how the requestUri would look for a given operation.

Add a new entity in an entity set
requestUri : http://services.odata.org/website/odata.svc/Customers

Add a new entity in a linked entity set
requestUri : http://services.odata.org/website/odata.svc/Customers(1)/Orders

Update an existing entity
requestUri : http://services.odata.org/website/odata.svc/Customers(1)

Update an existing linked entity
requestUri : http://services.odata.org/website/odata.svc/Customers(1)/Orders(1)

Merge an existing entity
requestUri : http://services.odata.org/website/odata.svc/Customers(1)

Merge an existing linked entity
requestUri : http://services.odata.org/website/odata.svc/Customers(1)/Orders(1)

Remove an existing entity
requestUri : http://services.odata.org/website/odata.svc/Customers(1)

Remove (i.e. set to null) a primitive property value in an existing entity
requestUri : http://services.odata.org/website/odata.svc/Customers(1)/FirstName/$value

Remove an existing linked entity
requestUri : http://services.odata.org/website/odata.svc/Customers(1)/Orders(1)

 

The method property of the request object determines, via an HTTP verb, the CRUD operation to perform. The following table shows the operations supported by the protocol and how they map to HTTP verbs.

Operation                  HTTP Verb
Add a new entity           POST
Update an existing entity  PUT
Delete an entity           DELETE

The data property defines the payload of the request in the intermediate format. The request is passed as input to the handler object, which serializes the request payload to the appropriate wire format, as controlled by the request headers. The transformed data is then stored in the body property. Finally, the modified request object is passed to the underlying network stack.
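Putting the pieces together, a complete request object for updating an existing entity might look like the following sketch; the URI, header, and property values are illustrative assumptions.

```javascript
// An illustrative request object for a PUT against an existing entity.
// The URI and property values are examples only.
var updateRequest = {
    headers: { "Content-Type": "application/json" }, // wire format for the payload
    requestUri: "http://services.odata.org/website/odata.svc/Customers(1)",
    method: "PUT",                                   // update an existing entity
    data: { ID: 1, Name: "Updated name" }            // intermediate-format payload
};
// Passing this object to OData.request with success and error
// callbacks would then send it.
```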

For some update operations, such as add, the OData endpoint generates a response containing the new item. This data is valuable because it provides useful metadata, such as the edit URI or concurrency information, as well as any server-generated property values. The response is processed by the appropriate handler. All responses trigger the invocation of either a success or an error callback, depending on the status of the HTTP response.

The success parameter of the request operation takes a callback function, which is executed if the request succeeds. If it is not defined then the default success handler is used from OData.defaultSuccess. It is a good practice to define your own success callback function.

The request function also takes an error parameter, which is a callback function. The error callback function is executed if the request fails. If it is not defined, the default error handler, OData.defaultError, is used.

Optionally, a handler can be passed to the OData.request function. If it is not defined, the MIME-based handler is used, which looks at the Content-Type header and selects the correct handler to parse the payload. For example, if the data type is JSON, the JSON handler is invoked to parse the data.

The request function also takes in an httpClient object. If it is not defined then the default httpClient object is used from OData.defaultHttpClient.

Batch Operations

The datajs request API also supports OData batch operations, which allow grouping multiple operations into a single HTTP request. The following two examples show how to perform batch operations for GET, PUT, and POST by using __batchRequests and __changeRequests.
Example 1:

OData.request({
    requestUri: "http://ODataServer/FavoriteMovies.svc/$batch",
    method: "POST",
    data: { __batchRequests: [
        { requestUri: "BestMovies(0)", method: "GET" },
        { requestUri: "BestMovies(1)", method: "GET" }
    ]}
},
function (data, response) {
    // success handler
}, undefined, OData.batchHandler);


Example 2:

OData.request( {
    requestUri: "http://ODataServer/FavoriteMovies.svc/$batch",
    method: "POST",
    data: { __batchRequests: [
            { __changeRequests: [
                { requestUri: "BestMovies(0)", method: "PUT", data: {MovieTitle: 'Up'} },
                { requestUri: "BestMovies", method: "POST", data: {ID: 2, MovieTitle: 'Samurai'} }
                ]
            }           
    ]}
},
function (data, response) {
    //success handler
}, undefined, OData.batchHandler);

 


 

OData.defaultSuccess

Description: Default callback function for APIs that succeed.

OData.defaultSuccess = function (data, response)


Details

The OData.defaultSuccess property is used to fill in the value for calls that do not specify a success callback handler.

By default, it simply displays the string representation of data in an alert box. This is convenient for quick prototyping; for example, you can type javascript:(function(){OData.read("/myservice.svc/Customers");})() in the browser's location bar when datajs.js is available and see the results in an alert box.

This property is not often used; instead, most API calls should include a situation-specific success handler.


OData.defaultError

Description: Default callback function for APIs that fail.

OData.defaultError = function (error)

Details

The OData.defaultError property is used to fill in the value for calls that do not specify an error callback handler.

By default, it simply throws the given error, which may go unhandled and break into the debugger if one is running in the browser. This is convenient for debugging, but typically the library consumer can do something more appropriate.

This property is often used as a generic error handler hook when all or most API calls end up routing through the same error handling mechanism.

Note that the error object may contain more or less information depending on when the error was found. For example, failures that occur when a request is being sent will not have a response property, but failures that occur while processing the response typically will have this property set.
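A shared error handler can use the presence of the response property to tell those two cases apart. The following is a sketch of the pattern; handleODataError is a hypothetical helper, not a datajs API.

```javascript
// Sketch: distinguishing transport failures (no response yet) from
// failures while processing a response. Illustrative helper, not datajs API.
function handleODataError(err) {
    if (err.response) {
        return "Server replied with status " + err.response.statusCode;
    }
    return "Request failed before a response arrived: " + err.message;
}

handleODataError({ message: "timeout" });
// → "Request failed before a response arrived: timeout"
```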


OData.defaultHttpClient

Description: Provides the default HTTP layer for datajs.

OData.defaultHttpClient = { request: function (request, success, error) }

Details

The OData.defaultHttpClient property is used to fill in the value for calls that do not specify an HTTP client value.

The single property request provides a function that can be invoked with a request object, a success callback and an error callback.

By default, the library provides an HTTP client that relies on the XMLHttpRequest object to handle network requests.

The built-in HTTP client also provides properties to control the use of JSONP - see Cross-domain requests for additional information.

This property is typically left in its default state, although it may be replaced for advanced scenarios such as to add custom HTTP-level logging.


OData

The current OData support in datajs provides the ability to read and write to an OData service, using both the JSON and ATOM-based formats, including support for batch operations. The API is very simple and the library consists of only a few concepts.

The basic approach is this: you can use OData.read with a URL to get data from a service, and the library will make sure that the reply comes in a consistent format. To add, update, or delete data, you can use OData.request with a request that uses the POST, PUT, or DELETE method and an object in that same format, and the library will take care of serializing it into the correct wire format.

For common cases, that's it. There are additional facilities to do more interesting things if you're interested, like tweaking the way the formatters work or the way that the OData.defaultHttpClient object uses the network.

Over time, we expect to build additional functionality that makes the library easier to use: hiding the payload format, helping to track changes, and ensuring that you get a single instance when you ask multiple times for the same piece of data.


OData Snippets

The following snippet shows how to read a list of category names from the Northwind service, using jQuery to add them to an element with ID 'target-element-id'. Because the service supports JSONP, the default network library uses it even though the document comes from a different domain.

OData.read(
  "http://services.odata.org/Northwind/Northwind.svc/Categories",
  function (data) {
    var html = "";
    $.each(data, function (i, item) { html += "<div>" + item.CategoryName + "</div>"; });
    $(html).appendTo($("#target-element-id"));
  }
);

The following snippet shows how to add a new customer to an OData service that exposes a Customers resource set with Name, CustomerCategory and ID properties.

OData.request(
  { requestUri: "/customer-service/Customers",
    method: "POST",
    data: { Name: "customer name", CustomerCategory: 123 } },
  function (insertedItem) {
    $("<div>inserted customer ID: " + insertedItem.ID + "</div>").appendTo($("#target-element-id"));
  }
);



Metadata

Metadata is used by handlers to enhance how requests and results are processed. This corresponds to the Service Metadata Document as described in the OData documentation.

Metadata is optional for many scenarios, but it can be used to improve how server values are interpreted and annotated in their in-memory representations.

In some cases metadata makes it possible to understand models that cannot otherwise be processed correctly. For example, if a server maps an entity property outside of the entity content, the ATOM representation requires metadata to be able to extract the information from the right location in the document.


Security

The default HTTP library works by using the XMLHttpRequest object, which doesn't allow the page to get data from a server other than the one that served the page (this is often referred to as the same-origin policy).

This means that the library generally trusts the server to provide non-malicious payloads (the library is, after all, also provided by the same server).

This is why JSONP is disabled by default in the OData.defaultHttpClient object. If you trust the servers that you will be contacting, you can turn support on by setting the enableJsonpCallback property to true, as in the following snippet.

OData.defaultHttpClient.enableJsonpCallback = true;

To protect against man-in-the-middle attacks, you should enable and use the HTTPS protocol and refer to the data sources through 'https://' URLs.


Cross-Domain Requests

Browsers have a policy (commonly referred to as the same-origin policy) that blocks requests across domain boundaries. Because of this restriction, update operations cannot be performed if the web page is served from one domain and the target OData endpoint is in a different one. Users can disable this policy in the browser, but it is typically turned on by default, and datajs assumes that it is in effect. The following options are available to support this scenario:

  • Have the web server provide a relaying mechanism to redirect requests to the appropriate OData endpoint.
  • Use an XDomainRequest object. This option is not available in all browsers.
  • Use a cross-origin XMLHttpRequest object. This option is also not available in all browsers.
  • Prompt for consent on first use. This option is not available in all browsers and generally provides a poor user experience.
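Which of these mechanisms a given browser offers can be probed with simple feature tests. The sketch below is illustrative only: the `crossDomainSupport` helper is invented for this example and is not part of datajs; `XDomainRequest` is the IE8/IE9 object mentioned above, and checking for `withCredentials` is a common way to detect a cross-origin-capable XMLHttpRequest.

```javascript
// Illustrative feature detection for the cross-domain options listed above.
// crossDomainSupport is a hypothetical helper, not part of datajs.
function crossDomainSupport() {
  if (typeof XDomainRequest !== "undefined") {
    // IE8/IE9 expose a dedicated cross-domain request object.
    return "XDomainRequest";
  }
  if (typeof XMLHttpRequest !== "undefined" &&
      "withCredentials" in new XMLHttpRequest()) {
    // A cross-origin-capable (CORS) XMLHttpRequest object.
    return "CORS XMLHttpRequest";
  }
  // Otherwise fall back to server-side relaying, or JSONP for reads.
  return "none";
}
```

In an environment without either object (including non-browser hosts), the helper simply reports "none", which is exactly the case where the relaying or JSONP options apply.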

Read operations are available for cross-domain requests if the OData server supports JSONP. To configure datajs to use JSONP, set the following properties on the OData.defaultHttpClient object.

  • formatQueryString: this is a query string option inserted in the URL.
  • callbackParameterName: this is the name of a query string option that will be assigned a function name to call back into.
  • enableJsonpCallback: see Security before setting this property to true.

By default, all values except for enableJsonpCallback are set to work with the popular JSONP extension for WCF Data Services-based servers, but can be modified to any appropriate value before a request is submitted.
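As a concrete illustration, the snippet below mirrors these settings on a plain stand-in object. The default values shown ("$format=json" and "$callback") are an assumption based on the WCF Data Services JSONP convention mentioned above; check your own OData.defaultHttpClient before relying on them.

```javascript
// Stand-in object mirroring the JSONP-related properties of
// OData.defaultHttpClient; the default values below are assumed, not verified.
var defaultHttpClient = {
  formatQueryString: "$format=json",   // query string option inserted in the URL
  callbackParameterName: "$callback",  // names the JSONP callback parameter
  enableJsonpCallback: false           // disabled by default; see Security
};

// Opt in to JSONP only when the target servers are trusted.
defaultHttpClient.enableJsonpCallback = true;
```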


OData Internals

  • HTTP Client. This component makes network requests. The built-in handler is available as OData.defaultHttpClient, but can be replaced by any object with the same operations. You can easily mock it when testing your app.
  • Format handlers. These components take raw payloads in JSON or ATOM format and turn them into a consistent in-memory representation.
  • Requests, responses and data. These are simply JavaScript objects that have certain specific members. There is no need to call constructors or anything of the sort; simply create a new object with the right shape and you're ready to go.
  • APIs. There are two APIs that can be used to read and write data, OData.read and OData.request. They simply glue all of the above with callbacks and provide a very simple way to interact with a data service.
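For example, because the HTTP client is just "any object with the same operations", a test can swap in a mock. The sketch below assumes the request(request, success, error) operation shape described above; the URL and the canned payload are invented for illustration.

```javascript
// A mock HTTP client with the same shape as the built-in one: anything
// exposing request(request, success, error) can replace OData.defaultHttpClient.
var mockHttpClient = {
  request: function (request, success, error) {
    // Skip the network entirely and hand back a canned response object.
    success({
      requestUri: request.requestUri,
      statusCode: 200,
      data: { CustomerID: 1, Name: "Sample" }  // invented payload
    });
  }
};

// Requests and responses are plain objects with the right shape.
var result = null;
mockHttpClient.request(
  { requestUri: "http://example.org/service/Customers(1)", method: "GET" },
  function (response) { result = response.data; },
  function (err) { throw err; }
);
// result now holds the canned entity.
```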
Ajax Control Toolkit by: حنیف شیخ عبدالکریم-Hanif sheikhabdolkarim

One of Microsoft's largest and most successful open-source projects, it comprises a variety of components and tools for developers building web systems on the .NET platform. The toolkit is very practical for building next-generation Web 2.0 sites with interactive technology.

The Ajax Control Toolkit contains a rich set of controls that you can use to build highly responsive and interactive Ajax-enabled Web applications. The Ajax Control Toolkit contains more than 40 controls, including the AutoComplete, CollapsiblePanel, ColorPicker, MaskedEdit, Calendar, Accordion, and Watermark controls. Using the Ajax Control Toolkit, you can build Ajax-enabled ASP.NET Web Forms applications by dragging-and-dropping Toolkit controls from the Visual Studio Toolbox onto a Web Forms page.

 

Introducing Microsoft's Open-Source Projects by: حنیف شیخ عبدالکریم-Hanif sheikhabdolkarim

After many ups and downs in arriving at a suitable business model for competing with open-source software, Microsoft introduced its own open-source project a few years ago, alongside the advent of its .NET technology:

 

http://www.codeplex.com

 

This is the official site for Microsoft's open-source projects.

Many of this blog's resources on open-source topics and projects are adapted from that site.

 

Information Technology by: حنیف شیخ عبدالکریم-Hanif sheikhabdolkarim

Information technology (IT) [1], as defined by the Information Technology Association of America (ITAA), is "the study, design, development, implementation, support or management of computer-based information systems, particularly software applications and computer hardware". In short, information technology deals with the use of electronic computers and software to convert, store, protect, process, transmit, and retrieve information reliably and securely.

Recently the term has been adjusted slightly so that it explicitly covers telecommunications as well, so many prefer the term "Information and Communications Technology", or ICT for short. [2]


Definition

Information technology is far broader (and vaguer) than computer science. In the 1990s the term replaced "data processing" and "management information systems", which were very common in the 1960s and 1970s. Information technology usually refers to the production, processing, storage, and distribution of information in large institutions.

Information technology and computer science are different disciplines, although they overlap in many areas. If computer science is likened to mechanical engineering, information technology is like the transportation industry. Transportation includes cars, railways, aircraft, and ships, all of which are designed by mechanical engineers; yet the industry also involves fleet management, traffic management, and transportation strategy at the company, city, and national levels, which have no direct connection to mechanical engineering but in which information and communications technology (ICT) is the most important element.

Core Elements

Information technology consists of four basic elements (people, mechanisms, tools, and structure). Information flows through a value chain created by linking these elements, continuously driving the organization's growth and maturity: [3]

  • People: human resources, concepts and ideas, innovation
  • Mechanisms: laws, regulations and procedures; mechanisms for improvement and growth; valuation and financial mechanisms
  • Tools: software, hardware, networks and communications
  • Structure: organizational, related inter-organizational, global

Fields of Information Technology

Today the meaning of the term "information technology" has become very broad: it covers many aspects of computing and technology, and the term is more recognizable than before. The IT umbrella is quite large, covering many fields. IT specialists perform a wide variety of tasks, from installing applications to designing complex computer networks and databases. [2] Information technology is also closely tied to library and information science.

In Iran, the Ministry of Information and Communications Technology is regarded as the main authority for information and communications technology. [1]

Information Technology in Iranian Universities

In most countries this discipline is offered in universities under the title "Information Technology", whereas in Iran, by decision of the country's higher-education organization, the title "Information Technology Engineering" is used. A program titled "Information and Communications Technology (ICT) Engineering", proposed by the Ministry of Information and Communications Technology, has also recently been taught in Iranian universities; there is no program titled simply "Information Technology".[citation needed] There is also an interdisciplinary program, "Information Technology Management", offered in Iranian and other universities, formed by combining the fields of "Management" and "Information Technology". Information Technology Engineering deals with how data is organized and structured, while Information Technology Management deals with how systems are designed and data is used. Each of these programs has its own specializations, which in Iranian universities are as follows:

Specializations of Information Technology Engineering:

  • E-commerce
  • Multimedia systems
  • Information systems management
  • Information security
  • Computer networks
  • Information technology (IT) engineering

Specializations of Information Technology Management:

  • Information resources management
  • Advanced information systems
  • Total quality systems

Specializations of Information and Communications Technology Engineering:

  • Network management
  • Data and network security
  • Mobile communications
  • ICT management
  • Multimedia systems

Information Technology Engineering Course Topics

The specialized undergraduate courses in Information Technology Engineering are:

  • Fundamentals of information technology
  • Information technology engineering
  • E-commerce
  • IT project management and control
  • IT strategic planning
  • E-learning
  • Multimedia environments
  • IT project
  • IT internship
  • Computer graphics

Information and Communications Technology Engineering

The curriculum of the discontinuous bachelor's program "ICT Engineering Technology", with ten specializations, received final approval at the 25/6/85 (Iranian calendar) session of the Curriculum Planning Council of the Ministry of Science, Research and Technology. The program was designed and compiled by the Ministry of Information and Communications Technology to meet that ministry's specialized needs, and was submitted to the council secretariat after approval by the industry group.

Dr. Keshtkar, research deputy and director of the curriculum planning office of the Comprehensive University, announced the news and stated that the program is designed to train engineers who can meet the specialized needs of a large number of ICT-related occupations. The program, with 10 specializations for ten categories of engineering and design positions in the Ministry of Communications, was developed jointly by that ministry, the Comprehensive University, and the Ministry of Science. It has been offered since the Iranian year 1385 (2006) at the Faculty of Post and Telecommunications of Iran and, as a pilot since 1387 (2008), at Sharif University of Technology, Iran University of Science and Technology, the University of Tehran, and Amirkabir University.

The IT Authority in Iran

In Iran there was long-running debate over who the main authority for information technology should be, until the Ministry of Post, Telegraph and Telephone was renamed the Ministry of Information and Communications Technology in the Iranian year 1382 (2003) and, more importantly, an Information Technology deputyship was created within it, establishing the ministry as the country's main IT authority. From that year on, the ministry expanded on all fronts: numerous companies and centers were formed under it, each of which, through extensive capabilities and activities, brought about major developments and improved the country's communications in the postal and telecommunications sectors. To formulate the strategies, policies, long-term plans, and qualitative and quantitative goals of IT development and present them to the Supreme Council of Information Technology, a deputyship titled the Information Technology Deputy was included in the ministry's organizational structure, and organizations such as the Infrastructure Information and Communications Technology Organization gradually took shape in this regard. [2]

 

Source: http://fa.wikipedia.org

Using .NET to Design Enterprise Applications by: حنیف شیخ عبدالکریم-Hanif sheikhabdolkarim

This article is an attempt to show how to implement a distributed application in the .NET Framework from scratch. I will share my experience, which I hope will be useful to architects (particularly beginners in architecture design) and to lead developers who want to become architects. The application will contain a simple web client, a CustomerOrderManagement System, built on our own distributed application platform.

The application will cover the following parts:

  • Part 1 (Distributed Application Layers with project details): Learn what layered design means in a distributed environment and how we will name the layers while implementing the actual app.
  • Part 2 (Database and Library Design): Learn about the database design and about implementing the library that interacts with the Edmx container.
  • Part 3 (Engine and Service Managers design): Learn how to implement the engine that holds the core business logic, how to implement the actual WCF service with service contracts, and how to test the service using a test client.
  • Part 4 (Client implementation): Learn how to implement the actual client, which invokes the services, using the MVVM pattern.

I will try to post the remaining parts ASAP.

Prerequisites

To run the WCF service, the database design, and the Silverlight application, you need .NET Framework version 3.0 or greater. Windows Vista has .NET Framework 3.0 installed by default, so you only need to install it if you are on Windows XP SP2. To develop distributed enterprise applications, you should have Visual Studio 2008 or later, SQL Server Management Studio Express 2005, and the Windows SDK. The article source uses VS2010 and SQL Server Management Studio Express 2005.

Why Should I Read This Article?

You may well ask this question when you read the title of this article. If you are an architect, or a lead developer who aspires to architect solutions, spend some time on this (sorry to take your valuable time). When you complete this article, you will feel that you can design complex applications quite easily; I am pretty sure about this. However, I am not going to discuss design principles in depth. If you want, you can learn them from the Microsoft site, which describes everything very clearly. In my view, describing the concepts theoretically again and again without a practical implementation doesn't make sense, so I will describe directly what to do, why, and how, in a realistic setting. I always prefer action to reading an action story :). Designing a complex application is not an easy task. As everyone knows, many decisions need to be made at the architecture, design, and implementation levels, and these decisions affect the application's security, scalability, maintainability, and availability. This article will help you design your application from scratch with a clear separation between layers: the Data Layer, Business Layer, Service Layer, and Presentation Layer. Let's start with action.

What Are These Layers and What Should They Provide?

Let's start by visiting each of these layers and discussing what they should and should not provide. Figure 1 shows the simplified logical component layers this article uses to design the architecture. I am not going to describe many components here; just look at figure 1 to understand the purpose of this article right away.

AppLayer.png

(Figure 1 – Enterprise Application Layers)
  • Presentation Layer - Contains user-related functionality for managing the user's interaction with the system; it generally consists of service calls that communicate with the business logic through the service layer.
  • Service Layer - Consists of the service contracts and message types used to communicate with the business logic, keeping the business layer independent. This layer can be located on a different tier or reside on the same tier.
  • Business Layer - Implements the core functionality of the system and encapsulates the relevant business logic. It generally consists of components, some of which may expose service interfaces that other callers can use.
  • Data Access Layer – Communicates with the database to retrieve and save data using its own context. It exposes generic interfaces that the components in the business layer can consume.
  • Data Layer – Contains the actual raw business data. A DBA can design and maintain this layer.

How Do We Implement and By Using What Technologies?

Let's take figure 1 and add another figure alongside it to describe how to implement the customer order management enterprise application from scratch. Before I start, I will treat some components from these layers as a platform, because the platform can be hosted anywhere and clients can consume whatever they need over the network; this is what makes the environment distributed. Here we will treat the business layer, service layer, and data access layer as a single platform, so that multiple client platforms (mobile, web, etc.) can utilize our COMS (CustomerOrderManagement System). This is the major advantage of our COMS. See figure 2 to get an idea of how we are going to implement the COMS platform, with technology details.

Note: Please note that COMS refers to the CustomerOrderManagement System throughout the article.

AppLayerWithProjectDetails.png - Click to enlarge image

(Figure 2 – Enterprise Application Layers With Project Details)
  • Service Libraries (Data Access Layer) - Interact with Entity Framework through the edmx entity container and provide data to the service engines. We will talk more about this when we come to the library implementation.
  • Service Engines (Business Layer) – Contain the core business logic and interact with the service libraries to get the actual data without contacting the database directly. All business validation should be implemented here for extensibility. We will talk more about this when we come to the engine implementation.
  • Service Managers (Service Layer) - Expose all the COMS functionality as WCF services that clients on any platform can access. A manager never interacts with the database or a library directly, because the business layer contains all the validation that must run before a response is sent to the client. We will talk more about this when we come to the service managers' implementation.
  • Client (Presentation Layer) – The actual user interface, which calls the services. This can be a rich client, a web client, a mobile client, or another service. I am going to use Silverlight to implement the web client in this article. If possible, I will also post a mobile UI implementation for testing our COMS platform as part of this article.

To be frank, I don't know much about mobile app implementation yet, so in one part I will learn, and then explain, how to make use of our COMS platform services on a mobile platform. But you will have to trust me and wait a couple of weeks.

Hanif Sheikhabdolkarim

Ref:http://www.codeproject.com

Welcome to PersianBlog by: پرشین بلاگ
In the name of God

Dear user,

Greetings and respect,

We welcome you to the large family of Persian-language bloggers.
To learn more about the site's services, you can visit the following addresses:

http://help.persianblog.ir for help and tutorials
http://news.persianblog.ir for site news and announcements
http://fans.persianblog.ir for volunteer collaboration in the blogosphere
http://persianblog.ir/ourteam.aspx for the names and blog links of the site's management team

If you run into any problem using the site's services, you can contact us by e-mail at:
support[at]persianblog.ir

and if you observe a violation, at:
abuse[at]persianblog.ir

We also suggest joining the My Pardis virtual community to benefit from that valuable site's services:
http://mypardis.com


Thank you,

Manager of the PersianBlog group of sites
Mehdi Boutorabi

http://ariagostar.com