ABSTRACT
Microsoft Windows Distributed interNet Applications Architecture (Windows DNA) is the application development model for the Windows platform. Windows DNA specifies how to: develop robust, scalable, distributed applications using the Windows platform; extend existing data and external applications to support the Internet; and support a wide range of client devices maximizing the reach of an application. Developers are free from the burden of building or assembling the required infrastructure for distributed applications and can focus on delivering business solutions.
Windows DNA addresses requirements at all tiers of modern distributed applications: presentation, business logic, and data. Like the familiar PC environment, Windows DNA enables developers to build tightly integrated applications by accessing a rich set of application services in the Windows platform using a wide range of familiar tools. These services are exposed in a unified way through the Component Object Model (COM). Windows DNA provides customers with a roadmap for creating successful solutions that build on their existing computing investments and will take them into the future. Using Windows DNA, any developer will be able to build or extend existing applications to combine the power and richness of the PC, the robustness of client/server computing, and the universal reach and global communications capabilities of the Internet.
INTRODUCTION
The increased presence of Internet technologies is enabling global sharing of information not only by small and large businesses, but by individuals as well. The Internet has sparked new creativity in many, resulting in new businesses popping up overnight and running 24 hours a day, seven days a week. Competition and the increased pace of change are creating ever-increasing demand for an application platform that enables application developers to build and rapidly deploy highly adaptive applications in order to gain strategic advantage.
Introducing Windows DNA: Framework for a New Generation of Computing Solutions
Windows DNA refers to the Windows Distributed interNet Application architecture, launched by Microsoft. "Windows DNA is essentially a 'blueprint' that enables corporate developers and independent software vendors (ISVs) to design and build distributed business applications using technologies that are inherent to the Windows platform; it consists of a conceptual model and a series of guidelines to help developers make the right choices when creating new software applications." Applications based on Windows DNA will be deployed primarily by businesses, from small companies to large enterprise organizations. Consumers are likely to use many of the applications built to take advantage of Windows DNA, such as electronic commerce Web sites and online banking applications.
A major force driving the need for Windows DNA is the Internet, which has dramatically changed the computing landscape. Five years ago, the process of developing programs used by one person on one computer was relatively straightforward. By contrast, some of today's most powerful applications support thousands of simultaneous users, need to run 24 hours a day, and must be accessible from a wide variety of devices from handheld computers to high-performance workstations. To meet these demanding requirements, application developers need adequate planning tools and guidance on how to incorporate the appropriate technologies. The Windows DNA architecture addresses this need.
Microsoft Windows Distributed interNet Applications Architecture (Windows DNA) is Microsoft's framework for building a new generation of highly adaptable business solutions that enable companies to fully exploit the benefits of the Digital Nervous System. Windows DNA is the first application architecture to fully embrace and integrate the Internet, client/server, and PC models of computing for a new class of distributed computing solutions. Using the Windows DNA model, customers can build modern, scalable, multitier business applications that can be delivered over any network. Windows DNA applications can improve the flow of information within and without the organization, are dynamic and flexible to change as business needs evolve, and can be easily integrated with existing systems and data. Because Windows DNA applications leverage deeply integrated Windows platform services that work together, organizations can focus on delivering business solutions rather than on being systems integrators.
Guiding Principles of Windows DNA:
Web computing without compromise. Organizations want to create solutions that fully exploit the global reach and "on demand" communication capabilities of the Internet, while empowering end users with the flexibility and control of today's PC applications. In short, they want to take advantage of the Internet without compromising their ability to exploit advances in PC technology.
Interoperability. Organizations want the new applications they build to work with their existing applications and to extend those applications with new functionality. They require solutions that adhere to open protocols and standards so that other vendor solutions can be integrated. They reject approaches that force them to rewrite the legions of applications still in active use today and the thousands still under development.
True integration. In order for organizations to successfully deploy truly scalable and manageable distributed applications, key capabilities such as security, management, transaction monitoring, component services, and directory services need to be developed, tested, and delivered as integral features of the underlying platform. In many other platforms, these critical services are provided as piecemeal, non-integrated offerings often from different vendors, which force IT professionals to function as system integrators.
Lower cost of ownership. Organizations want to provide their customers with applications that are easier to deploy and manage, and easier to change and evolve over time. They require solutions that do not involve intensive effort and massive resources to deploy into a working environment, and that reduce their cost of ownership on both the desktop and server administration sides.
Faster time to market. Organizations want to be able to achieve all of the above while meeting tight application delivery schedules, using mainstream development tools, and without the need for massive re-education or a "paradigm shift" in the way they build software. Windows DNA exposes services and functionality through the underlying "plumbing" to reduce the amount of code developers must write.
Reduced complexity. Windows DNA integrates key services directly into the operating system and exposes them in a unified way through components, reducing the need for information technology (IT) professionals to function as system integrators so they can focus on solving the business problem.
Windows DNA Architecture
The Windows DNA architecture employs standard Windows-based services to address the requirements of each tier in a multitiered solution: user interface and navigation, business logic, and data storage. The services used in Windows DNA, which are integrated through the Component Object Model (COM), include:
1. Dynamic HTML (DHTML)
2. Active Server Pages (ASP)
3. COM components
4. Component Services
5. Active Directory Services
6. Windows security services
7. Microsoft Message Queuing
8. Microsoft Data Access Components
The most commonly used standards in the Web services world are:
1. XML Schema: For message data typing and structuring. It allows defining a common vocabulary that the sending and receiving party may understand for achieving the message interchange goal.
2. WSDL: For associating messages and message exchange patterns (logic interface) with service names and network addresses (endpoints acting as physical interface).
3. WS-Addressing: For including endpoint addressing and reference properties associated with endpoints. Many of the other extended specifications require WS-Addressing support for defining endpoints and reference properties in communication patterns.
4. WS-Policy: For associating quality of service requirements with a WSDL definition. WS-Policy is a framework that includes policy declarations for various aspects of security, transactions, and reliability.
5. WS-Security: For providing message integrity, authentication and confidentiality, security token exchange, message session security, security policy expression, and security for a federation of services within a system.
6. WS-MetadataExchange: For querying and discovering metadata associated with a Web service, including the ability to fetch a WSDL file and associated WS-Policy definitions.
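As a rough illustration of how these XML-based standards structure a message exchange, the sketch below builds and parses a minimal SOAP-style envelope. The element names and namespace are invented for the example, not taken from any real WSDL.

```python
import xml.etree.ElementTree as ET

# Hypothetical request message; the envelope shape echoes SOAP, but the
# element names and namespace are invented for illustration.
message = """<Envelope xmlns="http://example.org/envelope">
  <Body>
    <GetStockPrice>
      <Symbol>MSFT</Symbol>
    </GetStockPrice>
  </Body>
</Envelope>"""

# The receiving party uses the shared vocabulary (the schema) to pull the
# typed fields back out of the wire format.
ns = {"e": "http://example.org/envelope"}
root = ET.fromstring(message)
symbol = root.find("e:Body/e:GetStockPrice/e:Symbol", ns).text
print(symbol)  # MSFT
```

In a real Web service, an XML Schema would constrain the envelope's contents and WSDL would bind `GetStockPrice` to a network endpoint; the sketch only shows the shared-vocabulary idea.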
The challenge was to design a system that would allow the actual presentation layer to run on distributed servers but deliver the data required to produce that presentation using Web services technology. Because of this decision, the system is inherently scalable at the front end. There is no single point of failure for the Web presentation layer. In fact, since most customers are using hosting services that employ multiple front-end servers using failover and round-robin DNS, we get the benefit of their existing hosts' redundancy and scalability.
This architecture allowed us to focus on the next layer of the distributed architecture. Many architects use the term "business services layer" to describe the layer of services that sits behind the presentation services in a 3-tier system.
After years of system design, development, and research on Windows DNA 3-tier systems, Microsoft has added several key aspects to the architecture with Windows 2000. This section covers:
1. Component Services
2. Dynamic HTML (DHTML)
3. Windows Script Components
4. Extensible Markup Language (XML)
5. Active Directory Service Interfaces and IIS
COM allows developers to create complex applications using a series of small software objects.
COM also offers the advantage of programming language independence. That means developers can create COM components using the tools and languages they're familiar with, such as Visual Basic, C, C++ and Java. An easy way to look at it is that COM serves as the glue between the tiers of the architecture, allowing Windows DNA applications to communicate in a highly distributed environment.
DNA - Architecture for Distributed Applications
The following picture shows the different pieces within the DNA architecture, and how they work together:
Server machine:- Placing your business objects on the server increases your control over the entire application, and over configuration issues. It also increases security aspects of the system, and reduces the client-side software footprint.
Central Database:-By keeping all data in a central location, you open up opportunities for data sharing between clients and for central reporting. Business objects need only a central point of entry into the data store. Active Server Pages (ASP) is a server-side scripting environment supported by IIS that combines HTML, VBScript, and COM. ASP scripts run on the web server, and their output is returned to the client as plain HTML. ASP provides default components for interacting with the server or with a database.
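The ASP model described above, in which server-side script is mixed with HTML and only finished HTML reaches the client, can be sketched roughly like this. This is a minimal stand-in, not real ASP syntax, and the names are invented.

```python
# A toy stand-in for the ASP idea: the page template mixes static HTML with
# placeholders that server-side code fills in before the response is sent.
# str.format plays the role of <%= ... %>; names here are invented.
template = "<html><body>Hello, {user}! You have {count} new messages.</body></html>"

def render(user, count):
    # Runs on the web server; the browser only ever receives finished HTML.
    return template.format(user=user, count=count)

print(render("Alice", 3))
```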
Dynamic HTML:-This extension to the HTML standard provides precise placement of objects on the screen, data binding, effects, and dynamic modification capabilities.
Custom Graphics:-Graphics and presentation are the final piece to this puzzle. A consistent GUI provides customers with a pleasing means of interfacing with your application.
Cooperating Components:-Microsoft's Windows DNA strategy rests on Microsoft's vision of cooperating components that are built based on the binary standard called the Component Object Model (COM). COM is the most widely used component software model in the world, available on more than 150 million desktops and servers today. It provides the richest set of integrated services, the widest choice of easy-to-use tools, and the largest set of available applications. In addition, it provides the only currently viable market for reusable, off-the-shelf client and server components.
COM enables software developers to build applications from binary software components that can be deployed at any tier of the application model. These components provide support for packaging, partitioning, and distributed application functionality. COM enables applications to be developed with components by encapsulating any type of code or application functionality, such as a user interface
control or line of business object. A component may have one or more interfaces; each exposes a set of methods and properties that can be queried and set by other components and applications. For example, a customer component might expose various properties such as name, address, and telephone number.
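The interface idea in the preceding paragraph can be sketched outside of COM as follows. This is an illustrative Python analogy with invented class and method names, not actual COM code.

```python
# Illustrative analogy for COM interfaces: a component implements one or
# more named interfaces, and callers ask for the interface they need.
class ICustomerInfo:
    """A set of properties a customer component agrees to expose."""
    def get_name(self):
        raise NotImplementedError
    def get_phone(self):
        raise NotImplementedError

class CustomerComponent(ICustomerInfo):
    def __init__(self, name, phone):
        self._name, self._phone = name, phone

    def query_interface(self, interface):
        # Rough analogue of COM's QueryInterface: return the requested
        # interface if this component implements it, otherwise None.
        return self if isinstance(self, interface) else None

    def get_name(self):
        return self._name
    def get_phone(self):
        return self._phone

cust = CustomerComponent("Contoso Ltd.", "555-0100")
info = cust.query_interface(ICustomerInfo)
print(info.get_name())  # Contoso Ltd.
```

In COM proper the contract is binary and language-independent, which is what lets a Visual Basic client call a C++ component; the sketch only shows the interface-negotiation shape.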
Client Environments and Presentation Tier:-Today, many application developers using cooperating components target the development of their applications to the Windows platform to take full advantage of the rich user interface Windows has to offer. Likewise, customers have come to expect a rich, highly functional user interface from their applications. The extended reach of information and services to customers that the Internet has enabled has created a new challenge for the application developer. The application developer today must develop a user interface that is distributable, available on Windows and non-Windows platforms, and supports a wide range of client environments, from handheld wireless devices to high-end workstations. Yet, applications must be rich with features to stay competitive and maintain the functionality that customers have come to expect.
The business logic tier is the heart of the application, where the application-specific processing and business rules are maintained. Business logic placed in components bridges the client environments and the data tiers. The Windows DNA application platform has been developed through years of innovation in supporting high-volume, transactional, large-scale application deployments, and provides a powerful run-time environment for hosting business logic components.
The application platform for developing Windows DNA applications includes Web services, messaging services, and component services.
Web Services
Integrated with Microsoft's application platform is a high-performance gateway to the presentation tier. Microsoft's Internet Information Server enables the development of Web-based business applications that can be extended over the Internet or deployed over corporate intranets. With IIS, Microsoft introduced a new paradigm for transactional Internet applications. Transactions are the plumbing that makes it possible to run real business applications with rapid development, easy scalability, and reliability.
Microsoft broadened COM's applicability beyond the desktop application to also include distributed applications by introducing Microsoft Transaction Server (MTS). MTS was an extension to the COM programming model that provided services for the development, deployment, and management of component-based distributed applications. MTS was a foundation of application platform services that facilitated the development of distributed applications for the Windows platform in a much simpler, more cost-effective manner than other alternatives. COM+ is the next evolutionary step of COM and MTS. The unification of the programming models inherent in COM and MTS services makes it easier to develop distributed applications by eliminating the tedious nuances associated with developing, debugging, deploying, and maintaining an application that relies on COM for certain services and MTS for others. The benefit to the application developer is that distributed applications become faster, easier, and ultimately cheaper to develop, because less code is required to leverage underlying system services.
To continue to broaden COM and the services offered today in MTS 2.0, COM+ consists of enhancements to existing services as well as new services to the application platform. They include:
Bring your own transaction. COM components are able to participate in transactions managed by non-COM+ transaction processing (TP) environments that support the Transaction Internet Protocol (TIP).
Expanded security. Support for both role-based security and process-access-permissions security. In the role-based security model, access to various parts of an application is granted or denied based on the logical group or role that the caller has been assigned to (for example, administrator, full-time employee, or part-time employee). COM+ expands on the current implementation of role-based security by including method-level security for both custom and IDispatch(Ex)-based interfaces.
Centralized administration. The Component Services Explorer, a replacement for today's MTS Explorer and DCOMCNFG, presents a unified administrative model, making it easier to deploy, manage, and monitor multi-tiered applications by eliminating the overhead of using numerous individual administration tools.
Queued components. Provides asynchronous, deferred execution when cooperating components are disconnected. This is in addition to the session-based, synchronous client/server programming model, in which the client maintains a logical connection to the server.
Event notification. For times when a loosely coupled event notification mechanism is desirable, COM+ Events is a unicast/multicast, publish/subscribe event mechanism that allows multiple clients to "subscribe" to events that are "published" by various servers. This is in addition to the existing event notification framework delivered with connection points.
Load balancing. Load balancing allows component-based applications to distribute their workload across an application cluster in a client-transparent manner.
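The COM+ Events model described above is a classic publish/subscribe pattern. The following sketch shows that pattern in miniature; the names are invented and this is not the COM+ Events API.

```python
# Minimal publish/subscribe sketch: publishers fire events without knowing
# who is listening, and any number of subscribers receive them.
subscribers = {}

def subscribe(event, handler):
    subscribers.setdefault(event, []).append(handler)

def publish(event, payload):
    # Loosely coupled: the publisher only names the event, never a receiver.
    for handler in subscribers.get(event, []):
        handler(payload)

received = []
subscribe("stock.updated", lambda p: received.append(("display", p)))
subscribe("stock.updated", lambda p: received.append(("audit", p)))
publish("stock.updated", {"symbol": "MSFT", "price": 90.25})
print(len(received))  # 2
```

The decoupling is the point: the publisher of "stock.updated" needs no knowledge of the display or audit subscribers, which can come and go independently.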
Microsoft Message Queue Server (MSMQ) provides loosely coupled and reliable network communications services based on a messaging queuing model. MSMQ makes it easy to integrate applications by implementing a push-style business event delivery environment between applications, and to build reliable applications that work over unreliable but cost-effective networks. MSMQ also offers seamless interoperability with other message queuing products, such as IBM's MQSeries, through products available from Microsoft's independent software vendor (ISV) partners.
WINDOWS DNA Universal Data Access
Universal Data Access is Microsoft's strategy for providing access to information across the enterprise. Today, companies building database solutions face a number of challenges as they seek to gain maximum business advantage from the data and information distributed throughout their corporations. Universal Data Access provides high-performance access to a variety of information sources, including relational and nonrelational data, and an easy-to-use programming interface that is tool and language independent.
The foundation for developing enterprise data interoperability solutions on the Microsoft Windows platform is Microsoft Windows Distributed interNet Applications Architecture (Windows DNA). Windows DNA, which is based on the widely used Component Object Model (COM), specifies how to do the following:
Develop robust, scalable, distributed applications using the Windows platform.
Extend existing data and external applications to support Internet operations.
Support a wide range of client devices maximizing the reach of an application.
Figure 1. Microsoft Windows DNA Architecture
Interoperability and reuse are key attributes of Windows DNA. Unlike traditional software development, which required each application to be built from scratch, the Component Object Model (COM) enables developers to create complex applications using a series of small software objects (COM components). For example, a component might be a credit card authorization procedure or the business rules for calculating shipping costs. The COM programming model speeds up the development process by enabling multiple development teams to work on different parts of an application simultaneously.
COM also offers the advantage of programming language independence. This means that Windows developers can create COM components using tools and languages with which they are familiar, such as Microsoft Visual Basic and Visual C++. For non-Windows programmers, including mainframe COBOL developers and Web publishers, COM components can be accessed from simple scripting languages such as VBScript and JScript. Windows DNA simplifies development by providing access to a wide range of services and products developed using a consistent object model: COM.
One example of the services available is what we call COM services for interoperability. COM services for interoperability include the network, data, application, and management services that are part of existing Microsoft products, such as Microsoft SNA Server. COM services for interoperability provide a common approach to system integration using the wide range of COM components available today.
An Interoperability Framework
Microsoft has defined a four-layer framework for interoperability based on industry standards for Network, Data, Applications, and Management or NDAM for short. Microsoft provides access to interoperability components in each of these four categories. This document focuses on the Data interoperability layer, providing an overview of the wide range of COM components available for accessing multiple data stores across an enterprise environment.
Figure 2. Microsoft Interoperability Framework
Enterprises run their daily operations relying on multiple data sources, including database servers, legacy flat-file records, e-mail correspondence, personal productivity documents (spreadsheets, reports, presentations), and Web-based information publishing servers. Typically, applications, end users, and decision makers access these data sources by employing a variety of nonstandard interfaces. Data interoperability standards offer the transparent and seamless ability to access and modify data throughout the enterprise. Microsoft's data interoperability strategy is called Universal Data Access. Universal Data Access uses COM objects to provide one consistent programming model for access to any type of data, regardless of where that data may be found in the enterprise.
An easy-to-use programming architecture that is both tool and language independent, Universal Data Access provides COM objects for high-performance access to a variety of relational (SQL) and nonrelational information sources. The technologies that make up the Universal Data Access strategy enable you to integrate diverse data sources, create easy-to-maintain solutions, and use your choice of best-of-breed tools, applications, and platform services.
To leverage existing investments, Universal Data Access does not require expensive and time-consuming movement of data into a single data store, nor does it require commitment to a single vendor's data products. Universal Data Access is based on open industry specifications with broad industry support, and works with all major established database platforms.
Figure 3. Universal Data Access Architecture
The Microsoft Data Access Components (MDAC) are the key technologies that enable Universal Data Access. Data-driven client/server applications deployed over the Web or a LAN can use these components to easily integrate information from a variety of sources, both relational and nonrelational. These technologies include Open Database Connectivity (ODBC), OLE DB, and Microsoft ActiveX Data Objects (ADO).
ODBC
Open Database Connectivity is an industry standard and a component of Microsoft Windows Open Services Architecture (WOSA). The ODBC interface makes it possible for applications to access relational data stored in almost any database management system (DBMS).
Microsoft's ODBC industry-standard data access interface continues to provide a unified way to access relational data as part of the OLE DB specification. ODBC is a widely accepted application programming interface (API) for database access. It is based on the Call-Level Interface (CLI) specifications from X/Open and ISO/IEC for database APIs and uses Structured Query Language (SQL) as its database access language. Microsoft has implemented a number of ODBC drivers to access diverse data stores. ODBC is widely supported by Microsoft, third party application development products, and end-user productivity applications.
Figure 4. ODBC Architecture
Microsoft offers a number of ODBC drivers as part of the Microsoft Data Access Components, which ships as a feature of many popular Microsoft products, including Microsoft SQL Server, Microsoft Office, the Microsoft BackOffice family of products, Microsoft SNA Server, and Microsoft Visual Studio. The following ODBC drivers are included in MDAC version 2.1:
Microsoft ODBC Driver for SQL Server.
Microsoft ODBC Driver for Oracle.
Microsoft ODBC Driver for Microsoft Visual FoxPro.
Microsoft ODBC Driver for Access (Jet engine).
Additionally, Microsoft SNA Server 4.0 with Service Pack 2 ships with a Microsoft ODBC Driver for DB2.
A number of third-party ISVs offer ODBC drivers for many data sources.
In addition, OLE DB includes a bridge to ODBC to enable continued support for the broad range of ODBC relational database drivers available today. The Microsoft OLE DB Provider for ODBC leverages existing ODBC drivers, ensuring immediate access to databases for which an ODBC driver exists, but for which an OLE DB provider has not yet been written.
Note JDBC is a technology for accessing SQL database data from a Java client program. Microsoft offers a JDBC-to-ODBC bridge that allows Java programmers to access back-end data sources using available ODBC drivers. The Microsoft JDBC-ODBC bridge is part of the core set of classes that come with the Microsoft Virtual Machine (Microsoft VM) for Java.
OLE DB
OLE DB is a strategic system-level programming interface to data across the organization. OLE DB is an open specification designed to build on the success of ODBC by providing an open standard for accessing all kinds of data. Whereas ODBC was created to access relational databases, OLE DB is designed for relational and nonrelational information sources.
Figure 5. OLE DB Components
OLE DB encapsulates various database management system functions that enable the creation of software components implementing such services. OLE DB components consist of data providers, which contain and expose data; data consumers, which use data; and service components, which gather and sort data (such as query processors and cursor engines). OLE DB interfaces are designed to help diverse components integrate smoothly so that OLE DB component vendors can bring high-quality OLE DB products to market quickly.
OLE DB data providers
OLE DB data providers implement a set of core OLE DB interfaces that offer basic functionality. This basic functionality enables other OLE DB data providers, service components, and consumer applications to interoperate in a standard, predictable manner. The MDAC Software Development Kit (SDK) includes a set of OLE DB conformance tests that OLE DB component vendors, as well as end-user consumer developers, can run to ensure a standard level of compatibility. In addition, data providers can implement extended functionality as appropriate for a particular data source.
OLE DB data consumers
OLE DB data consumers can be any software programs that require access to a broad range of data, including development tools, personal productivity applications, database products, or OLE DB service components. A major set of OLE DB data consumers are ActiveX Data Objects (ADO), which provide a means to develop flexible and efficient data interoperability solutions using such high-level programming languages as Visual Basic.
OLE DB service components
OLE DB service components implement functionality not natively supported by some simple OLE DB data providers. For example, some basic OLE DB providers do not support rich sorting, filtering, and finding on their data sources. OLE DB service components, such as the Microsoft Cursor Service for OLE DB and the Microsoft Data Shaping Service for OLE DB, can seamlessly integrate with these basic OLE DB data providers to complete the functionality desired or expected by a given OLE DB consumer application. Universal Data Access allows for the development of generic OLE DB consumer applications that access many data sources in a single, uniform manner. Enterprise developers can write COM components that perform a specific function against nonspecific data sources. Such a component might run against a VSAM data set today and run against a SQL Server table tomorrow. This allows enterprises to migrate from one data store to another as efficiencies allow or business needs require.
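The service-component idea, layering a missing capability such as sorting over a simple provider, can be sketched like this. It is an analogy with invented names, not the OLE DB interfaces themselves.

```python
# Analogy for an OLE DB service component: the simple provider exposes rows
# but cannot sort; a service component wraps it and adds that capability
# without the consumer needing to know the difference.
class SimpleProvider:
    def rows(self):
        return [("banana", 3), ("apple", 5), ("cherry", 1)]

class SortService:
    def __init__(self, provider):
        self._provider = provider

    def rows(self, order_by=0):
        # Same interface as the provider, with sorting layered on top.
        return sorted(self._provider.rows(), key=lambda r: r[order_by])

svc = SortService(SimpleProvider())
print(svc.rows())  # sorted by the first column, so 'apple' comes first
```

Because the wrapper exposes the same `rows` interface, a consumer written against the provider works unchanged against the service component, which is the interoperability point the paragraph above makes.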
Resource pooling is another popular function provided by OLE DB service components. When running under Microsoft Transaction Server, Microsoft Internet Information Server, or on a standalone basis, OLE DB and ADO applications can make use of OLE DB resource pooling, supported by the OLE DB core services, to enable reuse of OLE DB Data Source proxy objects. Typically, the OLE DB session start-up, from instantiation of the OLE DB data source object (DSO) to creating the underlying network connection to the data source, is the most expensive part of a given transaction or unit of work. This is a critical issue when developing Web-based or multi-tier applications. Maintaining connections to the database in the resource state of the middle-tier component can create scalability issues. Creating a new connection on every page of a Web application is too slow. The solution is OLE DB resource pooling, which enables better scalability and offers better performance.
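The pooling argument above can be illustrated with a toy pool. Here sqlite3 stands in for an expensive data-source connection, and the class is a sketch of the idea, not the OLE DB pooling implementation.

```python
import sqlite3

# Toy resource pool: pay the connection start-up cost once, then hand the
# same connection back out to later callers instead of reconnecting.
class ConnectionPool:
    def __init__(self, factory, max_idle=2):
        self._factory = factory
        self._idle = []
        self._max_idle = max_idle

    def acquire(self):
        # Reuse an idle connection when one exists; only pay the creation
        # cost when the pool is empty.
        return self._idle.pop() if self._idle else self._factory()

    def release(self, conn):
        if len(self._idle) < self._max_idle:
            self._idle.append(conn)   # keep it warm for the next caller
        else:
            conn.close()

pool = ConnectionPool(lambda: sqlite3.connect(":memory:"))
first = pool.acquire()
pool.release(first)
second = pool.acquire()
print(first is second)  # True: the connection was reused, not recreated
```

In the Web scenario the paragraph describes, each page request would acquire from and release to the pool, so most requests skip the expensive session start-up entirely.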
ADO
Microsoft ActiveX Data Objects (ADO) is a strategic application-level programming interface to data and information, built on top of OLE DB. ADO provides consistent, high-performance access to data and supports a variety of development needs, including the creation of front-end database clients and middle-tier business objects using applications, tools, languages, or Internet browsers. ADO is designed to be the one data interface needed for single and multi-tier client/server and Web-based data-driven solution development. Its primary benefits are ease of use, high speed, low memory overhead, and a small disk footprint.
ADO provides an easy-to-use programming interface to OLE DB because it uses a familiar metaphor: the COM automation interface, which is accessible from all leading Rapid Application Development (RAD) tools, database tools, and languages (including scripting languages).
Figure 6. ADO Object Model
ADO uses a flatter and more flexible object model than any previous object-oriented data access technology. Any of the five top-level objects can exist independent of the other four. Unlike DAO, which required constructing a hierarchical chain of objects before accessing data, ADO can be used to access data with just a couple of lines of code.
The secret of ADO's strength is the fact that it can connect to any OLE DB provider and still expose the same programming model, regardless of the specific features offered by a particular provider. However, because each provider is unique in its implementation, how your application interacts with ADO may vary slightly when run against different data providers. Some common differences include ADO connection strings, command execution syntax, and supported data types.
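ADO's connect, execute, and read shape survives in later data APIs. As a rough analogue (not ADO itself), Python's DB-API shows the same few-lines-of-code pattern, with sqlite3 standing in for an OLE DB data source and an invented table.

```python
import sqlite3

# Rough analogue of ADO's Connection/Recordset usage via Python's DB-API;
# sqlite3 stands in for an OLE DB provider, and the table is invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, phone TEXT)")
conn.execute("INSERT INTO customers VALUES ('Contoso Ltd.', '555-0100')")

# Open, read, done: no deep object hierarchy to construct first.
rows = conn.execute("SELECT name, phone FROM customers").fetchall()
print(rows)
conn.close()
```

Swapping the connection for a different database driver leaves the rest of the code untouched, which mirrors ADO's same-model-over-any-provider claim (down to provider-specific connection strings).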
Microsoft offers a number of technologies that facilitate interoperability with data stored on non-Windows systems. Developers can create solutions that take advantage of a large installed base of information sources while working in a familiar environment. The following is a collection of some of the Microsoft technologies available today that provide this capability.
Microsoft OLE DB Provider for DB2
The Microsoft OLE DB Provider for DB2 allows application developers to use familiar object-oriented programming techniques to access DB2 databases over SNA LU6.2 and TCP/IP networks, without requiring knowledge of SNA APPC programming.
Implemented using the open protocol of the Distributed Relational Database Architecture (DRDA), the Microsoft OLE DB Provider for DB2 supports access to remote DB2 data using industry-standard Structured Query Language (SQL) statements. Because the provider is part of Microsoft's universal data access strategy, which is based on OLE DB and ADO, it can interoperate with OLE DB-aware tools and applications in Microsoft Visual Studio 6.0, Microsoft SQL Server 7.0, and Microsoft Office 2000 Developer Edition. These generic consumer applications rely on this OLE DB provider's compatibility as verified using the OLE DB conformance tests. A key requirement is that the provider support the IDBSchemaRowset interface. IDBSchemaRowset provides the consumer with a means to query the data source's metadata that describes the target table. Using this schema information, the generic consumer can intelligently process and display the result sets of queries.
Figure 7. OLE DB Provider for DB2 connecting to DB2 for MVS
At run time, generic consumers use IDBSchemaRowset information to decide how to behave with a back-end data source. Other information available to generic consumers at run time includes IDBInfo data on the keywords and literals the data source supports. Typically, consumer applications modify their behavior based on an expected set of values returned through IDBSchemaRowset and IDBInfo, following well-defined rules in the OLE DB specification. Additionally, the OLE DB Provider for DB2 publishes its data types, including precision and scale limits, in the form of the standard OLE DB DBSCHEMA_PROVIDER_TYPES schema rowset.
Another use of IDBSchemaRowset data is to populate a list of the tables and columns available in the data source's current collection. The OLE DB Provider for DB2 maps IDBSchemaRowset requests to DB2 system table information. For example, DBSCHEMA_TABLES is provided using information stored in the DB2 for OS/390 SYSIBM.SYSTABLES and DB2 for OS/400 QSYS2.SYSTABLES tables. DBSCHEMA_COLUMNS is mapped to DB2 for OS/390 SYSIBM.SYSCOLUMNS and DB2 for OS/400 QSYS2.SYSCOLUMNS information. The OLE DB provider queries these DB2 system tables at run time, returning the data on IDBSchemaRowset TABLES and COLUMNS calls.
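The mapping just described can be summarized as a simple lookup table. The sketch below uses the catalog-table names given in the text; the helper function itself is hypothetical, shown only to make the mapping concrete:

```python
# Sketch of how a provider can map generic OLE DB schema rowsets onto
# platform-specific DB2 catalog tables. Table names are those cited in
# the text; the lookup helper is illustrative, not part of the provider.

SCHEMA_ROWSET_SOURCES = {
    ("DBSCHEMA_TABLES", "OS/390"): "SYSIBM.SYSTABLES",
    ("DBSCHEMA_TABLES", "OS/400"): "QSYS2.SYSTABLES",
    ("DBSCHEMA_COLUMNS", "OS/390"): "SYSIBM.SYSCOLUMNS",
    ("DBSCHEMA_COLUMNS", "OS/400"): "QSYS2.SYSCOLUMNS",
}

def catalog_table(rowset, platform):
    """Return the DB2 system table a schema-rowset request would query."""
    return SCHEMA_ROWSET_SOURCES[(rowset, platform)]

print(catalog_table("DBSCHEMA_COLUMNS", "OS/400"))  # QSYS2.SYSCOLUMNS
```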
Two examples of this usage:
The Visual Studio Data Designer, which allows developers to preview DB2 tables and drag these tables and columns into the query designer.
The SQL Server Data Transformation Services, which offers the end user an intuitive wizard for picking tables for bulk movement of data between DB2 and any other OLE DB or ODBC data source.
OLE DB providers implement a default read-only, forward-only server cursor that is mapped to the data source's cursor engine whenever possible. In the case of the OLE DB Provider for DB2, the server cursor offered is a forward-only updateable cursor. A server cursor offers the developer the most efficient means to traverse tables on the data source. Some generic consumers may expect and request a scrollable cursor when, as in the case of the OLE DB Provider for DB2, only a forward-only cursor is offered by the provider. In these cases, the generic consumer can request a client-side cursor that is implemented on behalf of the provider by the Microsoft Cursor Service for OLE DB. In ADO, a developer can specify the use of the ADO Client Cursor Engine (CCE) simply by setting the CursorLocation property to adUseClient.
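The difference between the two cursor styles can be mimicked in plain Python (a conceptual analogy, not OLE DB code): a forward-only server cursor behaves like a generator that can be consumed only once, while a client-side cursor service fetches the whole result and caches it so the consumer can scroll freely.

```python
# Conceptual analogy: forward-only server cursor vs. client-side cursor.

def server_cursor(rows):
    """Forward-only: each row can be consumed exactly once, in order."""
    for row in rows:
        yield row

def client_cursor(rows):
    """Client-side: materialize the whole result so it is scrollable."""
    return list(server_cursor(rows))

data = ["row1", "row2", "row3"]

fwd = server_cursor(data)
first_pass = list(fwd)
second_pass = list(fwd)   # exhausted: nothing left to scroll back to

cached = client_cursor(data)
assert cached[0] == "row1" and cached[-1] == "row3"  # random access works
print(first_pass)    # ['row1', 'row2', 'row3']
print(second_pass)   # []
```

This is why a cursor service exists at all: it trades extra memory on the client for scrollability the provider cannot offer.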
Distributed Query Processor
The Distributed Query Processor (DQP) feature of Microsoft SQL Server 7.0 enables application developers to develop heterogeneous queries that join tables in disparate databases. To access the remote data sources, a user must create a Linked Server definition. The Linked Server encapsulates all of the network, security, and data source-specific information required to connect DQP to the remote data. Linked servers rely on underlying OLE DB providers or ODBC drivers for the actual physical connection to the target data source. Once a linked server has been defined, it can always be referred to with a single logical name as part of a SQL Server dynamic SQL statement or stored procedure. At run time, a linked server resource, such as a remote DB2 table, is treated like a local SQL Server table.
When a client application executes a distributed query using a linked server, SQL Server analyzes the command and sends any requests for linked server data via OLE DB to that data source. If security credentials were specified when the linked server was created, those credentials are passed to the linked server. Otherwise, SQL Server will pass the credentials of the current SQL Server login. When SQL Server receives the returned data, it is processed in conjunction with other portions of the query.
DQP can concurrently access multiple heterogeneous sources on local and remote computers. Additionally, it supports queries against both relational and nonrelational data by using OLE DB interfaces implemented in OLE DB providers. Using DQP, SQL Server administrators and end user developers can create linked server queries that run against multiple data sources with little or no modifications required. For example, a Visual Basic developer can create a single linked server query that selects and inserts data stored in DB2 today, and then can change the linked server name and run the same query against Microsoft SQL Server tomorrow. In this way, DQP provides the developer with an isolation level against changes in the storage engine.
Further, DQP is an efficient tool with which to join information from multiple tables spanning multiple data sources. For example, let's say you are a regional sales manager for a large retail company with subsidiaries located in several countries. Because of mergers and acquisitions, some regional offices store their data in different databases from those of the corporate headquarters. The United Kingdom subsidiary stores its data in DB2; the Australian subsidiary stores its data in Microsoft Access; the Spanish subsidiary stores its data in Microsoft Excel; and the United States subsidiary stores its data in Microsoft SQL Server. You want a report that lists, on a quarterly basis for the last three years, the subsidiaries and the sales locations with the highest quarterly sales figures. Joining the required data tables can be accomplished in real time by using a single distributed query, running on Microsoft SQL Server.
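Linked-server tables are referenced with four-part names (server.catalog.schema.object), which is what lets the same query be retargeted by changing only the linked server name. The sketch below builds such a query as a string; the linked-server and table names are invented for illustration:

```python
# Illustrative only: composing a distributed query against linked servers.
# Server, catalog, schema, and table names here are hypothetical.

def four_part(server, catalog, schema, table):
    """SQL Server four-part name for a linked-server object."""
    return f"{server}.{catalog}.{schema}.{table}"

query = (
    "SELECT uk.Quarter, uk.Sales AS uk_sales, us.Sales AS us_sales\n"
    f"FROM {four_part('DB2UK', 'SALESDB', 'ADMIN', 'SALES')} AS uk\n"
    f"JOIN {four_part('SQLUS', 'Sales', 'dbo', 'Quarterly')} AS us\n"
    "  ON uk.Quarter = us.Quarter"
)
print(query)
```

Retargeting the DB2 source at SQL Server later would mean changing only the first argument to `four_part`, which is the isolation the text describes.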
Why Visual Studio 6.0
The Microsoft Visual Studio 6.0 development system is a comprehensive suite of industry-leading development tools for building business applications for Windows 2000 Server, including client/server, multitier, and Web-based solutions. Visual Studio 6.0 includes key enterprise and team development features designed to help developers rapidly build scalable distributed applications that can be easily integrated with existing enterprise systems and applications.
Building on a Successful Base
COM+ builds on the proven success of COM.
COM is in use on 200 million systems worldwide.
COM supports a vibrant component marketplace. The demand for third-party components based on COM has been estimated to be $410 million this year, with a projected 65 percent compound annual growth rate, and it is expected to grow to approximately $3 billion by 2001 (source: Giga Information Group). This base of available components allows developers to choose from a wide variety of components to assemble applications and solutions, which has revolutionized development on Windows platforms.
COM supports thousands of available applications, including the highest-volume applications in the industry.
Major system vendors, such as Hewlett-Packard Co., Digital Equipment Corp., and Siemens Nixdorf Information Systems Inc., have announced plans to ship COM on UNIX and non-UNIX systems within the year, and additional vendor commitments are expected to follow. In addition, Software AG has ported COM to many operating systems, including Solaris and MVS.
COM consists of a well-defined, mature, and stable specification, as well as a reference implementation, which has been widely tested and adopted worldwide as a de facto standard.
COM is supported by the largest number of development tools available for any component or object model on the market.
COM+ enables the creation of the next generation of component-based applications, making it even easier to build and use components and providing richer, extensible services.
New Features in COM+
COM+ is an evolutionary extension to the Component Object Model (COM), the most widely used component technology in the world. COM+ makes it even easier for developers to create software components in any language, using any development tool. COM+ builds on the same factors that have made today's COM the choice of developers worldwide, including the following:
The richest integrated services, including transactions, security, message queuing, and database access, to support the broadest range of application scenarios.
The widest choice of tools from multiple vendors using multiple development languages.
The largest customer base for customizable applications and reusable components.
Proven interoperability with users' and developers' existing investments.
Host Integration Server 2000 solves the problem of integrating the Microsoft Windows operating system with other non-Windows enterprise systems running on platforms such as IBM mainframes, AS/400, and UNIX. By using the powerful and comprehensive bidirectional integration services of Host Integration Server 2000, developers are freed from platform boundaries and can build highly scalable, distributed applications that incorporate existing processes and data without requiring any recoding or "wrapping" of existing code. This allows businesses to quickly build new business-critical Windows DNA 2000 applications while preserving investment in best-of-breed and custom in-house developed solutions. Host Integration Server 2000 includes three levels of integration that let developers make the most of existing computing resources.
Each Enterprise Application Integration (EAI) project is unique and can include requirements for one or all of the following types of integration provided by Host Integration Server 2000:
Network and Security Integration
Host Integration Server 2000 provides comprehensive managed host access, seamlessly connecting legacy host systems with client/server and Web networks. Utilizing the Windows 2000 Active Directory service, Host Integration Server integrates host-based security.
Data Integration
Host Integration Server 2000 provides complete, secure access to enterprise data through object-oriented and programmatic access to relational DB2 data and flat file data on mainframes, AS/400, UNIX, Windows 2000, and Windows NT Server.
Application Integration
Utilizing the COM Transaction Integrator (COMTI), developers can build distributed applications that integrate Microsoft Transaction Server (MTS) with IBM host CICS and IMS transactions. Instead of learning the intricacies of host code, Web developers can use COMTI to wrap CICS and IMS transactions and expose them as COM objects available for use in distributed Web applications.
Host Integration Server 2000 also makes the most of Windows DNA 2000 core "plumbing" code, such as security, transaction support, and queuing, freeing developers to create new functionality. The same core services reduce complexity for system administrators. Tight integration with Windows 2000 makes managing host and application access easier, less expensive, and more secure.
Introducing Microsoft Site Server 3.0
Microsoft Site Server 3.0, a member of the Microsoft BackOffice server family, provides a powerful alternative to the traditional methods of intranet development and maintenance. Site Server 3.0 is a powerful intranet server, optimized for the Microsoft Windows NT Server operating system with Internet Information Server, that makes publishing and finding information easier and faster. By deploying Site Server 3.0, businesses can use the intranet to efficiently gather the collective expertise of the organization from wherever it resides (Web sites, databases, file servers, e-mail) and deliver it to facilitate the sharing of knowledge and to improve business productivity.
Site Server 3.0 includes a unique set of features that are designed to work together to optimize information sharing across the organization:
Content Management provides a structured publishing process for multiple content authors to submit, tag, and edit content through a drag-and-drop Web interface. Site editors can then approve, edit, and enforce uniform guidelines for content.
Content Deployment enables administrators to deploy content securely and robustly across multiple distributed servers.
Search enables users to perform full-text and property searches across various stores and formats, including HTTP, file systems, Exchange files, and databases.
Personalization & Membership provides easy authoring and targeting of personalized information based on user profiles and behaviors, and the ability to present search results in highly customizable user views and personalized Web pages.
Push enables businesses to create delivery channels for Microsoft Internet Explorer 4.0. Channel agents, based on Microsoft Active Channel Server, enable intranet developers and administrators to create channels from databases and file systems as well as Search and Index Server. The Active Channel Multicaster saves valuable network bandwidth by using multicast technology to deliver channels.
Knowledge Manager is a centralized Web-based application that integrates the Site Server knowledge management features to enable users to easily browse, search, share, and subscribe to relevant information.
Analysis transforms raw hits recorded in server log files into valuable information about the requests, visits, and users that interact with an intranet. This allows businesses to measure the effectiveness of an intranet. Analysis also captures content and site structure to identify issues, such as pages with long load times and out-of-date content.
A BizTalk-based document is an XML file that deploys the tags from a certain vocabulary and follows the rules that the organization has defined for that type of document. A BizTalk-based document is actually exchanged by two BizTalk servers across a network. In Figure 1 you can see the overall BizTalk architecture, illustrating the role of XML for exchanging data between commercial partners. Both parties continue to manage documents in their own native formats on their own platforms, but data moves back and forth, despite architectural differences, thanks to XML.
Figure 1 BizTalk Architecture
BizTalk is central to Windows DNA 2000, and will be one of the key tools that help build e-commerce solutions. More often than not, today's e-commerce solutions require integration with existing information systems and data residing on host machines.
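The interchange pattern described above can be sketched in a few lines: one partner serializes a tagged business document to XML, and the other parses the same bytes back into its own structures. The tag vocabulary below is invented for illustration and is not a real BizTalk schema:

```python
# Illustrative only: a minimal XML business document of the kind the text
# describes. The element names here are hypothetical, not a BizTalk schema.
import xml.etree.ElementTree as ET

# Sender builds the document from its own native data...
po = ET.Element("PurchaseOrder")
ET.SubElement(po, "Buyer").text = "Contoso"
ET.SubElement(po, "Item", {"sku": "1234", "qty": "10"})

wire_format = ET.tostring(po, encoding="unicode")
print(wire_format)

# ...and the receiving partner parses the same bytes on its own platform.
parsed = ET.fromstring(wire_format)
print(parsed.find("Buyer").text)   # Contoso
```

Because both sides agree only on the XML vocabulary, each can keep managing documents in its own native format, as the text notes.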
MICROSOFT WINDOWS DISTRIBUTED INTERNET APPLICATIONS
Windows DNA: Building Windows Applications for the Internet Age
Windows DNA Technologies
Windows DNA services are exposed in a unified way through COM for applications to use. These services include component management, Dynamic HTML, Web browser and server, scripting, transactions, message queuing, security, directory, database and data access, systems management, and user interface.
Adhering to open protocols and published interfaces makes it easy to integrate other vendor solutions and provides broad interoperability with existing systems.
Because Windows DNA is based on COM and open Internet standards, developers can use any language or tool to create compatible applications.
Microsoft developed the Windows Distributed interNet Application Architecture (Windows DNA) as a way to fully integrate the Web with the n-tier model of development. Windows DNA defines a framework for delivering solutions that meet the demanding requirements of corporate computing, the Internet, intranets, and global electronic commerce, while reducing overall development and deployment costs.
FEATURES AND ADVANTAGES OF WINDOWS DNA
DNA helps to design and build multi-tier client/server applications.
DNA provides client transparency.
DNA applications provide full transactional processing support.
DNA can be used to create applications that are fault tolerant.
DNA is ideal for distributed applications.
The DNA methodology draws on many existing technologies to help design and implement robust, distributed applications. It visualizes the whole application as a series of tiers, with the client at the top and the data store at the bottom. The core of DNA is the use of business objects in a middle tier of the application. In DNA, business objects are implemented as software components. These components can be accessed by the client interface application or by another component, and can themselves call on other components, data stores, and so on. Componentization of business rules brings many benefits, such as easier maintenance, encapsulation of the rules, and protection of intellectual property. Hence, DNA is an approach to design that can speed up overall development time while creating more reliable and fault-tolerant applications that are easily distributable over a whole variety of networks.
There are several more good reasons why companies should base their applications on Windows DNA. Because the architecture is built on open protocols and industry standards,
solutions from other vendors integrate easily into the environment. This helps ensure interoperability with mission-critical business applications, such as corporate databases and enterprise resource planning systems. An open approach also facilitates compatibility with existing computing systems, which means that companies can continue to take advantage of their legacy systems as opposed to replacing them.
TOP WINDOWS DNA PERFORMANCE MISTAKES
Throughput refers to the amount of work (number of transactions) an application can perform in a measured period of time, and is often expressed in transactions per second (tps). Scalability refers to how linearly throughput changes when resources are increased or decreased; it is what allows an application to support anywhere from a handful to thousands of users, simply by adding or subtracting resources as necessary to "scale" the application. Finally, transaction time refers to the amount of time needed to acquire the necessary resources, plus the amount of time the transaction spends actually using those resources.
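The three measures can be made concrete with a little arithmetic (the numbers below are made up for illustration):

```python
# Sketch of the three measures defined above; all figures are invented.

def throughput(transactions, seconds):
    """Work done per unit time, in transactions per second (tps)."""
    return transactions / seconds

def transaction_time(acquisition_time, usage_time):
    """Time to acquire resources plus time spent actually using them."""
    return acquisition_time + usage_time

tps_small = throughput(5_000, 60)    # e.g. one server
tps_large = throughput(20_000, 60)   # e.g. four servers

# Perfectly linear scalability means 4x the resources gives 4x the tps:
scaling_factor = tps_large / tps_small
print(scaling_factor)                   # 4.0
print(transaction_time(0.05, 0.20))     # 0.25 seconds per transaction
```

Real systems fall short of the linear ideal, which is exactly the gap the "performance vs. scalability" discussion below is about.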
Producing a good n-tier application often entails a series of judgments in planning and implementing the final product. When those decisions are poorly made, development teams can encounter time-consuming and often difficult-to-solve performance problems after the application has been installed and implemented. Fortunately, many of these problems can be anticipated and prevented. This article shows you how to find and eliminate them early in the development process. The mistakes that follow were identified by Microsoft Consulting Services (MCS) consultants worldwide.
Misunderstanding the Relationship between Performance and Scalability
Performance and scalability are not the same, but neither are they at odds. For example, an application may process information at an incredibly fast rate as long as the number of users sending it information is less than 100. When that application reaches the point at which 10,000 users are simultaneously providing input, the performance may degrade substantially, because scalability wasn't high enough in the list of considerations during the development cycle. On the other hand, that same high-performance application may be partially rewritten in a subsequent iteration and have no problem handling 100,000 customers at one time. By then, however, a substantial number of customers may have migrated to a product someone else got right the first time.
Sometimes applications seek scalability in terms of number of concurrent users strictly through performance, with the idea being that the faster a server application runs, the more users can be supported on a single server. The problem with this approach is that increasing the number of simultaneous users may create a bottleneck that will actually reduce the level of performance as the load increases. One cause of this kind of behavior is caching state and data in the middle tier. By avoiding such caching in the design phase of the development process, countless hours of backtracking and rewriting code can be avoided.
Acquiring the necessary resources can be slowed by such factors as network latency, disk access speed, database locking scheme, and resource contention. Added to that are elements that can affect resource usage time, such as network latency, user input, and sheer volume of work. Windows DNA application developers should concentrate on keeping resource acquisition and resource usage times as low as possible.
There are several ways to manage some of these factors:
Avoid involving user interaction as part of a transaction.
Avoid network interaction as part of a transaction.
Acquire resources late and release them early.
Make more resources available. Otherwise, use MTS to pool resources that are in short supply or are expensive to create.
Use MTS to share resources between users because it is usually more expensive to create a new resource than to reuse an existing one.
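The "acquire resources late and release them early" rule from the list above can be sketched with a context manager: all local work happens first, and the scarce resource is held only for the statement that actually needs it. The connection-tracking code is a stand-in for a real pool, purely for illustration:

```python
# Sketch of "acquire late, release early". LIVE_CONNECTIONS stands in for
# a real, scarce connection pool; the names are hypothetical.
from contextlib import contextmanager

LIVE_CONNECTIONS = []

@contextmanager
def connection(name):
    LIVE_CONNECTIONS.append(name)      # acquire as late as possible...
    try:
        yield name
    finally:
        LIVE_CONNECTIONS.remove(name)  # ...release as early as possible

def record_order(order):
    formatted = f"INSERT ... {order}"  # do all local work up front
    with connection("db"):             # hold the resource only briefly
        result = f"executed: {formatted}"
    return result

print(record_order("order-42"))
print(LIVE_CONNECTIONS)   # [] -- nothing held after the transaction
```

The shorter the span inside the `with` block, the less time other users spend contending for the same resource.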
The middle tier must be scalable so that the largest number of simultaneous users can be logged on without compromising throughput to an unacceptable level.
A middle tier in an n-tier application is necessarily complex because of the role it plays in the overall application. The specific tasks it performs can be separated into three general categories that are essential to Windows DNA applications. The first task involves receiving input from the presentation tier. This input can be done programmatically or may come directly from a user. It may include information about (or a request for) almost anything. Second, a middle tier is responsible for interacting with the data services to perform the business operations that the application was designed to automate. For example, this might include sorting and combining information from different mailing lists to target a specific audience that was never previously considered to be a cohesive group. Finally, a middle tier returns processed information to the presentation tier so it can be used however the program or user sees fit. Within these three areas, performance can degrade significantly when developers use programming practices that are either little understood or mistakenly embraced as the "right" thing to do.
Performing Data-Centric Work in a Middle Tier
Developers sometimes fall into the trap of including data-centered tasks with the business-services work in a middle tier instead of the data-services tier where they belong. Rules are frequently too rigid to account for all cases, but it would be very unusual to find a justification for breaking this one. If data-centered tasks are included in the middle tier, your Windows DNA application is likely to perform more poorly than it would otherwise. For example, it would be a mistake to retrieve multiple data sets from different tables and then join, sort, or search the data in middle-tier objects. The database is designed to handle this kind of activity, and moving it to a middle tier is almost certainly a bad practice.
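The anti-pattern and its fix can be sketched side by side. The table and column names below are invented for illustration; the point is that the join-and-sort work on the left belongs in a single SQL statement on the data tier instead:

```python
# Anti-pattern sketch: fetch two result sets, then join and sort them in
# middle-tier code. Table/column names are hypothetical.

customers = [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Bo"}]
orders = [{"cust_id": 2, "total": 70}, {"cust_id": 1, "total": 30}]

by_id = {c["id"]: c["name"] for c in customers}
joined = sorted(
    ({"name": by_id[o["cust_id"]], "total": o["total"]} for o in orders),
    key=lambda r: r["total"], reverse=True,
)
print(joined[0])   # {'name': 'Bo', 'total': 70}

# Better: push the same work down to the database in one statement and
# let its optimizer and indexes do the joining and sorting.
PUSH_DOWN_SQL = """
SELECT c.name, o.total
FROM customers c
JOIN orders o ON o.cust_id = c.id
ORDER BY o.total DESC
"""
```

Besides shifting work onto hardware tuned for it, the push-down version avoids dragging two full result sets across the network just to discard most of the data.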
Even with the ever-increasing processor power available for database servers, poorly tuned indexes and queries can bring an otherwise robust system to its knees. It is quite common to see developers coding stored procedures or queries without consulting the database administrator (DBA), or even running a project with no DBA involvement.
CONCLUSION
The Windows DNA architecture and the Windows NT platform offer many distinct advantages to customers and their ISV partners. Key benefits include:
Providing a comprehensive and integrated platform for distributed applications, freeing developers from the burden of building the required infrastructure or assembling it using a piecemeal approach.
Easy interoperability with existing enterprise applications and legacy systems to extend current investments.
Making it faster and easier to build distributed applications by providing a pervasive component model, extensive prebuilt application services, and a wide choice of programming languages and tools.
Windows DNA applications have proven themselves in a wide range of circumstances, and the value they represent in the modern distributed computing environment has been thoroughly demonstrated. They have, however, also shown themselves to require careful planning and thorough testing throughout the development process. Avoiding the kinds of mistakes noted in this article should reduce the amount of resources required to produce the kind of Windows DNA application you want. Performance and load testing is unavoidable. Do it in a manner that simulates real-world conditions for your particular application and you'll be rewarded with an n-tier application that works and works well.
FUTURE SCOPE
Microsoft was compelled to release .NET. First and foremost, .NET is a framework that covers all the layers of software development above the operating system. It provides the richest level of integration among presentation, component, and data technologies ever seen on a Microsoft, or any, platform. Second, the entire architecture has been created to make it as easy to develop Internet applications as it is to develop for the desktop. The .NET architecture also provides a wrapping of COM technologies.
More than 200,000 developers are working on .NET in India, and 64 percent of the developer community currently uses .NET.
INDIGO: The Future Technology for Building Distributed Applications
Indigo is Microsoft's unified programming model for building service-oriented applications with managed code. It enables developers to build secure, reliable, transacted Web services that integrate across platforms and interoperate with existing investments. Indigo combines and extends the capabilities of existing Microsoft distributed-system technologies, including Enterprise Services, System.Messaging, and .NET Remoting, spanning cross-process, cross-machine, cross-subnet, and cross-intranet communication across a range of topologies, protocols, and security models. Indigo is built on and extends the .NET Framework 2.0, which is part of the Windows release code-named Longhorn, and it will also be made available on the Windows XP and Windows Server 2003 editions.
REFERENCES
For more information on Universal Data Access, see http://www.microsoft.com/data/.
For more on the COM specification, see the COM Web site, located at http://www.microsoft.com/com/.
For additional information, see Q218590 INF: "Configuring Data Sources for the Microsoft OLE DB Provider (DB2)" at http://support.microsoft.com/search/default.asp.
For more information, see http://www.microsoft.com/data/oledb/.
For more information, see http://www.microsoft.com/data/ado/.
For more information, see http://www.microsoft.com/java/.
For more information on ODBC, see http://www.microsoft.com/data/odbc/.
For more information on these third-party offerings, see http://www.microsoft.com/sql/.
CONTENTS
1. Abstract
2. Introduction to Windows DNA
3. Guiding Principles of Windows DNA
4. Architecture of Windows DNA
DNA-Architecture for Distributed Applications
An Interoperability Framework
5. Microsoft Windows DNA Applications
6. Features and Advantages of Windows DNA
7. Top Windows DNA Performance Mistakes
8. Conclusion
9. References
10. Future Scope
Introducing Windows DNA: Framework for a New Generation of Computing Solutions
Windows DNA refers to the Windows Distributed interNet Application architecture, launched by Microsoft. "Windows DNA is essentially a 'blueprint' that enables corporate developers and independent software vendors (ISVs) to design and build distributed business applications using technologies that are inherent to the Windows platform; it consists of a conceptual model and a series of guidelines to help developers make the right choices when creating new software applications." Applications based on Windows DNA will be deployed primarily by businesses, from small companies to large enterprise organizations. Consumers are likely to use many of the applications built to take advantage of Windows DNA, such as electronic commerce Web sites and online banking applications.
A major force driving the need for Windows DNA is the Internet, which has dramatically changed the computing landscape. Five years ago, the process of developing programs used by one person on one computer was relatively straightforward. By contrast, some of today's most powerful applications support thousands of simultaneous users, need to run 24 hours a day, and must be accessible from a wide variety of devices from handheld computers to high-performance workstations. To meet these demanding requirements, application developers need adequate planning tools and guidance on how to incorporate the appropriate technologies. The Windows DNA architecture addresses this need.
Microsoft Windows Distributed interNet Applications Architecture (Windows DNA) is Microsoft's framework for building a new generation of highly adaptable business solutions that enable companies to fully exploit the benefits of the Digital Nervous System. Windows DNA is the first application architecture to fully embrace and integrate the Internet, client/server, and PC models of computing for a new class of distributed computing solutions. Using the Windows DNA model, customers can build modern, scalable, multitier business applications that can be delivered over any network. Windows DNA applications can improve the flow of information within and without the organization, are dynamic and flexible to change as business needs evolve, and can be easily integrated with existing systems and data. Because Windows DNA applications leverage deeply integrated Windows platform services that work together, organizations can focus on delivering business solutions rather than on being systems integrators.
Guiding Principles of Windows DNA:
Web computing without compromise. Organizations want to create solutions that fully exploit the global reach and "on demand" communication capabilities of the Internet, while empowering end users with the flexibility and control of today's PC applications. In short, they want to take advantage of the Internet without compromising their ability to exploit advances in PC technology.
Interoperability. Organizations want the new applications they build to work with their existing applications and to extend those applications with new functionality. They require solutions that adhere to open protocols and standards so that other vendor solutions can be integrated. They reject approaches that force them to rewrite the legions of applications still in active use today and the thousands still under development.
True integration. In order for organizations to successfully deploy truly scalable and manageable distributed applications, key capabilities such as security, management, transaction monitoring, component services, and directory services need to be developed, tested, and delivered as integral features of the underlying platform. In many other platforms, these critical services are provided as piecemeal, non-integrated offerings often from different vendors, which force IT professionals to function as system integrators.
Lower cost of ownership. Organizations want to provide their customers with applications that are easier to deploy and manage, and easier to change and evolve over time. They require solutions that do not involve intensive effort and massive resources to deploy into a working environment, and that reduce their cost of ownership on both the desktop and server administration sides.
Faster time to market. Organizations want to be able to achieve all of the above while meeting tight application delivery schedules, using mainstream development tools, and without the need for massive re-education or a "paradigm shift" in the way they build software. By exposing services and functionality through the underlying "plumbing," Windows DNA reduces the amount of code developers must write.
Reduced complexity. Integrate key services directly into the operating system and expose them in a unified way through COM components. Reduce the need for information technology (IT) professionals to function as system integrators so they can focus on solving the business problem.
Windows DNA Architecture
Windows DNA architecture employs standard Windows-based services to address the requirements of each tier in the multitier solution: user interface and navigation, business logic, and data storage. The services used in Windows DNA, which are integrated through the Component Object Model (COM), include:
1. Dynamic HTML (DHTML)
2. Active Server Pages (ASP)
3. COM components
4. Component Services
5. Active Directory Services
6. Windows security services
7. Microsoft Message Queuing
8. Microsoft Data Access Components
The most commonly used standards in the Web services world are:
1. XML Schema: For message data typing and structuring. It allows defining a common vocabulary that the sending and receiving party may understand for achieving the message interchange goal.
2. WSDL: For associating messages and message exchange patterns (logic interface) with service names and network addresses (endpoints acting as physical interface).
3. WS-Addressing: For including endpoint addressing and reference properties associated with endpoints. Many of the other extended specifications require WS-Addressing support for defining endpoints and reference properties in communication patterns.
4. WS-Policy: For associating quality of service requirements with a WSDL definition. WS-Policy is a framework that includes policy declarations for various aspects of security, transactions, and reliability.
5. WS-Security: For providing message integrity, authentication and confidentiality, security token exchange, message session security, security policy expression, and security for a federation of services within a system.
6. WS-MetadataExchange: For querying and discovering metadata associated with a Web service, including the ability to fetch a WSDL file and associated WS-Policy definitions.
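To make the role of XML in these standards concrete, here is a minimal sketch, in Python, of building and parsing a SOAP-style message envelope of the kind the WS-* specifications above layer upon. The operation name, parameter, and `urn:example` namespace are illustrative assumptions, not part of any particular service contract.

```python
# Illustrative sketch: a SOAP-style XML envelope built and parsed with
# the Python stdlib. Operation/parameter names and the body namespace
# are made up for the example.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_envelope(operation, params, body_ns="urn:example"):
    """Wrap an operation call in an Envelope/Body structure."""
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    op = ET.SubElement(body, f"{{{body_ns}}}{operation}")
    for name, value in params.items():
        child = ET.SubElement(op, f"{{{body_ns}}}{name}")
        child.text = str(value)
    return ET.tostring(env, encoding="unicode")

xml_text = build_envelope("GetQuote", {"symbol": "MSFT"})

# The receiving party parses the same agreed-upon structure back out,
# which is what a shared vocabulary (XML Schema) makes possible:
root = ET.fromstring(xml_text)
body = root.find(f"{{{SOAP_NS}}}Body")
```

Both sides only have to agree on the element names and namespaces; the transport and endpoint details are what WSDL and WS-Addressing then describe.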
The challenge was to design a system that would allow the actual presentation layer to run on distributed servers but deliver the data required to produce that presentation using Web services technology. Because of this decision, the system is inherently scalable at the front end. There is no single point of failure for the Web presentation layer. In fact, since most customers are using hosting services that employ multiple front-end servers using failover and round-robin DNS, we get the benefit of their existing hosts' redundancy and scalability.
This architecture allowed us to focus on the next layer of the distributed architecture. Many architects use the term "business services layer" to describe the layer of service that sits behind the presentation services in a 3-tier system.
After years of system design, development, and research on Windows DNA 3-tier systems, Microsoft has added several key aspects to the architecture with Windows 2000. This section contains:
1. Component Services
2. Dynamic Hypertext Markup Language (DHTML)
3. Windows Script Components
4. Extensible Markup Language (XML)
5. Active Directory Service Interfaces and IIS
COM allows developers to create complex applications using a series of small software objects.
COM also offers the advantage of programming language independence. That means developers can create COM components using the tools and languages they're familiar with, such as Visual Basic, C, C++ and Java. An easy way to look at it is that COM serves as the glue between the tiers of the architecture, allowing Windows DNA applications to communicate in a highly distributed environment.
DNA - Architecture for Distributed Applications
The following picture shows the different pieces within the DNA architecture, and how they work together:
Server machine:- Placing your business objects on the server increases your control over the entire application and over configuration issues. It also improves the security of the system and reduces the client-side software footprint.
Central Database:- By keeping all data in a central location, you open up opportunities for data sharing between clients and for central reporting. Business objects need only a central point of entry into the data store.
Active Server Pages:- ASP is a server-side scripting technology supported by IIS that combines HTML, script languages such as VBScript, and COM. The scripts run on the web server, and their output is returned to the client as HTML. ASP provides built-in components for interacting with the server or with a database.
Dynamic HTML:-This extension to the HTML standard provides precise placement of objects on the screen, data binding, effects, and dynamic modification capabilities.
Custom Graphics:-Graphics and presentation are the final piece to this puzzle. A consistent GUI provides customers with a pleasing means of interfacing with your application.
Cooperating Components:-Microsoft's Windows DNA strategy rests on Microsoft's vision of cooperating components that are built based on the binary standard called the Component Object Model (COM). COM is the most widely used component software model in the world, available on more than 150 million desktops and servers today. It provides the richest set of integrated services, the widest choice of easy-to-use tools, and the largest set of available applications. In addition, it provides the only currently viable market for reusable, off-the-shelf client and server components.
COM enables software developers to build applications from binary software components that can be deployed at any tier of the application model. These components provide support for packaging, partitioning, and distributed application functionality. COM enables applications to be developed with components by encapsulating any type of code or application functionality, such as a user interface control or line-of-business object. A component may have one or more interfaces; each exposes a set of methods and properties that can be queried and set by other components and applications. For example, a customer component might expose various properties such as name, address, and telephone number.
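The interface-and-properties idea can be sketched in modern terms. The following Python analogy is not COM itself — there is no binary standard or IDL here — but it shows the contract the text describes: callers program against a declared interface, and the component behind it can be replaced freely. All class and property names are illustrative.

```python
# Illustrative analogy (plain Python, not actual COM): a component that
# exposes its functionality only through a declared interface, the way
# a COM customer component exposes name and telephone properties.
from abc import ABC, abstractmethod

class ICustomer(ABC):
    """The 'interface': properties callers can query."""
    @property
    @abstractmethod
    def name(self): ...

    @property
    @abstractmethod
    def telephone(self): ...

class CustomerComponent(ICustomer):
    """One component implementing the interface. Clients depend only on
    ICustomer, so this implementation can be swapped without changes."""
    def __init__(self, name, telephone):
        self._name, self._telephone = name, telephone

    @property
    def name(self):
        return self._name

    @property
    def telephone(self):
        return self._telephone

def print_contact(c: ICustomer) -> str:
    # Client code written against the interface, not the concrete class.
    return f"{c.name}: {c.telephone}"
```

The same separation is what lets COM components from different vendors, written in different languages, interoperate at any tier.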
Client Environments and Presentation Tier:-Today, many application developers using cooperating components target the development of their applications to the Windows platform to take full advantage of the rich user interface Windows has to offer. Likewise, customers have come to expect a rich, highly functional user interface from their applications. The extended reach of information and services to customers that the Internet has enabled has created a new challenge for the application developer. The application developer today must develop a user interface that is distributable, available on Windows and non-Windows platforms, and supports a wide range of client environments, from handheld wireless devices to high-end workstations. Yet, applications must be rich with features to stay competitive and maintain the functionality that customers have come to expect.
The business logic tier is the heart of the application, where the application-specific processing and business rules are maintained. Business logic placed in components bridges the client environments and the data tiers. The Windows DNA application platform has been developed through years of innovation in supporting high-volume, transactional, large-scale application deployments, and provides a powerful run-time environment for hosting business logic components.
The application platform for developing Windows DNA applications includes Web services, messaging services, and component services.
Web Services
Integrated with Microsoft's application platform is a high-performance gateway to the presentation tier. Microsoft's Internet Information Server enables the development of Web-based business applications that can be extended over the Internet or deployed over corporate intranets. With IIS, Microsoft introduced a new paradigm to the Internet transactional applications. Transactions are the plumbing that makes it possible to run real business applications with rapid development, easy scalability, and reliability.
Microsoft broadened COM's applicability beyond the desktop application to also include distributed applications by introducing Microsoft Transaction Server (MTS). MTS was an extension to the COM programming model that provided services for the development, deployment, and management of component-based distributed applications. MTS was a foundation of application platform services that facilitated the development of distributed applications for the Windows platform in a much simpler, more cost-effective manner than other alternatives. COM+ is the next evolutionary step of COM and MTS. The unification of the programming models inherent in COM and MTS services makes it easier to develop distributed applications by eliminating the tedious nuances associated with developing, debugging, deploying, and maintaining an application that relies on COM for certain services and MTS for others. The benefit to the application developer is that distributed applications become faster, easier, and ultimately cheaper to develop, because less code is required to leverage the underlying system services.
To continue to broaden COM and the services offered today in MTS 2.0, COM+ consists of enhancements to existing services as well as new services to the application platform. They include:
Bring your own transaction. COM components are able to participate in transactions managed by non-COM+ transaction processing (TP) environments that support the Transaction Internet Protocol (TIP).
Expanded security. Support for both role-based security and process-access-permissions security. In the role-based security model, access to various parts of an application is granted or denied based on the logical group or role that the caller has been assigned to (for example, administrator, full-time employee, or part-time employee). COM+ expands on the current implementation of role-based security by including method-level security for both custom and IDispatch(Ex)-based interfaces.
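The role-based model described above can be sketched in a few lines. This is a rough Python stand-in, not the COM+ security service: roles, permissions, and the payroll method are all made-up names for illustration.

```python
# Rough sketch of role-based security: access to a method is granted or
# denied based on the logical role assigned to the caller. All role and
# method names are illustrative, not COM+ APIs.
from functools import wraps

ROLE_PERMISSIONS = {
    "administrator": {"read_payroll", "update_payroll"},
    "full_time": {"read_payroll"},
    "part_time": set(),
}

def requires_permission(permission):
    """Decorator enforcing method-level access by caller role."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(caller_role, *args, **kwargs):
            if permission not in ROLE_PERMISSIONS.get(caller_role, set()):
                raise PermissionError(f"{caller_role} may not {permission}")
            return fn(caller_role, *args, **kwargs)
        return wrapper
    return decorator

@requires_permission("update_payroll")
def update_payroll(caller_role, employee, amount):
    return f"{employee} set to {amount}"
```

The point of the pattern is that the business method stays free of security plumbing; the platform (here, the decorator) makes the access decision before the call is dispatched.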
Centralized administration. The Component Services Explorer, a replacement for today's MTS Explorer and DCOMCNFG, presents a unified administrative model, making it easier to deploy, manage, and monitor multitier applications by eliminating the overhead of using numerous individual administration tools.
Queued components. For asynchronous, deferred execution when cooperating components are disconnected. This is in addition to the session-based, synchronous client/server programming model, in which the client maintains a logical connection to the server.
Event notification. For times when a loosely coupled event notification mechanism is desirable, COM+ Events is a unicast/multicast, publish/subscribe event mechanism that allows multiple clients to "subscribe" to events that are "published" by various servers. This is in addition to the existing event notification framework delivered with connection points.
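The publish/subscribe mechanism can be illustrated with a minimal event bus. This is a plain-Python sketch of the idea, not the COM+ Events system: publishers fire events by name without knowing who, if anyone, is subscribed, and every subscriber receives the notification.

```python
# Loosely coupled publish/subscribe in miniature (illustrative only,
# not the COM+ Events implementation).
from collections import defaultdict

class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, event_name, handler):
        """Register interest in an event by name."""
        self._subscribers[event_name].append(handler)

    def publish(self, event_name, payload):
        # Multicast: every subscriber to this event gets the payload;
        # the publisher never references a subscriber directly.
        for handler in self._subscribers[event_name]:
            handler(payload)

bus = EventBus()
received = []
bus.subscribe("order_placed", received.append)
bus.subscribe("order_placed", lambda p: received.append(p.upper()))
bus.publish("order_placed", "widget")
# received is now ["widget", "WIDGET"]
```

Contrast this with connection points, where the event source holds a direct reference to each sink; here the bus decouples the two sides entirely.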
Load balancing. Load balancing allows component-based applications to distribute their workload across an application cluster in a client-transparent manner.
Microsoft Message Queue Server (MSMQ) provides loosely coupled and reliable network communications services based on a messaging queuing model. MSMQ makes it easy to integrate applications by implementing a push-style business event delivery environment between applications, and to build reliable applications that work over unreliable but cost-effective networks. MSMQ also offers seamless interoperability with other message queuing products, such as IBM's MQSeries, through products available from Microsoft's independent software vendor (ISV) partners.
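The store-and-forward model MSMQ provides can be reduced to a small sketch: the sender enqueues a message and moves on, and the receiver drains the queue whenever it is connected. This stdlib-only Python version is a stand-in for the idea, not for MSMQ's durable, networked queues.

```python
# Message queuing in miniature: fire-and-forget delivery that tolerates
# a disconnected receiver. (Illustrative; MSMQ adds durability,
# routing, and transactional delivery on top of this basic model.)
import queue

order_queue = queue.Queue()

def send(message):
    """Succeeds immediately, even if no receiver is running."""
    order_queue.put(message)

def drain():
    """Receiver processes whatever accumulated while it was away."""
    delivered = []
    while True:
        try:
            delivered.append(order_queue.get_nowait())
        except queue.Empty:
            return delivered

send({"order": 1})   # receiver offline; messages simply wait
send({"order": 2})
assert drain() == [{"order": 1}, {"order": 2}]
```

This is why message queuing suits unreliable but cost-effective networks: neither side has to be available at the same moment for the application to make progress.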
Windows DNA Universal Data Access
Universal Data Access is Microsoft's strategy for providing access to information across the enterprise. Today, companies building database solutions face a number of challenges as they seek to gain maximum business advantage from the data and information distributed throughout their corporations. Universal Data Access provides high-performance access to a variety of information sources, including relational and nonrelational data, and an easy-to-use programming interface that is tool and language independent.
The foundation for developing enterprise data interoperability solutions on the Microsoft Windows platform is Microsoft Windows Distributed interNet Applications Architecture (Windows DNA). Windows DNA, which is based on the widely used Component Object Model (COM), specifies how to do the following:
Develop robust, scalable, distributed applications using the Windows platform.
Extend existing data and external applications to support Internet operations.
Support a wide range of client devices maximizing the reach of an application.
Figure 1. Microsoft Windows DNA Architecture
Interoperability and reuse are key attributes of Windows DNA. Unlike traditional software development, which required each application to be built from scratch, the Component Object Model (COM) enables developers to create complex applications using a series of small software objects (COM components). For example, a component might be a credit card authorization procedure or the business rules for calculating shipping costs. The COM programming model speeds up the development process by enabling multiple development teams to work on different parts of an application simultaneously.
COM also offers the advantage of programming language independence. This means that Windows developers can create COM components using tools and languages with which they are familiar, such as Microsoft Visual Basic and Visual C++. For non-Windows programmers, including mainframe COBOL developers and Web publishers, COM components can be accessed from simple scripting languages, such as VBScript and JScript. Windows DNA simplifies development by providing access to a wide range of services and products through a consistent object model: COM.
One example of the services available is what we call COM services for interoperability. COM services for interoperability include the network, data, application, and management services that are part of existing Microsoft products, such as Microsoft SNA Server. COM services for interoperability provide a common approach to system integration using the wide range of COM components available today.
An Interoperability Framework
Microsoft has defined a four-layer framework for interoperability based on industry standards for Network, Data, Applications, and Management or NDAM for short. Microsoft provides access to interoperability components in each of these four categories. This document focuses on the Data interoperability layer, providing an overview of the wide range of COM components available for accessing multiple data stores across an enterprise environment.
Figure 2. Microsoft Interoperability Framework
Enterprises run their daily operations relying on multiple data sources, including database servers, legacy flat-file records, e-mail correspondence, personal productivity documents (spreadsheets, reports, presentations), and Web-based information publishing servers. Typically, applications, end-users, and decision-makers access these data sources by employing a variety of nonstandard interfaces. Data interoperability standards offer the transparent and seamless ability to access and modify data throughout the enterprise. Microsoft's data interoperability strategy is called Universal Data Access. Universal Data Access uses COM objects to provide one consistent programming model for access to any type of data, regardless of where that data may be found in the enterprise.
An easy-to-use programming architecture that is both tool and language independent, Universal Data Access provides COM objects for high-performance access to a variety of relational (SQL) and nonrelational information sources. The technologies that make up the Universal Data Access strategy enable you to integrate diverse data sources, create easy-to-maintain solutions, and use your choice of best-of-breed tools, applications, and platform services.
To leverage existing investments, Universal Data Access does not require expensive and time-consuming movement of data into a single data store, nor does it require commitment to a single vendor's data products. Universal Data Access is based on open industry specifications with broad industry support, and works with all major established database platforms.
Figure 3. Universal Data Access Architecture
The Microsoft Data Access Components (MDAC) are the key technologies that enable Universal Data Access. Data-driven client/server applications deployed over the Web or a LAN can use these components to easily integrate information from a variety of sources, both relational and nonrelational. These technologies include Open Database Connectivity (ODBC), OLE DB, and Microsoft ActiveX Data Objects (ADO).
ODBC
Open Database Connectivity is an industry standard and a component of Microsoft Windows Open Services Architecture (WOSA). The ODBC interface makes it possible for applications to access relational data stored in almost any database management system (DBMS).
Microsoft's ODBC industry-standard data access interface continues to provide a unified way to access relational data as part of the OLE DB specification. ODBC is a widely accepted application programming interface (API) for database access. It is based on the Call-Level Interface (CLI) specifications from X/Open and ISO/IEC for database APIs and uses Structured Query Language (SQL) as its database access language. Microsoft has implemented a number of ODBC drivers to access diverse data stores. ODBC is widely supported by Microsoft products, third-party application development products, and end-user productivity applications.
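The value of a call-level interface can be seen in miniature with Python's DB-API, where sqlite3 stands in for an ODBC driver: application code uses one connect/cursor/execute pattern, and only the driver underneath changes per DBMS. The table and function names below are made up for the example.

```python
# Sketch of the call-level-interface idea: the query routine is written
# once against a uniform API and works with any conforming driver that
# accepts this SQL. sqlite3 plays the role of one such driver.
import sqlite3

def top_customers(connect, n):
    """Driver-agnostic: takes any factory returning a DB-API connection."""
    conn = connect()
    try:
        cur = conn.cursor()
        cur.execute(
            "SELECT name FROM customers ORDER BY sales DESC LIMIT ?", (n,))
        return [row[0] for row in cur.fetchall()]
    finally:
        conn.close()

def make_test_db():
    """Stand-in 'driver': an in-memory database with sample rows."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customers (name TEXT, sales REAL)")
    conn.executemany("INSERT INTO customers VALUES (?, ?)",
                     [("Acme", 900.0), ("Globex", 1200.0)])
    return conn
```

Swapping the database means swapping the `connect` factory; `top_customers` itself never changes, which is precisely what ODBC's driver-manager architecture gives C applications.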
Figure 4. ODBC Architecture
Microsoft offers a number of ODBC drivers as part of the Microsoft Data Access Components, which ships as a feature of many popular Microsoft products, including Microsoft SQL Server, Microsoft Office, the Microsoft BackOffice family of products, Microsoft SNA Server, and Microsoft Visual Studio. The following ODBC drivers are included in MDAC version 2.1:
Microsoft ODBC Driver for SQL Server.
Microsoft ODBC Driver for Oracle.
Microsoft ODBC Driver for Microsoft Visual FoxPro.
Microsoft ODBC Driver for Access (Jet engine).
Additionally, Microsoft SNA Server 4.0 with Service Pack 2 ships with a Microsoft ODBC Driver for DB2.
A number of third-party ISVs offer ODBC drivers for many data sources.
In addition, OLE DB includes a bridge to ODBC to enable continued support for the broad range of ODBC relational database drivers available today. The Microsoft OLE DB Provider for ODBC leverages existing ODBC drivers, ensuring immediate access to databases for which an ODBC driver exists, but for which an OLE DB provider has not yet been written.
Note: JDBC is a technology for accessing SQL database data from a Java client program. Microsoft offers a JDBC-to-ODBC bridge that allows Java programmers to access back-end data sources using available ODBC drivers. The Microsoft JDBC-ODBC bridge is part of the core set of classes that come with the Microsoft Virtual Machine (Microsoft VM) for Java.
OLE DB
OLE DB is a strategic system-level programming interface to data across the organization. OLE DB is an open specification designed to build on the success of ODBC by providing an open standard for accessing all kinds of data. Whereas ODBC was created to access relational databases, OLE DB is designed for relational and nonrelational information sources.
Figure 5. OLE DB Components
OLE DB encapsulates various database management system functions that enable the creation of software components implementing such services. OLE DB components consist of data providers, which contain and expose data; data consumers, which use data; and service components, which gather and sort data (such as query processors and cursor engines). OLE DB interfaces are designed to help diverse components integrate smoothly so that OLE DB component vendors can bring high-quality OLE DB products to market quickly.
OLE DB data providers
OLE DB data providers implement a set of core OLE DB interfaces that offer basic functionality. This basic functionality enables other OLE DB data providers, service components, and consumer applications to interoperate in a standard, predictable manner. The MDAC Software Development Kit (SDK) includes a set of OLE DB conformance tests that OLE DB component vendors, as well as end-user consumer developers, can run to ensure a standard level of compatibility. In addition, data providers can implement extended functionality as appropriate for a particular data source.
OLE DB data consumers
OLE DB data consumers can be any software programs that require access to a broad range of data, including development tools, personal productivity applications, database products, or OLE DB service components. A major set of OLE DB data consumers are ActiveX Data Objects (ADO), which provide a means to develop flexible and efficient data interoperability solutions using such high-level programming languages as Visual Basic.
OLE DB service components
OLE DB service components implement functionality not natively supported by some simple OLE DB data providers. For example, some basic OLE DB providers do not support rich sorting, filtering, and finding on their data sources. OLE DB service components, such as the Microsoft Cursor Service for OLE DB and the Microsoft Data Shaping Service for OLE DB, can seamlessly integrate with these basic OLE DB data providers to complete the functionality desired or expected by a given OLE DB consumer application. Universal Data Access allows for the development of generic OLE DB consumer applications that access many data sources in a single, uniform manner. Enterprise developers can write COM components that perform a specific function against nonspecific data sources. Such a component might run against a VSAM data set today and run against a SQL Server table tomorrow. This allows enterprises to migrate from one data store to another as efficiencies allow or business needs require.
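The division of labor between a simple provider and a service component can be sketched directly. In this illustrative Python version (all class names are made up, and this is not the OLE DB object model), the provider only knows how to enumerate rows, and a separate service layers sorting and filtering on top, much as the Cursor Service supplements basic providers.

```python
# Sketch of the service-component pattern: a minimal provider plus a
# service that completes the functionality a consumer expects.
class FlatFileProvider:
    """A minimal 'provider': exposes rows, nothing else."""
    def __init__(self, rows):
        self._rows = rows

    def fetch_all(self):
        return list(self._rows)

class SortFilterService:
    """Adds filtering and sorting over any provider with fetch_all()."""
    def __init__(self, provider):
        self._provider = provider

    def query(self, predicate=None, sort_key=None):
        rows = self._provider.fetch_all()
        if predicate:
            rows = [r for r in rows if predicate(r)]
        if sort_key:
            rows.sort(key=sort_key)
        return rows

provider = FlatFileProvider([{"id": 3}, {"id": 1}, {"id": 2}])
service = SortFilterService(provider)

# The provider alone cannot sort; the service supplies that capability:
assert service.query(predicate=lambda r: r["id"] > 1,
                     sort_key=lambda r: r["id"]) == [{"id": 2}, {"id": 3}]
```

Because the service works against the provider's generic surface, the same service can sit in front of a flat file today and a different store tomorrow, which is the migration flexibility the paragraph above describes.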
Resource pooling is another popular function provided by OLE DB service components. When running under Microsoft Transaction Server, Microsoft Internet Information Server, or on a standalone basis, OLE DB and ADO applications can make use of OLE DB resource pooling, supported by the OLE DB core services, to enable reuse of OLE DB Data Source proxy objects. Typically, the OLE DB session start-up, from instantiation of the OLE DB data source object (DSO) to creating the underlying network connection to the data source, is the most expensive part of a given transaction or unit of work. This is a critical issue when developing Web-based or multi-tier applications. Maintaining connections to the database in the resource state of the middle-tier component can create scalability issues. Creating a new connection on every page of a Web application is too slow. The solution is OLE DB resource pooling, which enables better scalability and offers better performance.
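Why pooling pays off can be shown in a stdlib-only sketch: because the connection is expensive to open, a pool hands back previously opened connections instead of creating a new one per request. This is an illustration of the concept, not the OLE DB pooling implementation.

```python
# Minimal connection pool sketch (illustrative only).
class ConnectionPool:
    def __init__(self, factory):
        self._factory = factory   # the expensive constructor
        self._idle = []
        self.opened = 0           # count of real connections created

    def acquire(self):
        if self._idle:
            return self._idle.pop()   # reuse: skips costly start-up
        self.opened += 1
        return self._factory()

    def release(self, conn):
        self._idle.append(conn)       # return to the pool, keep open

pool = ConnectionPool(factory=lambda: object())

# Simulate 100 sequential "web page" requests:
for _ in range(100):
    conn = pool.acquire()
    pool.release(conn)

assert pool.opened == 1   # one physical connection served every request
```

A real pool would also handle concurrency, time out idle connections, and validate them before reuse, but the scalability argument is the same: the per-request cost drops from a full session start-up to a list operation.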
ADO
Microsoft ActiveX Data Objects (ADO) is a strategic application-level programming interface to data and information, layered on top of OLE DB. ADO provides consistent, high-performance access to data and supports a variety of development needs, including the creation of front-end database clients and middle-tier business objects, using applications, tools, languages, or Internet browsers. ADO is designed to be the one data interface needed for single and multi-tier client/server and Web-based data-driven solution development. Its primary benefits are ease of use, high speed, low memory overhead, and a small disk footprint.
ADO provides an easy-to-use programming interface to OLE DB because it uses a familiar metaphor: the COM automation interface, which is accessible from all leading Rapid Application Development (RAD) tools, database tools, and languages (including scripting languages).
Figure 6. ADO Object Model
ADO uses a flatter and more flexible object model than any previous object-oriented data access technology. Any of the five top-level objects can exist independent of the other four. Unlike DAO, which required constructing a hierarchical chain of objects before accessing data, ADO can be used to access data with just a couple of lines of code.
The secret of ADO's strength is the fact that it can connect to any OLE DB provider and still expose the same programming model, regardless of the specific features offered by a particular provider. However, because each provider is unique in its implementation, how your application interacts with ADO may vary slightly when run against different data providers. Some common differences include ADO connection strings, command execution syntax, and supported data types.
Microsoft offers a number of technologies that facilitate interoperability with data stored on non-Windows systems. Developers can create solutions that take advantage of a large installed base of information sources while working in a familiar environment. The following is a collection of some of the Microsoft technologies available today that provide this capability.
Microsoft OLE DB Provider for DB2
The Microsoft OLE DB Provider for DB2 allows application developers to use familiar object-oriented programming techniques to access DB2 databases over SNA LU6.2 and TCP/IP networks, without requiring knowledge of SNA APPC programming.
Implemented using the open protocol of the Distributed Relational Database Architecture (DRDA), the Microsoft OLE DB Provider for DB2 supports access to remote DB2 data using industry-standard Structured Query Language (SQL) statements. Because the provider is part of Microsoft's universal data access strategy, which is based on OLE DB and ADO, it can interoperate with OLE DB-aware tools and applications in Microsoft Visual Studio 6.0, Microsoft SQL Server 7.0, and Microsoft Office 2000 Developer Edition. These generic consumer applications rely on this OLE DB provider's compatibility as verified using the OLE DB conformance tests. A key requirement is that the provider support the IDBSchemaRowset object. IDBSchemaRowset provides the consumer with a means to query the data source's metadata that describes the target table. Using this schema information, the generic consumer can intelligently process and display the result sets of queries.
Figure 7. OLE DB Provider for DB2 connecting to DB2 for MVS
At run time, generic consumers use IDBSchemaRowset information to make choices on how to behave with a back-end data source. Other information available to generic consumers at run time includes IDBInfo data, such as data source-supported keywords and literal information. Typically, consumer applications are designed to modify their behavior based on an expected set of values returned in IDBSchemaRowset and IDBInfo, based on well-defined rules in the OLE DB specification. Additionally, the OLE DB Provider for DB2 publishes the data types, including precision and scale limits, in the form of the standard OLE DB IDBSchemaRowset DBSCHEMA_PROVIDER_TYPES.
Another use of IDBSchemaRowset data is to populate a list of tables and columns available in the data source's current collection. The OLE DB Provider for DB2 maps IDBSchemaRowset to DB2 system table information. For example, DBSCHEMA_TABLES are provided using information stored in DB2 for OS/390 SYSIBM.SYSTABLES and DB2 for OS/400 QSYS2.SYSTABLES tables. DBSCHEMA_COLUMNS are mapped to DB2 for OS/390 SYSIBM.SYSCOLUMNS and DB2 for OS/400 QSYS2.SYSCOLUMNS information. The OLE DB provider queries these DB2 system tables at run time, returning the data on calls to IDBSchemaRowset TABLES and COLUMNS.
Two examples of this usage:
The Visual Studio Data Designer, which allows developers to preview DB2 tables and drag these tables and columns into the query designer.
The SQL Server Data Transformation Services, which offers the end user an intuitive wizard for picking tables for bulk movement of data between DB2 and any other OLE DB or ODBC data source.
OLE DB providers implement a default read-only, forward-only server cursor that is mapped to the data source's cursor engine whenever possible. In the case of the OLE DB Provider for DB2, the server cursor offered is a forward-only updateable cursor. A server cursor offers the developer the most efficient means to traverse tables on the data source. Some generic consumers may expect and request a scrollable cursor when, as in the case of the OLE DB Provider for DB2, only a forward-only cursor is offered by the provider. In these cases, the generic consumer can request to use a client-side cursor that is implemented on behalf of the provider by the Microsoft Cursor Service for OLE DB. In ADO, a developer can specify the use of the ADO Client Cursor Engine (CCE) by simply specifying a client-side cursor location (adUseClient) on the Connection or Recordset object.
Distributed Query Processor
The Distributed Query Processor (DQP) feature of Microsoft SQL Server 7.0 enables application developers to develop heterogeneous queries that join tables in disparate databases. To access the remote data sources, a user must create a Linked Server definition. The Linked Server encapsulates all of the network, security, and data source-specific information required to connect DQP to the remote data. Linked servers rely on underlying OLE DB providers or ODBC drivers for the actual physical connection to the target data source. Once a linked server has been defined, it can always be referred to with a single logical name as part of a SQL Server dynamic SQL statement or stored procedure. At run time, a linked server resource, such as a remote DB2 table, is treated like a local SQL Server table.
When a client application executes a distributed query using a linked server, SQL Server analyzes the command and sends any requests for linked server data via OLE DB to that data source. If security credentials were specified when the linked server was created, those credentials are passed to the linked server. Otherwise, SQL Server will pass the credentials of the current SQL Server login. When SQL Server receives the returned data, it is processed in conjunction with other portions of the query.
DQP can concurrently access multiple heterogeneous sources on local and remote computers. Additionally, it supports queries against both relational and nonrelational data by using OLE DB interfaces implemented in OLE DB providers. Using DQP, SQL Server administrators and end user developers can create linked server queries that run against multiple data sources with little or no modifications required. For example, a Visual Basic developer can create a single linked server query that selects and inserts data stored in DB2 today, and then can change the linked server name and run the same query against Microsoft SQL Server tomorrow. In this way, DQP provides the developer with an isolation level against changes in the storage engine.
Further, DQP is an efficient tool with which to join information from multiple tables spanning multiple data sources. For example, let's say you are a regional sales manager for a large retail company with subsidiaries located in several countries. Because of mergers and acquisitions, some regional offices store their data in different databases from those of the corporate headquarters. The United Kingdom subsidiary stores its data in DB2; the Australian subsidiary stores its data in Microsoft Access; the Spanish subsidiary stores its data in Microsoft Excel; and the United States subsidiary stores its data in Microsoft SQL Server. You want a report that lists, on a quarterly basis for the last three years, the subsidiaries and the sales locations with the highest quarterly sales figures. Joining the required data tables can be accomplished in real time by using a single distributed query, running on Microsoft SQL Server.
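As a toy illustration of what DQP does (not the actual SQL Server engine; SQLite and a plain Python list stand in for the local server and a linked server), the query processor pulls remote rows in and then answers a single query spanning both sources:

```python
import sqlite3

# "Local" SQL Server stand-in: an in-memory SQLite database holding US sales.
local = sqlite3.connect(":memory:")
local.execute("CREATE TABLE sales (subsidiary TEXT, quarter TEXT, amount REAL)")
local.executemany("INSERT INTO sales VALUES (?, ?, ?)",
                  [("US", "1999Q1", 120.0), ("US", "1999Q2", 150.0)])

# "Linked server" stand-in: rows fetched from a remote source (here, a plain list).
remote_rows = [("UK", "1999Q1", 90.0), ("UK", "1999Q2", 160.0)]

# The query processor's job, in miniature: materialize the remote rowset locally,
# then run one query over both sources as if they were local tables.
local.execute("CREATE TABLE remote_sales (subsidiary TEXT, quarter TEXT, amount REAL)")
local.executemany("INSERT INTO remote_sales VALUES (?, ?, ?)", remote_rows)

# Highest sales figure per quarter across every subsidiary, in one statement.
top = local.execute("""
    SELECT quarter, subsidiary, MAX(amount)
    FROM (SELECT * FROM sales UNION ALL SELECT * FROM remote_sales)
    GROUP BY quarter ORDER BY quarter
""").fetchall()
print(top)  # [('1999Q1', 'US', 120.0), ('1999Q2', 'UK', 160.0)]
```

The design point mirrors the text: the consumer writes one logical query, and where each table physically lives is hidden behind a name.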
Why Visual Studio 6.0
The Microsoft Visual Studio 6.0 development system is a comprehensive suite of industry-leading development tools for building business applications for Windows 2000 Server, including client/server, multitier, and Web-based solutions. Visual Studio 6.0 includes key enterprise and team development features designed to help developers rapidly build scalable distributed applications that can be easily integrated with existing enterprise systems and applications.
Building on a Successful Base
COM+ builds on the proven success of COM.
COM is in use on 200 million systems worldwide.
COM supports a vibrant component marketplace. The demand for third-party components based on COM has been estimated to be $410 million this year, with a projected 65 percent compound annual growth rate, and it is expected to grow to approximately $3 billion by 2001 (source: Giga Information Group). This base of available components allows developers to choose from a wide variety of components to assemble applications and solutions, which has revolutionized development on Windows platforms.
COM supports thousands of available applications, including the highest-volume applications in the industry.
Major system vendors, such as Hewlett-Packard Co., Digital Equipment Corp., and Siemens Nixdorf Information Systems Inc., have announced plans to ship COM on UNIX and non-UNIX systems within the year, and additional vendor commitments are expected to follow. In addition, Software AG has ported COM to many operating systems, including Solaris and MVS.
COM consists of a well-defined, mature, and stable specification, as well as a reference implementation, which has been widely tested and adopted worldwide as a de facto standard.
COM is supported by the largest number of development tools available for any component or object model on the market.
COM+ enables the creation of the next generation of component-based applications, making it even easier to build and use components and providing richer, extensible services.
New Features in COM+
COM+ is an evolutionary extension to the Component Object Model (COM), the most widely used component technology in the world. COM+ makes it even easier for developers to create software components in any language, using any development tool. COM+ builds on the same factors that have made today's COM the choice of developers worldwide, including the following:
The richest integrated services, including transactions, security, message queuing, and database access, to support the broadest range of application scenarios.
The widest choice of tools from multiple vendors using multiple development languages.
The largest customer base for customizable applications and reusable components.
Proven interoperability with users' and developers' existing investments.
Host Integration Server 2000 solves the problem of integrating the Microsoft Windows operating system with other non-Windows enterprise systems running on platforms such as IBM mainframes, AS/400, and UNIX. By using the powerful and comprehensive bidirectional integration services of Host Integration Server 2000, developers are freed from platform boundaries and can build highly scalable, distributed applications that incorporate existing processes and data without requiring any recoding or "wrapping" of existing code. This allows businesses to quickly build new business-critical Windows DNA 2000 applications while preserving investment in best-of-breed and custom in-house developed solutions. Host Integration Server 2000 includes three levels of integration that let developers make the most of existing computing resources.
Each Enterprise Application Integration (EAI) project is unique and can include requirements for one or all of the following types of integration provided by Host Integration Server 2000:
Network and Security Integration
Host Integration Server 2000 provides comprehensive managed host access seamlessly connecting legacy host systems with client/server and Web networks. Utilizing Windows 2000 Active Directory service, Host Integration Server integrates host-based security.
Data Integration
Host Integration Server 2000 provides complete, secure access to enterprise data through object-oriented and programmatic access to relational DB2 data and flat file data on mainframes, AS/400, UNIX, Windows 2000, and Windows NT Server.
Application Integration
Utilizing COM Transaction Integrator (COMTI), developers can build distributed applications that integrate Microsoft Transaction Server (MTS) with IBM host CICS and IMS transactions. Instead of learning the intricacies of host code, Web developers can use COMTI to wrap CICS and IMS transactions and expose them as COM objects available for use in distributed Web applications.
Host Integration Server 2000 also makes the most of Windows DNA 2000 core "plumbing code" such as: security, support for transactions, and queuing, freeing developers to create new functionality. The same core services reduce complexity for system administrators. Tight integration with Windows 2000 makes managing host and application access easier, less expensive, and more secure.
Introducing Microsoft Site Server 3.0
Microsoft Site Server 3.0, a member of the Microsoft BackOffice server family, provides a powerful alternative to traditional methods of intranet development and maintenance. Site Server 3.0 is a powerful intranet server, optimized for the Microsoft Windows NT Server operating system with Internet Information Server, that makes publishing and finding information easier and faster. By deploying Site Server 3.0, businesses can use the intranet to efficiently gather the collective expertise of the organization, wherever it resides (Web sites, databases, file servers, e-mail), and deliver it to facilitate the sharing of knowledge and to improve business productivity.
Site Server 3.0 includes a unique set of features that are designed to work together to optimize information sharing across the organization:
Content Management provides a structured publishing process for multiple content authors to submit, tag, and edit content through a drag-and-drop Web interface. Site editors can then approve, edit, and enforce uniform guidelines for content.
Content Deployment enables administrators to deploy content securely and robustly across multiple distributed servers.
Search enables users to perform full-text and property searches across various stores and formats, including HTTP, file systems, Exchange files, and databases.
Personalization & Membership provides easy authoring and targeting of personalized information based on user profiles and behaviors, plus the ability to present search results in highly customizable user views and personalized Web pages.
Push enables businesses to create delivery channels for Microsoft Internet Explorer 4.0. Channel agents, based on Microsoft Active Channel Server, enable intranet developers and administrators to create channels from databases and file systems as well as Search and Index Server. The Active Channel Multicaster saves valuable network bandwidth by using multicast technology to deliver channels.
Knowledge Manager is a centralized Web-based application that integrates the Site Server knowledge management features to enable users to easily browse, search, share, and subscribe to relevant information.
Analysis transforms raw hits recorded in server log files into valuable information about the requests, visits, and users that interact with an intranet. This allows businesses to measure the effectiveness of an intranet. Analysis also captures content and site structure to identify issues, such as pages with long load times and out-of-date content.
A BizTalk-based document is an XML file that deploys the tags from a certain vocabulary and follows the rules that the organization has defined for that type of document. A BizTalk-based document is actually exchanged by two BizTalk servers across a network. In Figure 1 you can see the overall BizTalk architecture, illustrating the role of XML for exchanging data between commercial partners. Both parties continue to manage documents in their own native formats on their own platforms, but data moves back and forth, despite architectural differences, thanks to XML.
Figure 1 BizTalk Architecture
BizTalk is central to Windows DNA 2000, and will be one of the key tools that help build e-commerce solutions. More often than not, today's e-commerce solutions require integration with existing information systems and data residing on host machines.
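A minimal sketch of the exchange, using Python's standard xml.etree module and a hypothetical tag vocabulary (the urn:example:schemas:po namespace and element names are invented for illustration): the sender serializes its native structures into the agreed XML, and the receiver parses the same bytes back into its own structures.

```python
import xml.etree.ElementTree as ET

# Sender: build a purchase order in the agreed (hypothetical) XML vocabulary.
po = ET.Element("PurchaseOrder", {"xmlns": "urn:example:schemas:po"})
ET.SubElement(po, "Buyer").text = "Contoso Ltd."
item = ET.SubElement(po, "Item", {"sku": "X-100"})
ET.SubElement(item, "Quantity").text = "12"

wire = ET.tostring(po, encoding="unicode")  # what actually crosses the network

# Receiver: parse the same document back into its own structures. Neither side
# needs to know how the other stores the data internally; the XML is the contract.
ns = "{urn:example:schemas:po}"
parsed = ET.fromstring(wire)
qty = int(parsed.find(f"{ns}Item/{ns}Quantity").text)
print(qty)  # 12
```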
MICROSOFT WINDOWS DISTRIBUTED INTERNET APPLICATIONS
Windows DNA: Building Windows Applications for the Internet Age
Windows DNA Technologies
Windows DNA services are exposed in a unified way through COM for applications to use. These services include component management, Dynamic HTML, Web browser and server, scripting, transactions, message queuing, security, directory, database and data access, systems management, and user interface.
Adhering to open protocols and published interfaces makes it easy to integrate other vendor solutions and provides broad interoperability with existing systems.
Because Windows DNA is based on COM and open Internet standards, developers can use any language or tool to create compatible applications.
Microsoft developed the Windows Distributed interNet Application Architecture (Windows DNA) as a way to fully integrate the Web with the n-tier model of development. Windows DNA defines a framework for delivering solutions that meet the demanding requirements of corporate computing, the Internet, intranets, and global electronic commerce, while reducing overall development and deployment costs.
FEATURES AND ADVANTAGES OF WINDOWS DNA
DNA helps to design and build multi-tier client/server applications.
DNA provides client transparency.
DNA applications provide full transactional processing support.
DNA can be used to create applications that are fault tolerant.
DNA is ideal for distributed applications.
The DNA methodology draws on many existing technologies to help design and implement robust, distributed applications. It visualizes the whole application as a series of tiers, with the client at the top and the data store at the bottom. The core of DNA is the use of business objects in a middle tier of the application. In DNA, business objects are implemented as software components. These components can be accessed by the client interface application or by another component, and can themselves call on other components, data stores, and so on. Componentization of business rules brings many benefits, such as easier maintenance, encapsulation of the rules, and protection of intellectual property. Hence, DNA is an approach to design that can speed up overall development time, while creating more reliable and fault-tolerant applications that are easily distributable over a wide variety of networks.
There are several more good reasons why companies should base their applications on Windows DNA. Because the architecture is built on open protocols and industry standards, solutions from other vendors integrate easily into the environment. This helps ensure interoperability with mission-critical business applications, such as corporate databases and enterprise resource planning systems. An open approach also facilitates compatibility with existing computing systems, which means that companies can continue to take advantage of their legacy systems rather than replacing them.
TOP WINDOWS DNA PERFORMANCE MISTAKES
Throughput refers to the amount of work (number of transactions) an application can perform in a measured period of time and is often calculated in transactions per second (tps). Scalability refers to the amount of change in linear throughput that occurs when resources are either increased or decreased. It is what allows an application to support anywhere from a handful to thousands of users, by simply adding or subtracting resources as necessary to "scale" the application. Finally, transaction time refers to the amount of time needed to acquire the necessary resources, plus the amount of time the transaction takes actually using these resources.
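The three measures relate arithmetically; a small sketch (the numbers are invented for illustration):

```python
def transaction_time_ms(acquire_ms, use_ms):
    """Transaction time = time acquiring resources + time actually using them."""
    return acquire_ms + use_ms

def throughput_tps(transactions, period_s):
    """Throughput = work completed per unit time, in transactions per second."""
    return transactions / period_s

# A server that completed 18,000 transactions in a 10-minute window:
print(throughput_tps(18_000, 600))   # 30.0 tps
# A transaction that waits 50 ms for a connection and holds it for 200 ms:
print(transaction_time_ms(50, 200))  # 250 ms
```

Scalability is then the question of how nearly that 30 tps doubles when the resources behind it are doubled.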
Producing a good n-tier application often entails a series of judgments in planning and implementing the final product. When those decisions are poorly made, development teams can encounter time-consuming and often difficult-to-solve performance problems after the application has been installed and implemented. Fortunately, many of these problems can be anticipated and prevented. This article shows you how to find and eliminate them early in the development process. The mistakes that follow were identified by Microsoft Consulting Services (MCS) consultants worldwide.
Misunderstanding the Relationship between Performance and Scalability
Performance and scalability are not the same, but neither are they at odds. For example, an application may process information at an incredibly fast rate as long as the number of users sending it information is less than 100. When that application reaches the point at which 10,000 users are simultaneously providing input, the performance may degrade substantially, because scalability wasn't high enough in the list of considerations during the development cycle. On the other hand, that same high-performance application may be partially rewritten in a subsequent iteration and have no problem handling 100,000 customers at one time. By then, however, a substantial number of customers may have migrated to a product someone else got right the first time.
Sometimes applications seek scalability in terms of number of concurrent users strictly through performance, with the idea being that the faster a server application runs, the more users can be supported on a single server. The problem with this approach is that increasing the number of simultaneous users may create a bottleneck that will actually reduce the level of performance as the load increases. One cause of this kind of behavior is caching state and data in the middle tier. By avoiding such caching in the design phase of the development process, countless hours of backtracking and rewriting code can be avoided.
Acquiring the necessary resources can be slowed by such factors as network latency, disk access speed, database locking scheme, and resource contention. Added to that are elements that can affect resource usage time, such as network latency, user input, and sheer volume of work. Windows DNA application developers should concentrate on keeping resource acquisition and resource usage times as low as possible.
Here are some ways to manage these factors:
Avoid involving user interaction as part of a transaction.
Avoid network interaction as part of a transaction.
Acquire resources late and release them early.
Make more resources available. Otherwise, use MTS to pool resources that are in short supply or are expensive to create.
Use MTS to share resources between users because it is usually more expensive to create a new resource than to reuse an existing one.
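The last two points, pooling and reuse, can be sketched with a toy pool. This is an illustration of the idea, not the MTS resource dispenser API:

```python
import queue

class ConnectionPool:
    """A minimal pool: expensive resources are created once up front and
    reused across callers, rather than created and destroyed per request."""
    def __init__(self, factory, size):
        self.created = 0
        self._free = queue.Queue()
        for _ in range(size):
            self._free.put(factory())
            self.created += 1

    def acquire(self):
        return self._free.get()   # blocks if every resource is in use

    def release(self, resource):
        self._free.put(resource)  # return to the pool instead of destroying

def make_connection():
    return object()               # stand-in for an expensive connection

pool = ConnectionPool(make_connection, size=2)

# Acquire late, release early: hold the resource only for the real work.
conn = pool.acquire()
try:
    pass                          # ... do the transactional work here ...
finally:
    pool.release(conn)

print(pool.created)  # 2 -- only reuse after startup, no per-request creation
```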
A middle tier must be scalable so that the largest number of simultaneous users can be logged on without compromising throughput to an unacceptable level.
A middle tier in an n-tier application is necessarily complex because of the role it plays in the overall application. The specific tasks it performs can be separated into three general categories that are essential to Windows DNA applications. The first task involves receiving input from the presentation tier. This input can be supplied programmatically or may come directly from a user. It may include information about (or a request for) almost anything. Second, a middle tier is responsible for interacting with the data services to perform the business operations that the application was designed to automate. For example, this might include sorting and combining information from different mailing lists to target a specific audience that was never previously considered to be a cohesive group. Finally, a middle tier returns processed information to the presentation tier so it can be used however the program or user sees fit. Within these three areas, performance can degrade significantly when developers use programming practices that are either little understood or mistakenly embraced as the "right" thing to do.
Performing Data-Centric Work in a Middle Tier
Developers sometimes fall into the trap of including data-centered tasks with the business services work in a middle tier instead of the data-services tier where they belong. Rules are frequently too rigid to account for all cases, but it would be very unusual to find a justification for breaking this one. If data-centered tasks are included in the middle tier, your Windows DNA application is likely to perform more poorly than it would otherwise. For example, it would be a mistake to retrieve multiple data sets from different tables and then join, sort, or search the data in middle-tier objects. The database is designed to handle this kind of activity, and moving it to a middle tier is almost certainly a bad practice.
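A small SQLite sketch of the difference (SQLite standing in for the data tier): both approaches return the same answer, but the middle-tier version drags every row across the tier boundary and re-implements the join by hand.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
db.execute("CREATE TABLE orders (customer_id INTEGER, total REAL)")
db.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Ann"), (2, "Bob")])
db.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10.0), (1, 5.0), (2, 7.5)])

# Right: let the database join and aggregate; only the result crosses the boundary.
in_db = db.execute("""
    SELECT c.name, SUM(o.total) FROM customers c
    JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name ORDER BY c.name
""").fetchall()

# Wrong: pull both tables into the middle tier and join them in application code.
customers = db.execute("SELECT id, name FROM customers").fetchall()
orders = db.execute("SELECT customer_id, total FROM orders").fetchall()
by_name = {}
for cid, name in customers:
    by_name[name] = sum(t for oid, t in orders if oid == cid)
in_middle_tier = sorted(by_name.items())

print(in_db)            # [('Ann', 15.0), ('Bob', 7.5)]
print(in_middle_tier)   # [('Ann', 15.0), ('Bob', 7.5)] -- same answer, more data moved
```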
Even with the ever-increasing processor power available for database servers, poorly tuned indexes and queries can bring an otherwise robust system to its knees. It is quite common to see developers coding stored procedures or queries without consulting the database administrator (DBA), or even running a project with no DBA involvement.
CONCLUSION
The Windows DNA architecture and the Windows NT platform offer many distinct advantages to customers and their ISV partners. Its key benefits include:
Providing a comprehensive and integrated platform for distributed applications, freeing developers from the burden of building the required infrastructure or assembling it using a piecemeal approach.
Easy interoperability with existing enterprise applications and legacy systems to extend current investments.
Making it faster and easier to build distributed applications by providing a pervasive component model, extensive prebuilt application services, and a wide choice of programming languages and tools.
Windows DNA applications have proven themselves in a wide range of circumstances, and the value they represent in the modern distributed computing environment has been thoroughly demonstrated. They have, however, also shown themselves to require careful planning and thorough testing throughout the development process. Avoiding the kinds of mistakes noted in this article should reduce the amount of resources required to produce the kind of Windows DNA application you want. Performance and load testing is unavoidable. Do it in a manner that simulates real-world conditions for your particular application and you'll be rewarded with an n-tier application that works, and works well.
FUTURE SCOPE
Microsoft was compelled to release .NET. First and foremost, .NET is a framework that covers all the layers of software development above the operating system. It provides the richest level of integration among presentation technologies, component technologies, and data technologies ever seen on a Microsoft, or any, platform. Second, the entire architecture has been created to make it as easy to develop Internet applications as it is to develop for the desktop. The .NET architecture also provides a wrapping of COM technologies.
More than 200,000 developers are working on .NET in India, and 64 percent of the developer community currently uses .NET.
INDIGO: The Future Technology for Building Distributed Applications
Indigo is Microsoft's unified programming model for building service-oriented applications with managed code. It enables developers to build secure, reliable, transacted Web services that integrate across platforms and interoperate with existing investments. Indigo combines and extends the capabilities of existing Microsoft distributed-system technologies, including Enterprise Services, System.Messaging, and .NET Remoting, spanning cross-process, cross-machine, cross-subnet, and cross-intranet communication, topologies, protocols, and security models. Indigo is built on and extends the .NET Framework 2.0, ships as part of the Windows release code-named Longhorn, and will also be made available on the Windows XP and Windows Server 2003 editions.
REFERENCES
For more information on Universal Data Access, see www.microsoft.com/data/.
For more on the COM specification, see the COM Web site at www.microsoft.com/com/.
For additional information, see Knowledge Base article Q218590, "INF: Configuring Data Sources for the Microsoft OLE DB Provider (DB2)," at http://support.microsoft.com/search/default.asp.
For more information on OLE DB, see www.microsoft.com/data/oledb/.
For more information on ADO, see www.microsoft.com/data/ado/.
For more information on Java, see www.microsoft.com/java/.
For more information on ODBC, see www.microsoft.com/data/odbc/.
For more information on these third-party offerings, see www.microsoft.com/sql/.
CONTENTS
1. Abstract
2. Introduction to Windows DNA
3. Guiding Principles of Windows DNA
4. Architecture of Windows DNA
DNA-Architecture for Distributed Applications
An Interoperability Framework
5. Microsoft Windows DNA Applications
6. Features and Advantages of Windows DNA
7. Top Windows DNA Performance Mistakes
8. Conclusion
9. References
10. Future Scope