The New Age Of Integration

Date: Thursday, February 27, 2003

Over the past half-dozen years, tremendous strides have been made in integrating disparate software systems. Flexibility, visibility and degree of control have all improved to a level that was only a dream when the term EAI was coined. Yet in this article, I will argue that the benefits to be achieved by integration are almost entirely ahead of us. Most systems are still isolated. Many processes that could be automated remain manual. Development and administration tools need further refinement. Most important, the true value of integration is the new things it enables.

With any technology come opportunities that are only indirectly related to its adoption: As surely as automobiles begat pizza delivery, and TCP/IP enabled the Internet, integration technologies will make way for new business optimization and management strategies that are only dimly visible today.

How do we get there? What do we want from our integration tool sets? It is not perfect generality, because generality is unachievable with any foreseeable technology; further, generality brings complexity with it. It is not about features or technology. Rather, I propose that the goal of system integration is to reduce the complexity of multicomponent software systems.

I focus on complexity reduction because it is the surest way to drive improvement: From cost reduction to better decision making, the ease with which people understand their environment, and the consequent quality of their decisions, are the drivers of process improvement. The implication is that the integration challenge is not entirely solvable by any single approach or software suite. Just as routers did not displace servers, and databases did not displace file systems, EAI tools, app servers and new technologies will all prosper as the market becomes segmented.

A functional division already exists between ‘classic’ EAI systems, which provide extremely general feature sets and typically assume the existence of little or no standardization, and n-tier development platforms (app servers, .NET, etc.), which are more technology-specific and therefore somewhat lighter and less general. The emerging space between is beginning to fill with companies focused on Application Routers, Web Services, and other technologies that require the existence of standards and earlier generations of integration technology.

All the participants in this ecosystem (including ERP vendors, SIs and customers) implicitly assume a new system topology. This is a key point: failure to understand this shift in architecture will lead to unsuccessful attempts to apply old thinking to new problems.

This new architecture has three characteristics: it is distributed, it is component-based and it requires standards. The architecture places support, application, orchestration, visualization and analysis services in separate layers or functional categories. This specialization also reduces complexity.

Distributed applications have their own set of challenges, which companies like TIBCO have spent over a decade understanding and mastering. Suffice it to say that reliability, transaction tracking and administration become much more difficult in such systems. Components have been around at least since TIBCO pioneered the idea of service-oriented architectures in the mid-'90s. The addition of standards is what makes component architectures cost-effective and practical for most new systems. Much has been written on standards in the past several years, and I will repeat none of it here. The most important thing to remember about standards is that they need not be complete or accepted by everyone in order to provide useful simplification.

When we view the integration challenge in light of this new topology, many confusing and seemingly contradictory initiatives fall into place. Data repositories, directories and the new category of grid services (load balancing, fault tolerance, capacity sharing, etc.) clearly fall into the support layer, where they may be accessed and used by all components.

Componentization of ERP systems, Web Services, and distributed application development tools (app servers, .NET, etc.) make up most of the application layer. The orchestration layer is perhaps the most confusing, as orchestration has traditionally been part of application development in large distributed systems. This layer is, at present, undergoing the greatest change. EAI vendors, app server vendors (with .NET counted, notionally, among the app servers) and ERP vendors will all tell you they do orchestration. New players in Application Routing and Web Services claim part of this space as well. The truth is that all these camps can rightfully claim to solve some orchestration problems.

EAI vendors such as TIBCO have unparalleled capabilities for solving the generic integration problem. Think here of your PC: the backplane represents the EAI vendor, and the set of cards represents the application components. Implicit in this design is that services will be repurposed over time, which justifies the investment in a general solution. For problems of this type, full-service integration vendors will continue to dominate for the foreseeable future.
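
To make the backplane analogy concrete, here is a minimal publish/subscribe sketch in Python. It illustrates the pattern only; it is not TIBCO's actual API. The bus stands in for the backplane, and each subscriber for a pluggable card:

    from collections import defaultdict
    from typing import Any, Callable

    class MessageBus:
        """Toy integration 'backplane': components plug in via subjects."""

        def __init__(self) -> None:
            self._subscribers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

        def subscribe(self, subject: str, handler: Callable[[Any], None]) -> None:
            # A component "plugs into the backplane" by registering for a subject.
            self._subscribers[subject].append(handler)

        def publish(self, subject: str, message: Any) -> None:
            # Senders address subjects, never components, so a card can be
            # swapped or repurposed without rewiring its counterparts.
            for handler in self._subscribers[subject]:
                handler(message)

    bus = MessageBus()
    bus.subscribe("orders.new", lambda m: print("billing saw", m))
    bus.subscribe("orders.new", lambda m: print("shipping saw", m))
    bus.publish("orders.new", {"order_id": 42, "sku": "A-7"})

Because publishers and subscribers meet only at the subject name, a new component can be slotted in later without touching the ones already deployed, which is exactly what justifies the general solution.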

At the other end of the spectrum lie problems that are really more like a traditional distributed application: relatively few services, often built on a homogeneous technology platform, possibly physically distributed, but nonetheless conceptually related and likely to remain so (think of the auto dealer who must access three credit services, the auto factory and a trucking service, but whose process never varies). This kind of problem can often be solved with application servers or even extensions to existing ERP systems (ERP manufacturers' integration tools generally focus on solving this class of problem). The expectation here is that, though components may change version, the basic system topology remains the same.
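
The fixed-topology case, by contrast, can be captured in a hard-coded sequence. The sketch below imagines the auto dealer's workflow in Python; every service function is a hypothetical stand-in, because the point is precisely that the order of steps never changes:

    def check_credit(bureau: str, buyer: str) -> bool:
        return True  # stand-in for a call to one of the three credit services

    def order_from_factory(buyer: str) -> str:
        return "ORD-001"  # stand-in for the factory call

    def book_truck(order_id: str) -> str:
        return f"{order_id} booked for delivery"  # stand-in for the trucking service

    def process_sale(buyer: str) -> str:
        # The topology is baked in: three credit checks, then the factory,
        # then trucking. Components may change version; the flow does not.
        if not all(check_credit(b, buyer) for b in ("bureau_a", "bureau_b", "bureau_c")):
            return "declined"
        return book_truck(order_from_factory(buyer))

    print(process_sale("J. Smith"))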

Between these two camps, a new market is appearing. This market assumes that some standardization exists, and that applications are being designed with the understanding that they are part of a component system. The standardization comes partly from new initiatives using XML standards, but today is driven largely by the app server and EAI vendors themselves: Every one of these vendors offers some capacity to expose proprietary systems as web services. The Application Router/Web Services vendors rely on them, just as the early EAI vendors relied on the ubiquitous acceptance of client/server technology, without which the first wave of integration would have been impossible.
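
What "exposing a proprietary system as a web service" means can be shown in miniature. The sketch below uses Python's standard-library XML-RPC server as a stand-in for the vendor tooling described above; the back-office function it wraps is invented:

    from xmlrpc.server import SimpleXMLRPCServer

    def get_inventory(sku: str) -> int:
        """Pretend this wraps a proprietary back-office lookup."""
        return {"A-7": 12}.get(sku, 0)

    server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
    server.register_function(get_inventory, "get_inventory")
    print("Serving get_inventory at http://localhost:8000 ...")
    server.serve_forever()

A client anywhere on the network can then call xmlrpc.client.ServerProxy("http://localhost:8000").get_inventory("A-7") without knowing anything about the system behind the interface, which is the property the new vendors depend on.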

This new space is built largely around a novel application architecture. Conventional applications, even distributed ones, tend to keep session information in one place and then contact other systems in turn to change or enrich that information. The new class of applications turns this inside out: these flow-based applications carry sufficient context with the transaction to allow validation and error handling as part of the transaction itself. Examples include trade management systems, mediation and correlation applications in telecommunications, and many supply chain activities.
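
A rough sketch of the idea, with invented field names: the message itself carries the limits and audit trail that a conventional design would keep in a central session store.

    from dataclasses import dataclass, field

    @dataclass
    class TradeMessage:
        trade_id: str
        symbol: str
        quantity: int
        # Context that travels with the transaction:
        hops: list = field(default_factory=list)            # audit trail so far
        limits: dict = field(default_factory=lambda: {"max_qty": 1000})

    def validate(msg: TradeMessage) -> TradeMessage:
        # Any node in the flow can validate and handle errors
        # using only what the message carries.
        msg.hops.append("validate")
        if msg.quantity > msg.limits["max_qty"]:
            raise ValueError(f"{msg.trade_id}: quantity exceeds carried limit")
        return msg

    print(validate(TradeMessage("T-1", "XYZ", 500)))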

The most interesting question arising from all these changes in the application landscape is, “OK – if this all works, what can I do with it?” This will be as fertile a ground for new technology and new business opportunity as any we have seen in the evolution of computing.

First, we should expect the extension of simple alerting services to pattern recognition. When a set of conditions occurs together, what does it mean? Notice that this real-time activity will be tightly tied to data mining: in order to understand what a pattern means this time, I need to understand what it has meant in the past.
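
As a toy illustration of the step from alerting to pattern recognition (the event names and time window here are invented), the sketch below alerts not on any single event but when a set of conditions co-occurs within a window:

    from collections import deque

    WINDOW = 60.0                      # seconds; an invented threshold
    PATTERN = {"login_failure", "config_change", "outbound_spike"}

    events = deque()                   # (timestamp, kind) pairs seen recently

    def observe(timestamp, kind):
        """Return True when the full pattern appears within the window."""
        events.append((timestamp, kind))
        while events and timestamp - events[0][0] > WINDOW:
            events.popleft()           # expire events outside the window
        return PATTERN <= {k for _, k in events}

    for t, kind in [(0, "login_failure"), (10, "config_change"), (25, "outbound_spike")]:
        if observe(t, kind):
            print(f"pattern recognized at t={t}")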

Real-time analytics are the second promising area. Once the interesting patterns have been identified, I will want them on a dashboard, preferably with modeling and decision support tools.

Third, a new class of problems arises from constraint satisfaction. This is particularly interesting, as problems in this arena tend to have high business impact and direct bottom-line implications. Content routing, scheduling, sensitivity analysis and process optimization all fall under the heading of constraint satisfaction.
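
A minimal backtracking sketch in the scheduling vein gives the flavor; the jobs, machines and capacity figure are invented for illustration:

    def schedule(jobs, machines, capacity, assignment=None):
        """Assign each job to a machine without exceeding machine capacity."""
        assignment = assignment if assignment is not None else {}
        if len(assignment) == len(jobs):
            return assignment                        # every job placed: solved
        job = next(j for j in jobs if j not in assignment)
        for machine in machines:
            load = sum(jobs[j] for j, m in assignment.items() if m == machine)
            if load + jobs[job] <= capacity:         # the constraint check
                assignment[job] = machine
                solution = schedule(jobs, machines, capacity, assignment)
                if solution is not None:
                    return solution
                del assignment[job]                  # backtrack and try elsewhere
        return None

    print(schedule({"j1": 3, "j2": 2, "j3": 2}, ["m1", "m2"], capacity=4))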

A moment’s reflection will reveal that all these new areas require working, component-based systems to be effective. That’s the only way one can have access to the necessary information in real time. It is easy to see that these new technologies are likely to be more domain-specific than previous software packages. Analytics in supply chain has quite a different meaning than it does on Wall Street. That’s OK. The same technologies that enable generic component systems for end users will make it possible for vendors to economically create tools and products with higher business value for smaller markets, and to do so repeatably.

Many challenges lie before us if we are to make this vision real. We must manage total cost of ownership, incorporate the new standards, and understand and use new architectural paradigms. We must find ways to reach the ninety percent of business systems as yet untouched, with cost-effective tools that are usable by mere mortals, while matching the robustness and security we expect from centralized systems.

The payoff is worth the effort. As integration becomes the ‘how,’ rather than the ‘what,’ we can focus on our joint goal – immediate business value. From reduced cost, greater flexibility, better visibility and more intelligent operation of our business systems will come much greater success than any of us can envision today, and that success will be shared by all of us.

Fred Meyer is the chief strategy officer at TIBCO Software (Nasdaq: TIBX) and leads its corporate strategy. His responsibilities include managing the Emerging Technologies Group and working on technical infrastructure direction. Since joining TIBCO in 1996, Meyer has helped to expand TIBCO's leadership in key areas, including Web Services, B2B, telecommunications, financial services, transportation and logistics.