Ten Years After Y2K: The ERP Legacy
Marc Hebert
Wednesday, March 3, 2010
As the new decade dawns, it's a good time to take stock of a fascinating milestone in enterprise applications: ERP has just turned twenty, even as we mark the tenth anniversary of the Y2K phenomenon. Many of us will mark the occasion at the Oracle Applications Users Group's 20th anniversary conference in Las Vegas in April. It will bring back memories, both exhilarating and painful, of this generation of ERP applications built on relational databases and open systems.

The Y2K Investment Explosion

It is useful to recall how Y2K concerns during the late 1990s fueled a dramatic explosion of investment in ERP software and services, resulting in a bubble of ERP implementation that has left an important legacy in today's IT environment. Many of us were aghast at the gargantuan size of ERP implementation projects at large global corporations: $50 million to $100 million apiece, back when the dollar was still worth fifty cents.

Today, we mark the tenth anniversary of this historic bubble in a decidedly different IT budget climate. We observe that today's typical IT projects are an order of magnitude or two smaller than they were during the excesses of Y2K. Moreover, these projects are approved only after rigorous ROI justification. Can you imagine what would have happened had such stringent ROI requirements been in place during Y2K?

The ERP ROI Paradox

In fact, it seems clear that the huge investment in ERP and its associated IT infrastructure did not yield anything near the ROI achieved by most other corporate capital investments, at least not in the first five years or more. Perhaps we should give credit to the farsightedness of yesterday's IT planners, who marched forward with huge investments despite the lack of demonstrable near-term ROI. But it is certainly arguable that the dramatic increases in economic productivity, at least in the US economy, over the past five years, and even during the current recession, owe in no small measure to the economies of scale and agile business processes enabled by the strong ERP infrastructure laid down during Y2K. And so it is tempting to conclude that those ERP investments, made on weak ROI cases, are now generating enormous ROI ten years and more later.

Today’s ERP Legacy: Data Cholesterol

There are many interesting influences worth exploring from the Y2K ERP investment bubble. One that is especially fascinating is a byproduct of the relational database architectures that ERPs are founded on: data cholesterol. Those of us who designed and built enterprise applications on Oracle, DB2, Sybase, Informix, SQL Server, and others were preoccupied with leveraging the breakthrough advantages of relational databases, which for the first time let us create complex data structures flexible enough to model and automate a wide range of business processes.

Our preoccupation with creating these breakthrough data models and loading massive amounts of data into them blinded us to the difficulty of pulling the data out once it is in there. None of the leading enterprise software vendors ever made it a priority for their customers to easily archive or purge data; we were simply too enamored with the goal of creating integrated repositories of business information to worry that customers might not want all that data forever. The complexity of ERP data models, with their webs of referential integrity, makes pulling data out very difficult: remove records carelessly and application problems follow. So, ten years after Y2K, many corporations are suffering from data cholesterol. Even mid-sized companies are amassing terabyte-plus databases that are growing at 30 to 70 percent every year.
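
To make the point concrete, here is a minimal sketch, in Python on SQLite with hypothetical orders and order_lines tables, of why purging an ERP-style schema is harder than loading it: a foreign key blocks the naive delete, and an archive has to extract and remove the whole business object in dependency order. The schema and names are illustrative only, not any vendor's actual data model.

    import sqlite3, json

    # Hypothetical two-table "business object": an order and its lines.
    con = sqlite3.connect(":memory:")
    con.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled
    con.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer TEXT)")
    con.execute("""CREATE TABLE order_lines (
        line_id  INTEGER PRIMARY KEY,
        order_id INTEGER NOT NULL REFERENCES orders(order_id),
        sku      TEXT, qty INTEGER)""")
    con.execute("INSERT INTO orders VALUES (1, 'Acme Corp')")
    con.executemany("INSERT INTO order_lines VALUES (?, 1, ?, ?)",
                    [(10, 'WIDGET', 5), (11, 'GADGET', 2)])

    # A naive purge of the parent row fails: child rows still reference it.
    try:
        con.execute("DELETE FROM orders WHERE order_id = 1")
    except sqlite3.IntegrityError as e:
        print("naive delete rejected:", e)

    # Archive-then-purge has to treat the order plus its lines as one object:
    # extract everything first, then delete children before the parent.
    order = con.execute("SELECT * FROM orders WHERE order_id = 1").fetchone()
    lines = con.execute("SELECT * FROM order_lines WHERE order_id = 1").fetchall()
    print("archived business object:", json.dumps({"order": order, "lines": lines}))

    con.execute("DELETE FROM order_lines WHERE order_id = 1")
    con.execute("DELETE FROM orders WHERE order_id = 1")
    con.commit()

In a real ERP, a single business object can span dozens of interrelated tables, which is exactly why pulling data back out was never as simple as putting it in.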

There are several reasons why databases of this size are not ideal, despite continuing price-performance improvements in computing power and storage fueled by Moore’s Law. First, the current legal and regulatory environment creates potential liabilities from keeping data longer than legally necessary. Second, maintaining predictable application performance is much tougher with these data volumes. Third, while CPU and storage costs continue to decline, it is no longer easy to free up IT budget to throw hardware at the problem. Fourth, the labor cost associated with managing terabyte-sized application environments is growing at a time when CIOs must do more with less. And finally, the proliferation of large amounts of production data in non-production systems (for development, test, and training purposes) exposes the company to data privacy liability.

This perfect storm of forces on today’s IT organizations makes data cholesterol an urgent issue for many of them.

The New Era: Proactive Data Management

So, what to do about this structural problem of exploding data growth? Enter the new era of data management software products available from a wide range of software vendors, including IBM (Optim), Informatica (Applimation), HP (Outerbay), Solix, and others. These products include:

Data archiving: The ability to extract data from relational structures as whole business objects with full data integrity, enabling companies to implement data retention policies that comply with legal requirements by archiving and purging data, while reducing storage costs and improving application performance. These products offer out-of-the-box business object support for the leading ERP and CRM packages, along with the ability to define complete business objects for custom applications.

Application decommissioning: The ability to archive the data in legacy applications and retire them, thereby reducing application maintenance and hardware costs.

Test data management: The ability to flexibly subset production data for test, development, and training purposes, thereby reducing storage costs and the time needed to create, refresh, and manage test systems (and enabling faster application enhancements).

Data privacy: The ability to de-identify sensitive data in non-production systems while preserving its context for accurate testing, thereby supporting compliance with data privacy regulations (a brief sketch of subsetting and masking together follows this list).
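
As a rough illustration of the last two capabilities, here is a minimal sketch, again in Python on hypothetical SQLite tables, that copies only a subset of customers (and their orders) into a non-production system and replaces personal data with consistent pseudonyms so that relationships and test scenarios still hold. The tables, columns, and masking scheme are assumptions for illustration, not any vendor's product behavior.

    import sqlite3, hashlib

    def pseudonym(value: str, prefix: str) -> str:
        # Deterministic masking: the same input always maps to the same token,
        # so joins and test scenarios keep their relationships.
        digest = hashlib.sha256(value.encode("utf-8")).hexdigest()[:8]
        return f"{prefix}_{digest}"

    prod = sqlite3.connect(":memory:")   # stand-in for the production database
    prod.execute("CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
    prod.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
    prod.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                     [(1, "Ada Lovelace", "ada@example.com"),
                      (2, "Alan Turing", "alan@example.com"),
                      (3, "Grace Hopper", "grace@example.com")])
    prod.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                     [(10, 1, 99.0), (11, 1, 15.5), (12, 3, 42.0)])

    test = sqlite3.connect(":memory:")   # stand-in for the test/dev copy
    test.execute("CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT, email TEXT)")
    test.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")

    # Subset: copy only a slice of customers (here, odd-numbered IDs) ...
    subset_ids = [row[0] for row in prod.execute(
        "SELECT customer_id FROM customers WHERE customer_id % 2 = 1")]
    placeholders = ",".join("?" * len(subset_ids))

    # ... and mask the personal data on the way into the non-production system.
    for cid, name, email in prod.execute(
            f"SELECT * FROM customers WHERE customer_id IN ({placeholders})", subset_ids):
        test.execute("INSERT INTO customers VALUES (?, ?, ?)",
                     (cid, pseudonym(name, "CUST"), pseudonym(email, "MAIL") + "@test.invalid"))

    # Copy only the orders that belong to the subset, preserving referential integrity.
    for row in prod.execute(
            f"SELECT * FROM orders WHERE customer_id IN ({placeholders})", subset_ids):
        test.execute("INSERT INTO orders VALUES (?, ?, ?)", row)

    test.commit()
    print(test.execute("SELECT * FROM customers").fetchall())
    print(test.execute("SELECT * FROM orders").fetchall())

Because the pseudonyms are deterministic, the same customer always masks to the same token, so duplicates, joins, and test cases behave much as they would against production data.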

In this era of tight IT budgets, data management software offers compelling near-term ROI through CPU and storage cost reductions and labor productivity improvements.

In Conclusion

Organizations with production databases of 500 gigabytes or more may benefit from one or more of the data management solutions described above. By doing so, they will extend the useful life of their ERP systems and increase their agility to meet the challenging economic climate of the new decade.

Marc Hebert, Chief Operating Officer, Estuate

