May 2004 Issue > Feature: Data Center Automation
Who's Managing Your Data Center?
Pradeep Shankar & Robin Mathews
Managing 50,000 servers across 155 data centers worldwide can turn into a Herculean task. It also speaks volumes about the huge maintenance costs and the sheer number of system administrators that would be needed to even attempt it. EDS has steered its way out by automating its entire IT infrastructure, and has seen improved efficiency in its data center environment by automating the complete lifecycle of managing business applications and the underlying infrastructure. It has also saved $100 million along the way using automation software.

How does automation software really help you manage your data center? Suppose the W32.Blaster worm slips into one of your servers. The bug can freeze your corporate website, shut down servers, and slow down the network. But the attack can be contained and the havoc stopped: automation software lets IT administrators apply a patch to thousands of servers with the click of a mouse, a process that would take weeks if done manually.
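
To make that "click of a mouse" concrete, the sketch below shows the basic pattern such a tool builds on: fan the same patch command out to every server in parallel instead of logging into each machine by hand. This is a minimal illustration only; the servers.txt inventory file, the key-based SSH access, and the yum command are assumptions, and commercial suites layer scheduling, rollback, and audit trails on top of this pattern.

# Minimal sketch of a bulk patch rollout (illustration only).
# Assumes key-based SSH access to each host and a hypothetical
# "servers.txt" inventory file with one hostname per line.
import subprocess
from concurrent.futures import ThreadPoolExecutor

PATCH_COMMAND = "sudo yum -y update openssh"  # hypothetical patch step


def apply_patch(host):
    """Run the patch command on one host and report success or failure."""
    try:
        result = subprocess.run(
            ["ssh", "-o", "BatchMode=yes", host, PATCH_COMMAND],
            capture_output=True, text=True, timeout=600,
        )
        return host, result.returncode == 0
    except subprocess.TimeoutExpired:
        return host, False


def main():
    with open("servers.txt") as inventory:
        hosts = [line.strip() for line in inventory if line.strip()]

    # Patch many servers in parallel instead of one at a time by hand.
    with ThreadPoolExecutor(max_workers=50) as pool:
        for host, ok in pool.map(apply_patch, hosts):
            print(f"{host}: {'patched' if ok else 'FAILED'}")


if __name__ == "__main__":
    main()

Run against a long inventory, a pattern like this finishes in minutes or hours where a manual rollout would take weeks.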

Patch management is just one of the many things that can be done with automation software. The IT administrator can use it for application provisioning, software upgrades, server configuration, and reconciling IT operations with business policies. Imagine doing all these tasks manually, especially when the server estate has grown to gargantuan proportions. Imagine thousands of machines, running dozens of operating systems, with heterogeneous configurations—machines with different clock speeds, disk capacities, numbers of network interfaces, or network connectivity. These servers have different support hardware, such as back-up power supply systems (UPS), cable configurations, and so on, to say nothing of different OS versions, patch levels, and installed utilities. Add to this the numerous applications running on different platforms. Different groups in an enterprise often have access to different machines, and can remotely change the software, upgrade it, or shift resources around the data center. Keeping track of all these numerous and often simultaneous processes is beyond the capacity of IT managers working by hand.
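
The other half of that job is reconciliation: hold a desired baseline, collect each server's actual settings, and flag whatever has drifted. The hypothetical sketch below hard-codes the baseline and two servers' settings for illustration; real products gather this data continuously through agents and tie remediation back to business policy.

# Toy illustration of policy reconciliation: compare each server's reported
# configuration against a desired baseline and flag drift. The hosts and
# settings below are hypothetical; real tools collect them via agents.
DESIRED = {
    "os_patch_level": "SP4",
    "ntp_server": "ntp.corp.example",
    "backup_agent": "installed",
}

FLEET = {
    "web01": {"os_patch_level": "SP4", "ntp_server": "ntp.corp.example", "backup_agent": "installed"},
    "db07": {"os_patch_level": "SP2", "ntp_server": "ntp.corp.example", "backup_agent": "missing"},
}


def find_drift(desired, fleet):
    """Return, per server, the settings that differ from the baseline."""
    drift = {}
    for host, config in fleet.items():
        diffs = {key: (config.get(key), want)
                 for key, want in desired.items()
                 if config.get(key) != want}
        if diffs:
            drift[host] = diffs
    return drift


if __name__ == "__main__":
    for host, diffs in find_drift(DESIRED, FLEET).items():
        for key, (actual, wanted) in diffs.items():
            print(f"{host}: {key} is {actual!r}, policy requires {wanted!r}")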

Data center automation has come as manna from heaven. The IT manager’s problem is an opportunity for others. A slew of vendors, including HP, IBM, Sun, Opsware, Moonlight, and BladeLogic, have jumped on the bandwagon. Their promise: cut costs, speed up deployment, ease problem diagnostics, protect applications, and dramatically improve the quality of operations.

Market Trends
The largest trend in IT is the vast migration from client/server architecture to Web-based architecture, which has been under way for the last eight years. Server shipments are also on the rise. In 1995 about half a million servers were shipped worldwide. Last year some 12.4 million were shipped, and this year shipments are expected to grow by another 5 million.

This explosive growth in server shipments puts pressure on enterprises to manage them. “There is no [manual] way to manage the new processes and technologies,” notes one analyst. The shift in technology architecture and the server explosion together add up to a huge opportunity for companies developing data center automation tools.

With the increase in servers, the number of applications developed is also on the rise. New platforms like J2EE and .NET have cut application development time significantly (by as much as 10:1). That means lower costs of developing and deploying applications, and consequently a much larger number of applications.

This ten-fold increase in the number of applications and servers within the enterprise has created three big problems. The first is cost: analysts say that 75 per cent of the technology budget goes to ongoing operations. The second is security: securing that many servers and applications is much harder than it used to be. The third is quality: the quality of applications is crucial because they are used not only by internal employees but also by customers and partners.

“So the punch line of that is we have to move from the manual way to an automated way of managing servers,” says Ben Horowitz, CEO of Opsware. “Enterprises need solutions that can manage servers automatically across the life cycle. It is very much like a traffic control system. If you have only 5 or 10 servers, you do not need an automation system. If you have 1,000, you had better have one.”

“Growth in this market will be bolstered by the need for IT departments to do more with less, so as to achieve operational efficiencies and cost savings. A key driver will be the ability to increase the number of servers that can be managed by an individual system administrator,” says Tim Grieser, VP of Enterprise System Management at Gartner.

Pie Carving
The market opportunity is real. With 13 million servers worldwide today, a number estimated to grow by another 5 million this year, data center automation software companies stand to benefit. And Horowitz is laboring hard to ensure that he gets the larger slice of the pie. “The addressable market for us is companies with more than 100 servers, which is about 30% of that total. Our current solution-per-server is $2500, leaving us a market opportunity of $5.4 billion today,” he claims.

Others, too, are trying to carve the pie to their own size.
San Mateo, CA-based Moonlight is a case in point. Founder Bobby Mukherjee lets anyone log on to his site, download Moonlight’s product, and run an evaluation without spending a dollar upfront. “This is a compelling differentiator, especially for prospects who are looking for solutions in this space,” says Mukherjee. “In marketing technical products—like the one that Moonlight markets—the savvy customer will buy based on technical merit. So the trick is to get prospective customers to lay hands on the technology as quickly as possible.” The free-download concept may be a good hook, and even after giving away free trials Mukherjee has managed to hold on to 20 clients, roughly comparable with Opsware’s 15.
Leading the fray is Waltham, MA-based BladeLogic (see cover story), which has marquee clients like Wal-Mart, USAA, AT&T Wireless, Sprint, and General Electric.
BladeLogic’s rival Opsware claims good customer traction, particularly in server-heavy verticals—financial services, insurance, government, defense, and service providers—but Ronni Colville at Gartner is cautious about the company. “If Opsware did not have the business relationship with EDS, and the resulting revenue of $50-60 million, it would be no different than BladeLogic,” says the research director for enterprise management at the firm.

This is not a numbers game. “To succeed in the data-center automation market, software vendors need to deliver specific capabilities to be considered a viable competitor,” says Dev Ittycheria, BladeLogic’s president and CEO. They include the ability to accommodate spontaneous or ad hoc changes in system configurations, manage new software releases, and reconcile IT operations with business policies, he says.

How long can an enterprise afford to stay manual? That is the question these independent players constantly put to their prospective customers. “The penetration of sophisticated automation tools in the data center is extraordinarily low. A large majority of enterprises out there are just starting now or do not have anything at all in the way of sophisticated solutions,” says Dave Tabors, general partner at Battery Ventures. But is there room for new entrants? “For someone [investors] to fund a startup in this space now, we’ve got to answer the question: What differentiated offering is this new company going to bring to the marketplace? Though the opportunity in the space is enormous, it is extremely difficult for a new entrant to grab a reasonable market share. In any space, there are always four to five venture-backed companies that are successful. In this space there are already good venture-backed companies in the market.”

The Established Equalizers
The competitive landscape is quickly changing. While independent players like BladeLogic, Opsware, and Moonlight compete to automate data centers, established players like Hewlett-Packard, Sun, IBM, and Microsoft are seeking new high-growth markets. They have taken the ‘utility computing route.’ The idea is to let companies tap into computing power as they need it, creating a seamless, automated system. Utility computing—also called adaptive or on-demand computing—aims to offer computing power as a usage-based service, like water or electricity. “But the players realized that they couldn’t get it off the ground without some serious attention to the ‘plumbing,’” says Colville. And the acquisitions of the last eighteen months seem to bear out Colville’s statement.

With these acquisitions, the bigger players have gone about finding the missing pieces of the utility computing puzzle. There are three parts to the play: storage, applications, and servers, and there are enough examples of acquisitions in all three. Take, for instance, Sun Microsystems’ November 2002 acquisition of server provisioning company Terraspring. Last year, Sun acquired Pirus Networks, which provided storage virtualization. Soon after that, a purchase in the application provisioning space seemed like the inevitable next step for Sun.

The acquisition of CenterRun for $66 million has helped Sun fill a critical hole in its N1 utility computing strategy. Sun’s vision for N1 is to take different data-center components and use virtualization software to give a company a holistic view of its technology resources.
IBM provides utility services with its On-Demand Computing offering. Last year, IBM stepped up its efforts by purchasing ThinkDynamics, a software company that automatically provisions servers, storage, and software based on changing demand. According to Forrester, “ThinkDynamics gives IBM a data center automation product directly competitive with Sun’s N1 Provisioning Server and HP’s Utility Data Center (UDC).” Building on the ThinkDynamics acquisition, IBM has delivered two products that move server automation into the mainstream. IBM’s message has been that companies can achieve utility computing either by outsourcing their data centers to IBM Global Services or by working with Global Services to pull together a number of hardware and software products from IBM and its partners. IBM is also pushing a data center approach in which customers can set up utility computing themselves.

HP has a comprehensive strategy—the Adaptive Enterprise—which promises to make the job of deploying applications, bringing servers up and down, and tracking data center configurations considerably easier. In the last six months, HP has acquired Baltimore Technologies’ identity management solution, Talking Blocks, and Persist Technologies. Earlier this year, HP snapped up IT automation companies Consera and Novadigm. “These [acquisitions] give HP a more complete tool set to respond to IBM’s autonomics efforts,” says Rick Sturm, an analyst at research firm Enterprise Management Associates. “These developments are important for anybody trying to move into the automatic, adaptive operations space.”

“With the Adaptive Enterprise solution, one can analyze the demand for resources, prioritize them, and move them—automatically and dynamically—to where they are needed. This is done through our virtualization capability. The Utility Data Center is one piece of this entire puzzle, wherein the entire data center can be virtualized,” says Nick van der Zweep, director of utility computing and virtualization for HP.

However, competing with IBM, which has an expansive consulting business that gives it an edge in selling and configuring its systems, could be tough. “We watch IBM very closely,” admits Zweep, but he is quick to add, “I don’t think they have as large a number of reference customers [as we have] in this space.”

Even as Zweep watches IBM’s moves, he believes that “software is going to be critical to HP’s business moving forward, especially if the adaptive enterprise strategy is to take off.” He doesn’t rule out more acquisitions in this space. Colville adds a note: “Computer Associates (CA) is ominously quiet. I think it is watching the revenue model mature before taking a swipe at the market, which it surely will.”

While Sun, IBM, and HP are making every effort to make their utility computing play interesting, Microsoft is touting its Dynamic Systems Initiative, a software-architecture and application-development approach to running data centers with minimal human intervention. The first set of technologies to support the initiative is integrated into the Windows Server 2003 operating system, with additional development tools, applications, and management products scheduled to ship over the next few years.

There is an interesting play in the storage space too. After all, storage is an integral part of the data center. Just like the hardware vendors, storage players are tripping over each other in the race to announce a utility computing strategy. Eager to expand quickly beyond its core business, storage software maker Veritas took a big step by spending $600 million to acquire two software companies—Precise Software, whose software measures application performance, and Jareva, whose software automates server provisioning. Veritas’ interest in the space did not end there. In January this year, the company acquired application virtualization technology developer Ejasent for $59 million in cash. These acquisitions inch the company closer to delivering on the utility computing promise.

Even as Veritas makes aggressive moves, EMC has been watching it closely. Last December, EMC acquired virtual machine software vendor VMware for $635 million in cash. The acquisition heralds EMC’s foray into the data center management software space. But an acquisition by itself doesn’t mean anything. “The key challenge is to integrate the companies into the existing portfolio,” says Jose Iglesias, vice president of the Integrated Product Division at Veritas. “The offering in the marketplace should be an integrated product rather than separate products. When the customer buys software, he is making a long-term investment. Unlike hardware, software is not as easily interchangeable.”

Analysts believe that Veritas has been successful in integrating Jareva into its utility-computing vision. With this, Veritas could emerge as a top-tier competitor in this space as its utility computing promise becomes more clearly instantiated in product and service offerings.

“No customer has a single brand of hardware. They typically have multiple platforms. Our [Veritas] approach has been heterogeneous support from the beginning. If the bigger players want to stay in the race, they will have to change their behavior and support other platforms,” predicts Iglesias.
What should customers do? Because of the sprawling, interconnected nature of the technology, customers must make their first choices wisely. Sharing data between applications will be critical, which means a single vendor can end up with a lock on certain platforms. “If an organization uses the HP or IBM approach, it will tend to be a commitment,” says Sturm.

Independent vendors like BladeLogic, Moonlight, and Opsware echo Sturm’s observation. “We think that an independent company will win in this market. If you look at the biggest new markets of the last twenty years, you will find that it is always new companies that win. In the relational database market, Oracle walked away as the winner despite competition from IBM. In storage management software, Veritas built a better file system that worked on all platforms and gained a competitive advantage over IBM and HP. In the application server market, it looks like BEA will be the clear winner,” says Horowitz.
“Nobody wants to be stacked up against IBM, Sun, or HP,” says Mukherjee. “Customers do not like to get tied into a technology stack with a single vendor. Anybody who is going in for automation software will essentially look for independent vendors.”

However, Zweep does not buy this argument. “Any automation software can help you install software onto servers and configure them. But the real power comes when enterprises become adaptive enough to move resources from one application to another.” That is where the bigger players will have an advantage. It looks like the transition to a utility model will be gradual, starting with application automation and progressively spreading to the entire data center.

“The big worry is, how are all of these players [independent vendors] going to make money?” asks Richard Fichera, vice president at Forrester Research. “It is quite obvious that whatever point management solutions the enterprise deploys will eventually have to integrate into a larger management framework. I doubt whether the independent players will exist three years from now.”
How soon will the independent players get swallowed? “It is my strong belief that this market has sufficient depth and breadth. There is a clear opportunity for independent companies to grow up and be real companies, which would mean public companies with revenues of several hundred million dollars,” argues Dave Tabors. “Obviously, this is part of the broader system software and systems management space, which is why companies with large market caps want a play in this market. That means potential acquirers will be present in the market in a big way.”

The Dilemma
So, should a CIO deploy solutions offered by independent vendors? “For him [the CIO], element-level management solutions provided by independent companies are a good option, but only for the short term. In the longer run, the CIO has to deal with an aggregate entity, which is very complex. The larger picture is managing distributed software plus all the hardware pieces that work with storage, network, and servers. This is where solutions offered by the bigger players start looking attractive in a complex environment,” says Fichera.

The competitive landscape seems to be as complex as the components of the data center itself. A cut across the entire space reveals that while the bigger players continue to promote utility computing strategies, independent software providers are delivering data center management applications to help business-technology managers move toward a utility-computing architecture. So whichever vendor you begin with, it looks like you will ultimately come to accept the realities of the utility computing vision.