One hundred and sixty servers hummed, air-conditioning fans whirred and lights flashed inside St. Peter's Health Care Services' cramped data center -- the high-tech heartbeat of the Albany, N.Y.-based hospital. Life-and-death data from 150 applications coursed through wires and wireless connections, reaching the most remote corners of the hospital and handling everything from nurse scheduling to digital-image processing.
And the heartbeat was growing louder. "I saw server space constraints coming," says CIO Jonathan Goldberg.
In mid-2005, Goldberg weighed a colossal decision: whether to consolidate servers using virtual machines. Sure, the bean counters would be happy: Virtual servers break the hard-wired link between an operating system and a physical machine, so St. Peter's could run more server instances on fewer machines at less cost.
But the thought of virtualizing the hospital's core systems made Goldberg shudder. Would there be a performance drop-off? Could critical systems run reliably? Most important, would software developers now and in the future support their applications on a virtual server? "I needed to do some definite soul-searching," Goldberg says.
If Goldberg could have gazed into the future instead, he would have seen that server virtualization is a virtual lock today. The technology is widely praised for its near-flawless performance, trouble-free adoption and consistent return on investment. The world's largest tech vendors -- EMC, Microsoft, IBM, Hewlett-Packard -- have invested heavily in virtual server offerings. Semiconductor giants Intel and AMD are building virtual server support into their chips. And virtual servers have gained solid traction in the midmarket, where IT dollars run scarce and data center space is at a premium.
While many of Goldberg's earlier questions have been answered, new ones about the future of server virtualization continue to crop up. Chief among them is the cascading effect of a virtual server on a data center. Simply put, making changes to core technology like a server may require changes to other systems, such as storage. "As midmarket CIOs start playing with more virtual servers, they are going to realize that it fundamentally changes the way they look at a data center," says Forrester Research Inc. analyst Frank Gillett.
Goldberg eventually decided to deploy virtual servers -- a "several-hundred-thousand-dollar investment" -- and today $400-million St. Peter's Health Care Services is enjoying the early benefits of the technology. Of the hospital's 160 physical servers, 110 were virtualized and consolidated onto five physical servers using EMC's VMware. Fifty-five servers couldn't be virtualized because their Food and Drug Administration-certified status prevented alterations. A few older servers were retired and shipped elsewhere. The rest have stickers that read "do not power on," and their decommissioning translates into a 25% reduction of power in the data center (prior to server virtualization, all the physical servers were turned on). These hibernating servers give Goldberg some headroom if he needs more computing power down the road.
Three-fourths of third-party applications at St. Peter's Health Care Services run on VMware, including a major workforce management system. "My sense is that they will all get there," Goldberg says. "With any new contract with application vendors, we really want them to support virtual servers."
But Goldberg's virtual server adventure is just beginning. He wonders what the future will look like as hardware and software vendors jockey for position. "I'm keeping an eye on server virtualization being embedded in the hardware by hardware vendors," Goldberg says. "Will this come to fruition? If it does, would there be a reason" to continue to have the VMware layer?
Server virtualization, says Gillett, is simply the catalyst for a much broader set of infrastructure makeovers. "It's the tip of the iceberg," he says.
The Top Five Challenges of Server Virtualization
If you're embarking on a virtualization project, here are the most important factors to consider, according to Shane Colombo, a practice leader at U.K.-based C&C Technology Consulting Ltd.
The Dawn of Virtualization
The earliest form of server virtualization appeared in Unix-based servers that supported partitioning. VMware, founded in 1998 and now owned by EMC, brought server virtualization to the masses at the turn of the millennium on servers running x86 processors from Intel and AMD. Server virtualization software divides one physical server into multiple isolated virtual environments. Throughout its evolution, the technology has been called "partitioning," "workload management," "server provisioning" and, finally, "server automation." Essentially, the term means running multiple operating systems, or multiple instances of the same operating system, on one server.
By decoupling the application from the hardware, the virtual server can now be moved easily between physical servers, and applications using different operating systems can share physical server resources. This means companies don't have to buy and maintain a bank of physical servers. Most CIOs choose to run between two and six virtual machines on a single physical server, according to Forrester, while some run 15 or more. "One guy had 40 on a box," Gillett says.
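Forrester's consolidation ratios above lend themselves to simple capacity arithmetic. The sketch below is illustrative only; the function name and the second example's numbers are assumptions, not figures from the article.

```python
# Rough capacity planning: how many physical hosts does a consolidation
# project need at a given VM-per-host ratio?
import math

def hosts_needed(workloads: int, vms_per_host: int) -> int:
    """Physical hosts required to run all workloads as virtual machines."""
    return math.ceil(workloads / vms_per_host)

# St. Peter's put 110 virtualized servers on 5 hosts -- 22 VMs per host.
print(hosts_needed(110, 22))   # -> 5
# At Forrester's more typical upper bound of 6 VMs per host:
print(hosts_needed(110, 6))    # -> 19
```

The same arithmetic works in reverse for sizing: divide the workload count by the densest ratio the hardware and applications can sustain.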
Server virtualization has transformed the data center of the Los Angeles Angels baseball team, says CIO Al Castro. Before the days of server virtualization, Angel Stadium had 35 physical servers. "The amount of underutilization was just pathetic," says Castro. "You have these boxes, and they are just being used at 5% and 10% of their capacity." Many of the underutilized servers were tied to single tasks: one for accounting, another for customer relationship management, another for IT functions, another for email. Now Castro is using virtual machines so that these applications can share resources, thus optimizing physical servers. Today the outfit has 19 servers, three of which are virtual.
Not only can server virtualization reduce the total number of servers, but there's a lot to be saved from related costs. Fewer physical servers means less data center real estate, fewer air-conditioning units and fewer administrators. "One way to look at it is that the [traditional] administrator-to-server ratio is 1-to-20," says IDC analyst Stephen Elliot, "but with server virtualization it's 1-to-200, on average."
From the cost perspective, says Elliot, "a key note is that the admin costs don't really go away, but admin time can be reallocated when moved to virtual infrastructure. Other cost reductions come from power reduction, hardware reduction and consolidation, OS license reduction, staff reallocation, increase in process efficiency, lower provisioning costs."
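Elliot's administrator-to-server ratios translate into a quick back-of-the-envelope staffing comparison. The server count below reuses St. Peter's 160-server figure for illustration; the function is an assumption, not anyone's actual staffing model.

```python
# Elliot's ratios: roughly 1 admin per 20 physical servers traditionally,
# vs. 1 admin per 200 server instances once virtualized.
import math

def admins_required(servers: int, servers_per_admin: int) -> int:
    """Administrators needed at a given admin-to-server ratio."""
    return math.ceil(servers / servers_per_admin)

before = admins_required(160, 20)    # traditional data center
after = admins_required(160, 200)    # virtualized infrastructure
print(before, after)                 # -> 8 1
```

As Elliot notes, those seven freed-up administrators are reallocated rather than eliminated, so the saving shows up as reclaimed time, not a smaller payroll.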
Server virtualization success stories continue to pour in from the front lines. A Forrester survey released last year indicates that almost 60% of companies with 500 or more employees have already implemented virtual servers, doubling the adoption rate reported in a similar study in 2005. Adoption among small and medium-sized businesses is only slightly lower than that among enterprises, says Forrester. VMware asserts that up to 40% of its customer base falls into the midmarket category.
Matt Brudzynski, a senior research analyst at Info-Tech Research Group, believes that midmarket companies are actually making a deeper commitment to virtualization than their enterprise counterparts, though he says server virtualization doesn't make much sense for companies with fewer than 15 servers. "The initial assumption was that small and medium companies just aren't using it," he says. "But when we delved into it, we found that, sure, larger companies have a larger adoption rate because they all dabble in it. But because of political complexity and risk aversion, on average only 10% of their infrastructure is virtualized.
"The deciding factor for ease of implementation had to do with how open the organization was to a major infrastructure change," he adds. "What we found was a direct correlation to that mind-set based on the size of the company."
Nevertheless, most CIOs tend to take a conservative approach to virtual server adoption. They often first deploy virtual servers in noncritical areas, such as test and development, file and print, and Web servers. "From the midmarket CIO standpoint, it's not a question of if but how much," says IDC's Elliot. "For file, print and basic services today, it's a no-brainer."
Many CIOs start their virtualization journey with storage before deploying virtual servers for applications. Susan Cerrone Abely, CIO and vice president of Roger Williams Medical Center, a $160-million hospital in Providence, R.I., implemented virtual storage to handle massive data spikes. "We can generate 400 gigabytes of data over four months," she says. "We needed to be much more adept at scaling up."
The 220-bed medical center consolidated and virtualized both its storage and primary server infrastructure last year by implementing an HP StorageWorks Enterprise Virtual Array 8000 storage area network (SAN) with 4 terabytes of initial capacity, along with 41 HP ProLiant BL20p server blades. When Cerrone Abely first researched virtualization, she discovered the hospital's application vendor, Meditech Inc., had not yet authorized its product to run on any of the major virtualization vendors, such as VMware. So she focused on virtualizing the hospital's storage instead.
"Storage virtualization, to me, means I went out and bought 400 gigabytes of storage, and I can allocate 500 gig here and 500 elsewhere -- and then say, 'I need to move that first 500 somewhere else.' It's just a large storage capacity that I allocate virtually to wherever I want it to be allocated.
"The way it's architected now, we drop in a server, it images right off the SAN, and we're ready to go," Cerrone Abely says. "It has lived up to everything that was promised in terms of scalability."
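Cerrone Abely's description of storage virtualization, one big pool carved into volumes that can be reassigned at will, can be sketched as a simple allocator. The class, volume names and gigabyte figures below are illustrative assumptions and bear no relation to HP's actual EVA software.

```python
# A minimal model of a virtualized storage pool: capacity is allocated
# to named volumes and can be released and re-allocated anywhere.
class StoragePool:
    def __init__(self, capacity_gb: int):
        self.capacity_gb = capacity_gb
        self.volumes = {}                    # volume name -> size in GB

    def free_gb(self) -> int:
        return self.capacity_gb - sum(self.volumes.values())

    def allocate(self, name: str, size_gb: int) -> None:
        if size_gb > self.free_gb():
            raise ValueError("not enough free capacity in the pool")
        self.volumes[name] = size_gb

    def release(self, name: str) -> int:
        """Return a volume's capacity to the pool for re-allocation."""
        return self.volumes.pop(name)

pool = StoragePool(4096)            # the SAN's 4 TB of initial capacity
pool.allocate("radiology", 500)
pool.allocate("email", 500)
pool.release("radiology")           # "move that first 500 somewhere else"
pool.allocate("test-lab", 500)
print(pool.free_gb())               # -> 3096
```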
That's not to say that virtualization projects are easy. Myriad issues need to be ironed out before more CIOs will consider virtualizing their mission-critical applications and, ultimately, their entire data centers. First, some applications -- large databases, anything I/O-intensive, and large-scale applications that need machines of their own -- don't perform well in virtualized environments. A virtual machine is also another layer of software with its own resource requirements, which puts an additional drain on I/O performance. Some IT vendors can add to the problem. "They'll promise that 12 virtual machines can be put on a processor, and the reality is about six" in most cases, says Brudzynski.
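Brudzynski's pitch-versus-reality gap is easy to put in numbers: plan hardware to the density you can actually sustain, not the vendor's claim. The VM count below is an illustrative assumption.

```python
# Per-processor VM density: the vendor pitch (12 per processor) vs.
# Brudzynski's observed reality (about 6 per processor).
import math

def processors_needed(vms: int, vms_per_cpu: int) -> int:
    """Processors required to host a given number of virtual machines."""
    return math.ceil(vms / vms_per_cpu)

pitched = processors_needed(48, 12)    # what the sales deck implies
realistic = processors_needed(48, 6)   # what actually holds up under load
print(pitched, realistic)              # -> 4 8
```

Planning to the pitched number and landing at the realistic one means buying twice the hardware mid-project, which is exactly the kind of surprise the article warns about.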
Another pitfall surrounds licensing. "We have talked to a few executives who were breaking the law, and they didn't even know it," says Brudzynski. "Basically, it has to do with CPU charges. Your host, your physical server, might have four CPUs, even though the virtual machine has one."
Licensing is often married to the number of instances, or images, a company is allowed to run, or the number of processors per machine, explains Al Gillen, an analyst at IDC. "If your license says you can only put one image down, you can't put two images down just because you have two virtual machines," Gillen says. Most vendors' pricing works this way.
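The per-processor versus per-instance distinction Gillen and Brudzynski describe can be made concrete with a toy cost model. The functions and prices below are illustrative assumptions, not any vendor's actual terms.

```python
# Two licensing models for the same workload: one single-vCPU virtual
# machine running on a four-CPU physical host.

def per_processor_cost(host_cpus: int, price_per_cpu: float) -> float:
    # The physical host's CPU count governs, regardless of how few
    # vCPUs the virtual machine itself is given.
    return host_cpus * price_per_cpu

def per_instance_cost(images: int, price_per_instance: float) -> float:
    # Each running image (virtual machine instance) is licensed separately.
    return images * price_per_instance

print(per_processor_cost(4, 1000.0))   # -> 4000.0
print(per_instance_cost(1, 1000.0))    # -> 1000.0
```

The gap is why the executives Brudzynski mentions were unknowingly out of compliance: counting VMs when the license counts physical processors understates what is owed by the host's full CPU count.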
But in 2005, Microsoft launched a per-instance licensing policy in place of a per-processor one. EMC's VMware has the lion's share of the virtualization market; EMC is aware of the potential licensing problems for CIOs and has begun working with several vendors to change licensing structures, Gillen says.
Still, complying with licensing requirements is one task that can stymie busy midmarket CIOs. "From a CIO's perspective it's a very complex thing,'' says Gillen. "Here's the real zinger: You may have figured out which licenses you have in place, and tomorrow you decide you are going to move one application from a machine to another and find out your licensing is not compatible with your new scenario."
Though many CIOs report few problems with their server virtualization projects, Gillen says that many could be headed for licensing headaches. "That euphoria you are hearing is associated with virtualization 1.0. Virtualization 2.0 is going to mean you have to manage these systems. As soon as you start talking about moving stuff from one machine to another, that ugly licensing issue comes back to haunt you."
A New IT Architecture?
As vendors scramble to fix licensing and performance issues, a larger quandary looms. The fundamental challenge for CIOs who want to do more with server virtualization is that they must rethink their IT architecture by organizing data components as shared resources. Virtual servers affect everything they touch -- clustering, file systems, backup, storage, management -- and do so differently from physical servers. They can also be easily moved around the network from one physical server to another to optimize a data center's resources.
That means CIOs need automation management tools and processes to track virtual servers and quickly reconfigure network connections to firewalls and load balancers, among other technologies. "It's not practical to move virtual servers around if I have to manually reconfigure it," says Gillett.
But the management technology isn't quite there yet. "Systems management software should recognize the difference between physical servers and virtual servers and manage those differences because they are not the same," says Forrester's Gillett. "This will be a two- or three-year journey with everybody working through the implications." In the meantime, the onus is on users to develop smart processes and procedures. For example, Patrick Bellor, director of infrastructure at Midwest Generation LLC, a Chicago-based power producer and sister company of Southern California Edison, has rewritten his policy book to handle incoming virtual servers; Midwest Generation has been in virtual server project planning mode for some eight months. With VMware, the goal is to whittle down 87 physical servers to about two. As of December, only four servers had been virtualized.
What's taking so long? "We're developing a roadmap for the next five years," says Bellor. "If you're just sticking in a CD, installing VMware and creating all these servers, you're going to have a management problem. It is imperative that folks focus on the process of managing this environment."
For instance, Bellor says he's fleshing out how his department will handle technology requests from the applications team on virtual servers as well as track those requests and bring them to market. Midwest Generation is working with software vendor Network General, a provider of enterprise application and network performance analysis tools, to understand how a virtual environment will affect the distributed sniffer (an application that detects network bottlenecks), storage area network and other areas of the data center. "It's definitely a culture change," he says. "I can't go up to the server, look at the monitor and see what I used to be able to see."
Smart management also means anticipating virtual sprawl, an irony given that virtual servers promise to prevent physical server sprawl. "Virtual servers are so easy to put up," says St. Peter's Health Care Services' Goldberg. Generally speaking, more technology projects will be green-lighted in a virtual server environment because they won't be weighed down by the expense of a physical server. Virtual test servers are already popping up at St. Peter's Health Care Services, Goldberg says, "and we have to remember to take them down."
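One lightweight answer to Goldberg's "we have to remember to take them down" is to record an owner and expiry date whenever a virtual test server is spun up, then periodically list the ones past due. The inventory format, names and dates below are illustrative assumptions.

```python
# Track virtual test servers with expiry dates to contain virtual sprawl.
from datetime import date

vm_inventory = [
    {"name": "test-hr-upgrade", "owner": "apps", "expires": date(2007, 1, 15)},
    {"name": "dev-portal",      "owner": "web",  "expires": date(2007, 6, 30)},
]

def expired_vms(inventory, today):
    """Names of virtual machines past their expiry date -- candidates
    to be decommissioned before they pile up."""
    return [vm["name"] for vm in inventory if vm["expires"] < today]

print(expired_vms(vm_inventory, date(2007, 3, 1)))   # -> ['test-hr-upgrade']
```

Because a virtual server costs minutes rather than a purchase order, a standing report like this substitutes for the natural brake that hardware expense used to provide.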
A Virtual World
Much of server virtualization's future lies in the hands of tech vendors. EMC's VMware posted $310 million in virtual machine software sales in 2005, capturing some 55% of the market (for the market share breakdown, see "Virtual Market Share," at right).
"Most virtualization layers don't interoperate with 'competitive' virtualization layers," says Elliot. A virtual machine is not a universal translator; rather, it's a one-to-one translator between the host operating system and the guest OS (the operating system running on the virtual machine). So vendors must decide which host and guest OSes they want to support on their virtual machines.
VMware overtook IBM in 2004 because Big Blue's virtual machine was largely available only on its mainframe and midrange RISC Unix systems, says IDC. Meanwhile, Microsoft plans to deliver the next version of its Virtual Server this year to coincide with the launch of its long-anticipated Longhorn Server. Microsoft is working with XenSource to support guest operating systems beyond Windows, such as Novell Inc.'s NetWare, the SCO Group's OpenServer and UnixWare, and older versions of Windows, IDC says.
Last year VMware began offering a virtual server product, VMware Server, for free in the hopes that users will upgrade to its flagship ESX Server. VMware Server installs on an existing physical server's host operating system and partitions the server into multiple virtual machines by abstracting processor, memory, storage and networking resources. ESX Server, on the other hand, installs directly on the "bare metal" hardware and allows multiple unmodified operating systems and their applications to run on virtual machines. "Because ESX Server runs directly on the hardware, it is more scalable and higher performing," says Raghu Raghuram, VP of product and solutions marketing at VMware.
VMware also launched VMware Infrastructure 3, a package that combines ESX Server 3.0 with a management console called VirtualCenter 2.0 to help companies manage IT environments through virtualization. With VirtualCenter, CIOs can monitor virtual machines' performance.
Indeed, midmarket CIOs want to see what the management of a virtual environment will look like down the road. CIO Barry Kadets of the Gem Group Inc., a Lawrence, Mass.-based maker of promotional products, has a server virtualization project under way using VMware, and software management is his top concern. "You need software that can manage more than one use of a machine," he says. "It's not hard, but it's not simple."
As companies continue to push virtual servers deeper into the data center -- by having them handle more mission-critical applications -- IDC foresees a portfolio of uses that will drive some of the biggest trends in technology. If applications are no longer tied to underlying hardware, for instance, companies can more easily create and manage a service-oriented architecture, or SOA.
"Virtual servers are a foundational technology," says Elliot. "This is one of those technologies that can trigger cultural change and cause enlightenment of IT."
Tom Kaneshige and Ellen O'Brien were senior editors at CIO Decisions. To comment on this story, email email@example.com.