One hundred and sixty servers hummed, air-conditioning fans whirred and lights flashed inside St. Peter's Health Care Services' cramped data center -- the high-tech heartbeat of the Albany, N.Y.-based hospital. Life-and-death data from 150 applications coursed through wires and wireless connections, reaching the most remote corners of the hospital and handling everything from nurse scheduling to digital-image processing.
And the heartbeat was growing louder. "I saw server space constraints coming," says CIO Jonathan Goldberg.
In mid-2005, Goldberg weighed a colossal decision: whether to consolidate servers using virtual machines. Sure, the bean counters would be happy: Virtualization decouples the operating system from the physical hardware beneath it, so several virtual servers can share a single machine -- meaning St. Peter's could run more servers on fewer boxes at lower cost.
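The savings argument is simple arithmetic. A rough sketch, using the article's 160-server count but an assumed, hypothetical consolidation ratio (real ratios depend on workload):

```python
import math

# Illustrative consolidation math. Only the 160-server figure comes
# from the article; vms_per_host is a hypothetical assumption.
physical_servers = 160   # St. Peter's pre-consolidation fleet
vms_per_host = 10        # assumed virtual servers per physical host

hosts_needed = math.ceil(physical_servers / vms_per_host)
reduction = 1 - hosts_needed / physical_servers

print(hosts_needed)        # physical hosts after consolidation: 16
print(f"{reduction:.0%}")  # hardware footprint cut: 90%
```

At a 10-to-1 ratio, 160 server workloads would fit on 16 physical hosts, with corresponding savings in hardware, power, cooling and rack space.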
But the thought of virtualizing the hospital's core systems made Goldberg shudder. Would there be a performance drop-off? Could critical systems run reliably? Most important, would software developers now and in the future support their applications on a virtual server? "I needed to do some definite soul-searching," Goldberg says.
If Goldberg could have gazed into the future instead, he would have seen that server virtualization is a virtual lock today. The technology is widely praised for its near-flawless performance, trouble-free adoption and consistent return on investment. The world's largest tech vendors -- EMC, Microsoft, IBM, Hewlett-Packard -- have invested heavily in virtual server offerings. Semiconductor giants Intel and AMD are building virtual server support into their chips. And virtual servers have gained solid traction in the midmarket, where IT dollars run scarce and data center space is at a premium.
This was first published in February 2007