CIO primer: Virtualization basics

Virtualization is coming into its own -- and successful IT managers at companies of all sizes are figuring out how the technology can help them.

If there is one technology that took hold in the enterprise in 2005, it's virtualization. The software moved from test and development into the data center faster than anyone imagined, including the experts. And while there are sure to be some bumps in the road in 2006 as the technology moves from awareness to adoption, you can bet most IT pros will be running virtualization in the data center before the end of next year.

What is it?

Virtualization is software that allows a piece of hardware to run multiple operating system images at once. The technology got its start on mainframes decades ago, allowing Big Iron to avoid wasting expensive processing power.

In 1998, VMware Inc. -- now an EMC subsidiary -- got into the virtualization business and jumped from startup to powerhouse in just a few years by taking virtualization to the x86 platform.

VMware's timing was perfect.

The standard for x86 servers was to run one application on one server. That was fine when servers were cheap to buy and cheap to run. But things got messy as more and more servers were added to meet application demand. Data center managers were facing serious utilization issues, spending millions on server capacity that sat idle.

But with virtualization, that could all change. Instead of running one application on a machine, people could run several applications, all isolated into virtual operating system images that wouldn't affect each other.
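
To picture the idea, consider a deliberately simplistic toy -- not how any real hypervisor schedules work, just an illustration of the isolation described above: one "physical machine" hands CPU slices to several guest images, and a failure in one guest leaves its neighbors untouched.

```python
# A deliberately simplistic toy: one "physical machine" time-slices its
# CPU among several isolated guest images. Nothing here resembles a real
# hypervisor; it only illustrates that a fault in one guest stays in
# that guest.

class GuestVM:
    def __init__(self, name):
        self.name = name
        self.state = "running"   # each guest holds its own private state

class ToyHypervisor:
    def __init__(self, guests):
        self.guests = guests

    def run_round(self):
        # Hand a CPU slice to every guest that is still healthy; a
        # crashed guest is skipped and its neighbors never notice.
        for g in self.guests:
            if g.state == "running":
                print(f"{g.name}: running its own OS image")
            else:
                print(f"{g.name}: down, but isolated from the others")

guests = [GuestVM("guest-a"), GuestVM("guest-b"), GuestVM("guest-c")]
hv = ToyHypervisor(guests)
hv.run_round()
guests[1].state = "crashed"      # simulate one guest failing...
hv.run_round()                   # ...the other two keep running
```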

With the recent push for IT efficiency and consolidation, the technology has seen remarkable adoption rates and customer satisfaction. Framingham, Mass.-based IDC estimates that 75% of all companies with 500 or more employees are deploying virtual servers. And customers currently using server virtualization plan to virtualize 45% of the new servers they purchase next year.

Who makes it?

Storage, networking and server vendors are all turning an eye toward virtualization. Almost every vendor has a stake in it, including startups and open source players.

According to IDC, the mainframe, iSeries and high-end Unix systems account for the bulk of customer spending on virtualized servers today.

Joe Clabby, vice president and practice director at Boston-based Summit Strategies, said mainframes are running the largest virtualized workloads. But Windows and Linux servers are seeing the most rapid growth, thanks to VMware, Microsoft and the open source Xen hypervisor.

VMware cornered the market for x86 servers, and almost everyone agrees it is the de facto leader on the platform. A new survey of VMware customers found that 90% are using virtualization in production environments and 25% are standardizing all of their x86 applications on virtual infrastructure.

"Not to sound boastful, but we've become the leader by bringing virtualization to the most broadly deployed platform in the world -- the x86 server," said Raghu Raghuram, vice president of platform products at VMware. "We're pretty far out front in deployed customers, products and revenue."

Microsoft came a little later to the virtualization game with its Virtual Server technology. Redmond unveiled Virtual Server R2 Standard Edition earlier this month and is working on building the virtualization technology into the next version of its operating system, code-named Longhorn.

Though Microsoft has only offered Virtual Server since 2004, competitors and analysts agree that the company is a big player in any space it chooses to enter because of its huge customer base.

Xen, a virtualization technology from the open source community, rolled out version 3.0 earlier this month. Developers are working to include the code in the next version of the Linux kernel, and software companies Red Hat Inc. and Novell Inc. are working to virtualize their operating system offerings as well.

Xen is currently available in Fedora, Red Hat's free community-supported Linux distribution, but it's listed as an unstable feature. Still, Brian Stein, engineering manager in the emerging technology group at Red Hat, is confident Xen will be hardened and ready by the time Red Hat Enterprise Linux (RHEL) 5 launches next year.

"No one expects all of the kinks to be worked out in an initial release," Stein said. "It's starting to stabilize and we're pretty confident Xen will be ready for RHEL 5. We have plenty of runway."

Stein said one of the areas where Red Hat will be playing catch-up is management tools, such as VMware's VMotion. He said Xen will be used more for single-machine consolidation than for data center provisioning and management, but aggregated management tools will follow.

"Red Hat plans to see how the battle for the management space plays out, see what 80% of the users are clamoring for, and then work from there. We don't have a stake in the ground because it's going to change," Stein said. "We plan to develop strong single machine management and radiate that out to policy-based [management]. We see ourselves getting there. But it's naÏve to think that the open source community is going to drive us there."

In addition to these players, some major companies that historically had little to do with virtualization are getting involved in the technology. Networking giant Cisco Systems Inc. started promoting technology it acquired earlier this year to virtualize at the network level. IT resource management company BMC Software Inc. just launched a management tool for the virtual environment. And Advanced Micro Devices and Intel have unveiled details regarding virtualization efforts at the chip level.
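
Readers curious whether their own x86 hardware already exposes these chip-level extensions can check from software. Here is a minimal, Linux-specific sketch; it assumes a readable /proc/cpuinfo, where Intel VT-x surfaces as the vmx CPU flag and AMD-V as svm:

```python
# Minimal, Linux-specific sketch: the chip-level extensions surface as
# CPU feature flags in /proc/cpuinfo -- "vmx" for Intel VT-x, "svm" for
# AMD-V. Assumes /proc/cpuinfo is present and readable.

def hardware_virtualization():
    with open("/proc/cpuinfo") as f:
        for line in f:
            if line.startswith("flags"):
                flags = line.split(":", 1)[1].split()
                if "vmx" in flags:
                    return "Intel VT-x"
                if "svm" in flags:
                    return "AMD-V"
    return None

if __name__ == "__main__":
    found = hardware_virtualization()
    print(found or "no hardware virtualization extensions detected")
```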

What are IT pros doing with it?

In 2005, every company seemed to be selling virtualization-focused products and strategies. But with all these technologies at all the various layers of the IT stack, do the pieces actually fit together?

Probably not, according to Tony Iams, vice president and senior analyst with Rye Brook, N.Y.-based Ideas International. "Users aren't ready to mix-and-match technologies. They're using certain technologies for specific tasks," Iams said.

Clabby agreed. "There is a lot of overlap, but at this point virtualization is not a mix-and-match solution. Most people are focusing on one thing -- networking, storage or systems -- and basing the decision [to virtualize] on needs. Is storage utilization your biggest priority? Then you go with storage virtualization. If you think your systems environment is out of control, you go with server virtualization."

According to Iams, most of the current usage comes from the following:

  • Testing: IT pros are carving off parts of servers already in use for testing and development rather than buying new machines to try new things.
  • Consolidation: Organizations are reducing the number of physical servers by increasing the utilization of each machine (see the sketch after this list).
  • Support for legacy environments: Users can keep an application running on a discontinued or unsupported operating system inside a virtual machine, buying time to migrate off that platform.
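
To make the consolidation arithmetic concrete, here is a toy estimate. The utilization figures below are hypothetical -- not drawn from the IDC numbers above -- and the point is only the back-of-the-envelope math:

```python
import math

# Back-of-the-envelope consolidation estimate with hypothetical numbers:
# forty legacy servers idling at 10% CPU, consolidated onto hosts capped
# at 70% utilization to leave headroom for spikes.
legacy_servers = 40
avg_utilization = 0.10    # average load per legacy server
host_ceiling = 0.70       # target utilization cap per consolidated host

total_load = legacy_servers * avg_utilization           # 4.0 "servers' worth"
hosts_needed = math.ceil(total_load / host_ceiling)     # 6 hosts

print(f"{legacy_servers} servers -> {hosts_needed} hosts, "
      f"roughly {legacy_servers / hosts_needed:.1f}:1 consolidation")
```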

What's next?

Experts said the next step for virtualization technology is management -- treating the server farm as a virtual pool of computing resources, enabling companies to move virtual machines around or tie them together into large machine images.

"When you sit back and look at virtualizing the entire data center, that's when management becomes the critical issue," Iams said. "I can see the whole topic of systems management evolving into management of virtual systems in the future."

To see real value from virtualization, analysts said IT will need automated, policy-based management. With virtualization software creating all sorts of virtual machines, someone is going to have to go through and add patches, software, user profiles and more to those operating system images.

That provisioning is a job that could be automated, based on policies. But it won't be easy for vendors or IT pros.
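
As a rough illustration of what policy-based placement could look like, the toy loop below assigns each incoming virtual machine to the least-loaded host that still satisfies a simple headroom policy. Every host name, demand figure and threshold here is hypothetical -- this is a sketch of the idea, not any vendor's product:

```python
from dataclasses import dataclass

# Toy policy-based placement. Host names, demands and the 80% ceiling
# are all hypothetical; the point is that a policy, not a person, makes
# the placement decision.

@dataclass
class Host:
    name: str
    capacity_ghz: float
    used_ghz: float = 0.0

def place_vm(hosts, demand_ghz, ceiling=0.80):
    """Pick the least-loaded host whose post-placement utilization stays
    under the policy ceiling; return None if the policy rejects all."""
    candidates = [h for h in hosts
                  if (h.used_ghz + demand_ghz) / h.capacity_ghz <= ceiling]
    if not candidates:
        return None
    best = min(candidates, key=lambda h: h.used_ghz / h.capacity_ghz)
    best.used_ghz += demand_ghz
    return best

hosts = [Host("host-a", 8.0), Host("host-b", 8.0)]
for demand in (2.0, 3.0, 2.5, 4.0):
    chosen = place_vm(hosts, demand)
    print(f"{demand} GHz VM -> {chosen.name if chosen else 'rejected by policy'}")
```

Note that the last request in the example is rejected outright: under a policy regime, a placement that would violate the headroom rule simply doesn't happen, which is exactly the kind of decision an administrator makes by hand today.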

    "In principle, people love policy-based management," Iams said. "It ties IT policies to business priorities. But, for that to work, there is a certain amount of automation built on top of that. There are technical barriers to making that happen and social barriers. Companies have built up practices over time, so any move toward automation is going to involve some trepidation and fear."

    Companies are going to want to use fewer people to provision computing resources because humans are very expensive when it comes to managing workloads. But Clabby said automated management software won't hit an awareness level until next year, and he doesn't expect any widespread deployment for two to three years out.

    "This year, these uses and technologies became talking points between customers and vendors. Next year you're going to see the deliverables from vendors and it's going to be a very critical year," Iams said.
