The thing about selling technology in a down economy is that there are far fewer bandwagons to climb aboard, so the few that prove sturdy tend to get mighty crowded. Take a look at utility computing, for example. All the major vendors have a utility vision. IBM is the greybeard in the group with its on-demand initiative. Hewlett-Packard calls its version "adaptive enterprise," while Sun opts for the mysterious N1 moniker.
With that kind of firepower behind the idea, it's small wonder that utility computing is one hot topic. But there's genuine appeal behind the hype as well. Utility computing -- the idea of being able to plug into computing power as you do electricity, paying only for resources that are consumed -- has piqued the interest of chief information officers (CIOs) looking to save money by maximizing the efficiency of their often underutilized infrastructure. Companies with a huge information infrastructure, such as American Express and JP Morgan, seem sold on the concept. Moreover, some analysts say that ignoring utility computing could be a major mistake.
"CIOs that want to remain in the IT business have to stay on top of this," says Thomas Bittman, a vice president at Stamford, Conn.-based research firm Gartner Inc. "Otherwise, their CEO is going to eventually say, 'IBM says [it] can do the same thing you can do for half the price,' and the IT shop will get outsourced."
But however important it is, implementing utility computing is far from simple.
The technical side of utility computing
Here's the thing: The concept of utility computing isn't new. Web hosting companies, for example, offer a service very similar to the utility computing model.
"It's not a revolutionary idea in and of itself," agrees Jeffrey M. Kaplan, the managing director of Wellesley, Mass.-based ThinkStrategies. "The difficult economy is a major reason that utility computing has appeared as a top tech trend in such places as Gartner and The Wall Street Journal." In fact, Kaplan and other experts expect to see utility computing gain mainstream acceptance during the next several years.
But while the concept isn't new, the task of figuring out how to make it work certainly is. There are several ways to implement utility computing. Big companies, for example, will want to optimize their existing infrastructure by creating an in-house utility, while smaller companies might choose a utility service provider to exploit economies of scale. Either way, there are several categories of technology that will play a part in any utility implementation. Savvy CIOs should evaluate these technologies before plunging in full force.
Resource management technologies
To draw upon technology resources as one giant pool, rather than the mix of siloed systems and standalone hardware common today, CIOs will need to pull together a variety of different technologies.
The first is virtualization technologies, says David Kelly, principal at Upside Research, in Newton, Mass. "You want to be able to treat technology as a pool of resources that might change and come from different sources over time. So instead of tying applications to specific resources, CIOs need to tie them to pools of resources." For example, technologies such as the partitioning tools offered by VMware Inc. allow companies to carve up space on a server dynamically, changing partition sizes to match demand.
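The partitioning idea can be sketched in a few lines. This is a toy illustration with hypothetical names -- it is not VMware's actual interface -- showing a server whose capacity is carved into partitions that are resized to match demand:

```python
class Server:
    """A physical server carved into resizable partitions (capacity in CPU units)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.partitions = {}  # partition name -> allocated CPU units

    def free(self):
        """CPU units not yet committed to any partition."""
        return self.capacity - sum(self.partitions.values())

    def resize(self, name, demand):
        """Grow or shrink a partition to match its current demand."""
        current = self.partitions.get(name, 0)
        if demand - current > self.free():
            raise RuntimeError(f"not enough capacity for {name}")
        self.partitions[name] = demand


server = Server(capacity=16)
server.resize("payroll", 4)
server.resize("web", 8)
server.resize("payroll", 2)   # shrink as demand falls
print(server.free())          # 6 units returned to the pool
```

The point of the sketch is the decoupling: applications ask the pool for capacity rather than owning a fixed box, so freed capacity immediately becomes available to other workloads.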
Prioritization and provisioning software also fall into this category. After all, if IT usage is to be provided on a pay-as-you-go basis, CIOs need to find a way to allocate resources at a very granular level. "If a department needs one-tenth of a processor now, companies need to be able to ratchet down the usage to that level," says Bittman. Eventually, Bittman says, these pools of resources could grow into computer grids, although he says that not every utility will need to be run on a grid.
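The granular, priority-driven allocation Bittman describes might look like the following sketch (hypothetical names; CPU is measured in fractions of a processor, so a department really can request one-tenth of a CPU):

```python
def allocate(capacity, requests):
    """Grant fractional-CPU requests in priority order (1 = highest priority).

    requests: list of (department, priority, fraction_of_cpu) tuples.
    Returns a dict of department -> granted fraction.
    """
    grants = {}
    remaining = capacity
    for dept, _prio, frac in sorted(requests, key=lambda r: r[1]):
        granted = min(frac, max(remaining, 0.0))  # never over-commit the pool
        grants[dept] = granted
        remaining -= granted
    return grants


grants = allocate(1.0, [
    ("finance", 1, 0.1),    # gets exactly its tenth of a processor
    ("marketing", 2, 0.5),
    ("batch", 3, 0.6),      # lowest priority takes whatever is left
])
```

Here finance and marketing are fully satisfied, while the low-priority batch job is ratcheted down to the 0.4 of a processor that remains.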
If technology consumers will be paying only for what they actually use, CIOs must create the equivalent of a meter that measures electrical use -- a complicated task, at best. While there are a fair number of technologies that measure hardware usage, software usage is a different matter. The licensing issues alone are immense, points out Corey Ferengul, a vice president at Meta Group, a Stamford, Conn.-based research company. Software vendors aren't likely to be receptive to licensing schemes based on usage, since they would cut into margins significantly.
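The meter itself is conceptually simple, as this sketch suggests (hypothetical names and rate); the hard part, per Ferengul, is agreeing on what to measure and how software licensing accommodates it:

```python
class Meter:
    """Accumulates resource consumption per consumer, like an electric meter."""

    def __init__(self, rate_per_cpu_hour):
        self.rate = rate_per_cpu_hour
        self.usage = {}  # consumer -> total CPU-hours recorded

    def record(self, consumer, cpu_hours):
        """Log another increment of consumption for a consumer."""
        self.usage[consumer] = self.usage.get(consumer, 0.0) + cpu_hours

    def bill(self, consumer):
        """Pay-as-you-go charge: usage times rate, rounded to cents."""
        return round(self.usage.get(consumer, 0.0) * self.rate, 2)


meter = Meter(rate_per_cpu_hour=0.12)
meter.record("risk-analytics", 40)
meter.record("risk-analytics", 10)
print(meter.bill("risk-analytics"))  # 6.0
```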
Companies will want to automate as many processes as possible so that they can take place on the fly. For example, IT staffers won't have a week to add a server into the pool if demand is there right now, points out Ferengul. "Automation tools will take these tasks down to minutes," he says.
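In spirit, the automation Ferengul describes replaces a week of manual work with a scripted loop. A toy sketch, with hypothetical names and thresholds:

```python
def provision(pool, demand, headroom=0.2, server_capacity=8):
    """Add capacity automatically when demand eats into the spare headroom.

    pool: CPU units currently online; headroom: fraction kept in reserve.
    Returns the new pool size after any automatic additions.
    """
    while pool * (1 - headroom) < demand:
        pool += server_capacity   # the scripted step that once took a week
        print(f"auto-provisioned a server; pool is now {pool} units")
    return pool


pool = provision(pool=16, demand=20)
```

Starting from 16 units, demand of 20 triggers two automatic additions before the pool (32 units) again has comfortable headroom.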
According to Bittman, the market for automation tools is very active right now. "We've got 10 different players running toward the goal line and, at the same time, have large vendors on the sidelines gobbling them up." Witness, for example, IBM's recent acquisition of Think Dynamics, an automation vendor based in Toronto.
To treat an enterprise's technical infrastructure as one pool of resources, a lot of work must be done to make sure that disparate systems can talk to each other and share information. This means heavy-duty integration work, which most analysts predict will fall to Web services. "That presumes that, by the time all this stuff gets here, Web services will be ready, of course," says Ferengul.
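As a toy illustration of that integration work (hypothetical names; real Web services of the era exchanged XML over HTTP, typically via SOAP), each silo might advertise its spare capacity in a shared XML format that a broker then aggregates into one pooled view:

```python
import xml.etree.ElementTree as ET


def publish_status(host, free_cpu):
    """Each silo advertises its spare capacity as a small XML document."""
    elem = ET.Element("status", host=host, free_cpu=str(free_cpu))
    return ET.tostring(elem, encoding="unicode")


def pool_capacity(status_docs):
    """A broker parses the documents and sums capacity across disparate systems."""
    return sum(float(ET.fromstring(doc).get("free_cpu")) for doc in status_docs)


docs = [publish_status("mainframe-1", 3), publish_status("unix-42", 5)]
print(pool_capacity(docs))  # 8.0
```

The agreed-upon document format is doing the integration work here: neither system needs to know anything about the other's internals, only the shared schema.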
Bottom line: This list adds up to a lot of work for CIOs. Granted, some will choose to lessen the burden by going with a utility service provider, but that adds issues -- such as bandwidth and connectivity -- at the same time it takes other issues away. "From an IT standpoint, it's like having to rebuild an entire organization, much like the transformation that manufacturing plants had to make to support just-in-time manufacturing," says Ferengul. "It's just huge."
This was first published in May 2003