When people hear "utility computing," they tend to think of traditional utility models and envision an IT environment in which users can obtain and pay for computing power wherever and whenever they want. However, traditional utilities are not particularly responsive or flexible. Others see utility computing as a new method to more rapidly access business applications on demand to meet corporate objectives.
As discussed in the accompanying Executive Report, utility computing is actually a combination of these ideas. It can be defined as the automated deployment and management of computing power and applications on an on-demand basis.
Key demand-side drivers for utility computing include complexity, cost control, and competition. Supply-side vendor forces driving the utility computing market include:
- Slowing sales of traditional legacy hardware products and software applications
- Escalating competition to win greater customer loyalty as existing products and applications become commodities
- The availability of new enabling technologies that make it possible to package, price, and deliver computing power and applications more flexibly
The theoretical benefits of utility computing fall into two categories -- greater operating efficiencies and more innovative operations -- and can be summarized as follows:
- Higher system utilization and optimization
- Better application management and maintenance
- More predictable operating costs
- Greater ROI
- Renewed focus on core competencies
- New business-application and revenue-generating opportunities
- Stronger competitive advantage
In addition, competition among a widening array of hardware and software vendors, telecommunications carriers, specialized service providers, and even regional value-added resellers (VARs) is creating a buyer's market for enterprises seeking utility computing solutions. IT vendors and service providers are therefore rearchitecting their products and restructuring their go-to-market and competitive strategies to win a share of this emerging market.
In essence, utility computing combines hardware technologies, software applications, operational processes, and technical skills to meet an enterprise's business objectives.
A number of key technologies are enabling utility computing to become a reality, including service provisioning, Web services and service-oriented architecture, virtualization, blade technology, and grid computing. More established technologies -- such as IP networking, security, and service-level management -- are also essential building blocks for utility computing.
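To make the provisioning idea concrete, here is a toy sketch (in Python) of the core mechanics these technologies automate at far larger scale: capacity allocated from a shared pool on demand, released when no longer needed, and billed per unit-hour actually held. All class and method names here are hypothetical illustrations, not the API of any vendor's provisioning product.

```python
# Toy model of utility-style provisioning: tenants draw capacity from a
# shared pool on demand, release it when done, and pay only for what they
# held. Real provisioning and virtualization layers expose far richer
# interfaces; this only illustrates the pay-per-use pattern.

class UtilityPool:
    def __init__(self, total_units, rate_per_unit_hour):
        self.total_units = total_units  # capacity the provider owns
        self.rate = rate_per_unit_hour  # price per unit-hour
        self.allocated = {}             # tenant -> units currently in use
        self.charges = {}               # tenant -> accrued cost

    def available(self):
        """Units not currently allocated to any tenant."""
        return self.total_units - sum(self.allocated.values())

    def provision(self, tenant, units):
        """Grant capacity if the pool can cover it; fail fast otherwise."""
        if units > self.available():
            raise RuntimeError("insufficient capacity")
        self.allocated[tenant] = self.allocated.get(tenant, 0) + units

    def release(self, tenant, units):
        """Return capacity to the shared pool."""
        self.allocated[tenant] = max(0, self.allocated.get(tenant, 0) - units)

    def meter(self, tenant, hours):
        """Accrue pay-per-use charges for the tenant's current allocation."""
        held = self.allocated.get(tenant, 0)
        self.charges[tenant] = (
            self.charges.get(tenant, 0.0) + held * hours * self.rate
        )


pool = UtilityPool(total_units=100, rate_per_unit_hour=0.10)
pool.provision("acme", 30)   # scale up for a peak workload
pool.meter("acme", hours=8)  # bill only for capacity actually held
pool.release("acme", 30)     # scale back down; no idle cost accrues
print(pool.charges["acme"])  # 30 units * 8 h * $0.10/unit-hour
```

The design point the sketch captures is the shift from owning fixed capacity to metering variable use: costs track the `meter` calls, not the size of the pool.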
One of the biggest potential obstacles to broad-based market acceptance of utility computing is the lack of industry standards that can ensure the interoperability of multivendor solutions. A number of standards groups are addressing this challenge. However, standards wars are not new to the IT industry; in fact, many standards initiatives have failed in the past because of proprietary battles between industry leaders.
Another critical challenge in the utility computing model is the ability to manage a more dynamic, heterogeneous IT environment. This means going beyond the primary IT management functions of fault and problem resolution to more effectively handle asset, configuration, security, and performance management tasks.
Most enterprises have neither the in-house staff nor the financial resources to implement and effectively manage a utility computing model on their own. Moreover, most enterprises view utility computing as a means toward their ultimate goal of reducing their internal focus on IT operations. Many enterprises therefore equate the migration to utility computing with outsourcing all of their IT operations or selectively outsourcing (aka out-tasking) some of them.
Whether an enterprise seeks a wholesale transfer of IT operations or a specialized service to meet a specific business requirement, utility computing has become synonymous with outsourcing; it is the latest incarnation of the "shared services" model, which dates to the 1960s and 1970s. The dot-com era and the emergence of the Web hosting business created a new breed of security, storage, and other external service providers (xSPs) offering out-tasking services. Utility computing draws on the best ingredients of traditional outsourcing and xSP services. And as a growing number of enterprises seek to address their IT and business-application needs together through a single outsourcing arrangement, business process outsourcing (BPO) has also become closely associated with utility computing solutions.
Of course, skeptics still doubt vendors' ability to deliver on their utility computing promises. Many enterprises, however, believe they are already realizing tangible benefits from this new IT architecture by leveraging a variety of existing xSP and BPO services. These enterprises are not concerned with the exact name for or definition of utility computing; instead, they see multiple paths to building a utility computing environment. This view has also intensified the competitive climate by allowing a wide array of players to fight for a share of the utility computing market.
Market acceptance of utility computing is growing, but enterprises still face numerous obstacles to successfully adopting this new IT model. Many enterprises will have to reshape the way they operate in order to successfully implement utility computing. The good news is that enterprises have a variety of product-based or service-oriented alternatives for implementing utility computing. The challenge lies in determining the right path or mix of approaches to adopt.
Click here to download a free copy of the Cutter Consortium Executive Report, "Utility Computing: A Reality Check and Prescription for Success" by Jeff Kaplan.
Cutter Consortium helps companies leverage IT for competitive advantage and business success through its comprehensive range of consulting, training, and content, provided by the leading expert practitioners in business and IT.