Rows of flashing slot machines buzz throughout Viejas Enterprises' tribal casino. An outdoor arena erupts with light and sound as entertainers such as country singer Randy Travis and comedian Bill Cosby take the stage. Energy emanates freely from the epicenter of this Indian reservation 30 miles east of San Diego -- that is, except inside its data center.
Two years ago CIO Moti Vyas' IT staff was plugging new servers into the nearest outlet inside a cramped server room, only to receive calls from the facilities department. You're drawing too much power from the circuit, they said. More power would also be needed to cool the new servers, yet the server room was nearing its power threshold. All this because Viejas Enterprises was trying to keep up with Californians' growing appetite for gambling and entertainment.
So Vyas added another server room to gain a little more power and space -- a temporary fix as he planned a grander solution. Viejas Enterprises eventually made a multimillion-dollar investment to build an energy-efficient data center. "There were business drivers and technology drivers," Vyas says. "Business drivers were to support today's business and plan for tomorrow's needs. The technology drivers were power, cooling due to blade servers and virtualization, security and future-proofing."
Like many CIOs, Vyas had hit a wall when faced with the need to scale up his server room, largely because of energy concerns. Last year U.S. data centers consumed more than 60 billion kilowatt-hours of electricity at a cost of about $4.5 billion, according to the Environmental Protection Agency (EPA). A good chunk of this power -- up to 60% in some cases -- goes to cooling servers. Data centers accounted for about 1.5% of the country's total electricity consumption, not to mention a massive volume of harmful carbon emissions.
These numbers have risen quickly, nearly 40% between 1999 and 2005, according to a survey by the Uptime Institute, a provider of educational and consulting services. And they may double in the next five years to more than 100 billion kilowatt-hours, according to the EPA.
"Electricity, which is the lifeblood of data centers, is going up," says Andrew Fanara, product development team leader for the EPA Energy Star program, a voluntary labeling program initiated in 1992 to promote energy efficiency in products such as computer hardware. "When demand for something goes up, prices go up. We've been starting to see that for quite a few years, and you probably can expect more of that in the future."
If your eyes are glazing over at these massive numbers, here's one from AFCOM, an association of data center professionals, that will wake you up: Over the next five years, power failures and limits on power availability will halt data center operations at more than 90% of all companies. Market researcher Gartner Inc. predicts 50% of IT managers will not have enough power to run their data centers by 2008. Expect a rise in outages, along with a pressing need to add more space and power to meet computing demands.
Faced with such an acute need, Vyas spent six months researching and crafting an energy-efficient data center design. Everything from business-case analysis to power supplies to the local weather factored into the plan. Emerging technologies, of course, played a major role. "The new paradigms of blade servers and virtualization forced us to revisit and change our design completely, from cooling to airflow to power requirements," Vyas says.
Today, Viejas Enterprises has one of the most energy-efficient data centers in the midmarket. The single-story data center makes good use of blade servers and virtualization, giving it a high level of computing density. The layout and air conditioning systems keep servers relatively cool. Backup generators are at the ready. The data center draws electricity from a separate power line from the local utility. "If the power line goes down, we immediately switch to a UPS [uninterruptible power supply] and then, in a few seconds, switch to a generator," Vyas says.
The Vendor Challenge
Top tech vendors are racing to stem the tide of rising energy usage with server virtualization tools and state-of-the-art server cooling services. Hewlett-Packard Co. began offering a service called thermal zone mapping to help customers identify airflows and mixing patterns inside the data center. The service creates a three-dimensional image of a data center that shows the "zones of influence" of each computer room air conditioner (CRAC). The service's target customer is a company with 50 blade servers or 10 10-kilowatt racks.
"These customers will run into heat and cost issues they've never thought about before," says Brian Brouillette, a vice president with HP Services. Average cost for the highest level of thermal zone mapping, including thermal sensors: $100,000. Armed with this knowledge, CIOs can move CRACs and servers to achieve optimal cooling.
Meanwhile, at LinuxWorld in San Francisco this year, IBM announced that dozens of customers had bought IBM tools that consolidate Unix and x86 workloads onto IBM System p servers. The tools aim to help customers such as Volkswagen and Telefonica Moviles, a mobile operator in Spain, become more energy efficient. "It balances better for each service, each server," says Miguel Angel Garcia Hafner, technology manager for value-added services at Telefonica Moviles.
HP and IBM have also launched massive data center consolidation projects of their own to lower energy costs and contain carbon emissions -- and to act as proof points for their energy-efficient offerings. HP consolidated 87 data centers into six, while IBM plans to consolidate 3,900 servers onto 33 System z mainframes running Linux. Dubbed "Big Green," IBM's $1 billion project includes building a state-of-the-art data center in Colorado. "We'll double our compute and data capacity without increasing our power consumption or carbon footprint," says Richard Lechner, vice president of IT optimization at IBM.
The Green Light
HP and IBM are among the tech companies that have banded together this year to ride the green wave by forming a nonprofit consortium, The Green Grid. The consortium is developing ways to measure and benchmark a data center's energy efficiency. It's a daunting task that covers many parts of energy consumption: CPUs, power supplies, servers, applications, building construction and humidity, among others. In addition to standard data center energy measurements, The Green Grid wants to create a green seal of approval for individual technology products, says Colette LaForce, vice president of marketing at Rackable Systems Inc. and a Green Grid board member. Other Green Grid members include Microsoft, Sun Microsystems Inc., Dell Inc., Intel Corp., Advanced Micro Devices Inc. (AMD) and VMware Inc.
Green products can cost a bit more than traditional data center gear, but a range of products are now available, says COO Wendy Cebula of VistaPrint Ltd., a $255 million printing company based in Bermuda. They include energy-efficient power supplies, voltage regulators and chips, such as more energy-efficient chipsets from Intel and AMD (a CPU uses more than 50% of the power required to run a server, according to Eaton Corp., an electrical systems and component vendor). New UPS systems lose 70% less energy than older UPS systems at typical loads, according to a Green Grid report.
Two years ago, VistaPrint's data center hosting provider in Bermuda pressured the company to rethink its energy usage because of the rising cost of energy. So, VistaPrint embraced virtual servers to reduce energy usage by 75%, replaced year-old physical servers with energy-efficient ones and bought a set of air conditioners that push hot air outside. "We definitely accelerated investment -- the rollout of some of these changes -- to capture some of the green benefits," Cebula says. VistaPrint expects to save nearly $500,000 over three years and reduce its output of carbon emissions by several tons this year.
Indeed, server virtualization tops the list of best practices for saving energy in the data center (see Energy-saving measures), as many CIOs have been able to consolidate virtual servers onto a handful of physical servers. Along with blade servers from stalwarts IBM, HP and Dell, server virtualization tools from high-flyer VMware (whose IPO this summer raised almost $1 billion) have lifted computing density to new heights.
Best Practices in Energy Conservation
Many CIOs have been able to save energy -- or just cold hard cash -- by consolidating virtual servers onto a handful of physical servers.
At $400 million St. Peter's Health Care Services, 110 physical servers were virtualized and consolidated onto five physical servers. At Agile (recently acquired by Oracle), hardware savings drove the virtualization move. Last year, Sunny Azadeh, senior vice president of IT at Agile, was on the verge of purchasing 3,000 servers at $7,000 each. Instead, she implemented virtual servers using products from FastScale Technology and avoided the $21 million outlay -- a savings that worked out to 2 cents per share over three years. While Azadeh personally wanted to avert the impact that 3,000 servers would have on the environment (and the utility bill), she admits that capital spending and good timing were the real green lights for the project.
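The scale of what Azadeh avoided is easy to sketch. Here is a minimal back-of-the-envelope calculation of the annual power bill for a 3,000-server deployment; the 300-watt average draw and $0.09-per-kilowatt-hour rate are illustrative assumptions, not figures from the article.

```python
# Back-of-the-envelope: utility cost of powering the 3,000 servers Agile
# chose not to deploy. Wattage and rate below are illustrative assumptions.

def annual_energy_cost(servers, watts_each, rate_per_kwh, hours=8760):
    """Annual utility cost to power the servers alone (no cooling overhead)."""
    kwh = servers * watts_each * hours / 1000.0  # watt-hours -> kilowatt-hours
    return kwh * rate_per_kwh

cost = annual_energy_cost(servers=3000, watts_each=300, rate_per_kwh=0.09)
print(f"${cost:,.0f} per year, before cooling")  # roughly $710,000
```

Cooling typically adds a comparable amount again, which is why the utility bill figured alongside the capital outlay in Azadeh's decision.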
Because a data center's inventory turns over every five years or so, CIOs have the opportunity to slowly transform their data centers into energy-efficient ones. They can also adopt best practices to reduce energy usage right now. "The feeling is that there are a lot of easy opportunities out there," says Bill Tschudi, principal investigator for the applications team at Lawrence Berkeley National Lab, adding that "30% to 40% improvements is not too hard."
The Sleep Option
One practice among many is to put idle servers to sleep, much as a screen saver kicks in on an idle PC. A typical x86 server consumes 30% to 40% of maximum power even when producing no work at all, according to Eaton Corp. Of course, powering down a server even at night requires some forethought about application spikes and availability. Peter Boergermann of Citizens & Northern Bank in Wellsboro, Pa., warns that sleep mode on servers could disrupt a data center's ability to maintain service levels as machines power up and down. "Obviously, if you step away from the PC to go to a meeting and it goes into sleep mode, it's not a big deal," he says. "But if you start turning sleep mode on with servers, that can cause you some problems."
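Eaton's idle-power figure makes the stakes easy to quantify. A minimal sketch, with the server count, maximum wattage, and utility rate as illustrative assumptions:

```python
# Sketch: annual energy wasted by servers idling around the clock, using
# Eaton's figure that a typical x86 server draws 30%-40% of maximum power
# while doing no work. Server count, wattage, and rate are assumptions.

def idle_waste_kwh(servers, max_watts, idle_fraction, hours=8760):
    """Annual kWh consumed by servers that sit idle all year."""
    return servers * max_watts * idle_fraction * hours / 1000.0

# 100 servers rated at 400 W each, idling at 35% of maximum power:
kwh = idle_waste_kwh(servers=100, max_watts=400, idle_fraction=0.35)
cost = kwh * 0.09  # assumed $0.09 per kWh
print(f"{kwh:,.0f} kWh per year, about ${cost:,.0f}")
```

Even a modest server room, in other words, burns six figures of kilowatt-hours doing nothing -- the waste that sleep schedules aim to recover.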
Another good practice is to size power supplies close to their load and feed them higher input voltages. IT equipment is rated for input voltages from 100 to 240 volts of alternating current, yet most equipment runs at the lower end of that range, where power supplies are less efficient. Eaton, for example, says an HP ProLiant DL380 Generation 5 server operates at 82% efficiency at 120 volts, 84% at 208 volts and 85% at 230 volts.
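Those efficiency figures translate directly into watts pulled from the wall. A short sketch using Eaton's DL380 G5 numbers; the 400-watt server load is an assumption for illustration.

```python
# Sketch: wall-draw for the same server load at different input voltages,
# using the Eaton efficiency figures quoted above for an HP ProLiant
# DL380 G5. The 400 W load is an illustrative assumption.

EFFICIENCY = {120: 0.82, 208: 0.84, 230: 0.85}  # input volts -> PSU efficiency

def wall_draw(load_watts, volts):
    """Power drawn from the wall to deliver load_watts to the server."""
    return load_watts / EFFICIENCY[volts]

for volts in sorted(EFFICIENCY):
    draw = wall_draw(400, volts)
    print(f"{volts} V: {draw:.0f} W from the wall ({draw - 400:.0f} W lost as heat)")
```

The difference per server is small -- roughly 17 watts between 120 and 230 volts at this load -- but multiplied across hundreds of servers, around the clock, it adds up, which is Hammann's point below.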
"A power supply is most efficient when it's used as close to its capacity as possible," says Paul Hammann, data center architect at Powerset, a developer of a natural language search engine. These energy-saving techniques don't seem like much, he says, but they all add up.
Market researcher IDC predicts that 10% of all servers sold in the U.S. this year will be blades. "We're seeing an explosion of volume-density servers, with 2007 as a crossover year," says Jean Bozman, vice president of global enterprise server solutions at IDC.
Unfortunately, advancements in power infrastructure haven't kept up with data center technology. Batteries, generators and fire extinguishers look pretty much the same as they did decades ago. "Data center power is 1940s technology," says Dr. Werner Vogels, vice president and CTO of Amazon.com. This means a data center has a finite amount of available energy that won't change anytime soon. Thus, CIOs must find ways to improve server densities and cooling techniques in order to take advantage of the limited space and reduce wasted energy.
On the server front, blade servers bring a high level of density and power efficiency to the data center. On average, a blade chassis holds eight to 16 blades. That density means the chassis exhausts a lot of heat in a very small area, so airflow must be fast and concentrated enough to keep blades from overheating. At 30 kilowatts of power per rack, a data center will need two five-ton CRACs for cooling, according to Eaton. Emerson Network Power reports that cooling accounts for 37% of electricity usage within a well-designed data center.
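Eaton's rule of thumb checks out against the standard refrigeration conversion: one ton of cooling removes about 3.517 kilowatts of heat. A minimal sketch:

```python
# Sketch: verifying Eaton's rule of thumb that a 30 kW rack needs two
# five-ton CRACs. One ton of refrigeration removes ~3.517 kW of heat.

import math

KW_PER_TON = 3.517  # standard conversion: 1 ton of refrigeration ~ 3.517 kW

def crac_units(rack_kw, tons_per_unit=5):
    """Whole CRAC units needed to remove rack_kw of heat."""
    tons_needed = rack_kw / KW_PER_TON
    return math.ceil(tons_needed / tons_per_unit)

print(crac_units(30))  # 30 kW -> ~8.5 tons -> 2 five-ton units
```

A 30-kilowatt rack works out to roughly 8.5 tons of heat, so a single five-ton unit falls well short -- hence the pair.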
A data center cannot run on blades alone because it would be impossible to cool. Even HP admits today's data center tops out at around 35% blades, although the blade maker is working to double this figure through better management and cooling practices. "If you don't focus on cooling, your data center could be unusable," says Vyas, whose new data center contains 20% blades in five chassis. "Within minutes all the servers will literally melt, especially the blade servers."
Viejas Enterprises' new data center has raised floors and perforated tiles and is designed with hot and cold aisles. A raised floor allows cold air to reach hard-to-get-to areas. If more cooling is required for a specific aisle or group of racks, existing tiles can be swapped for tiles with a higher percentage of perforation. There's a limit to that percentage, however, because perforated tiles must still meet weight-load requirements.
Virtualization plays a key role in cooling, too. "It's a key enabler not so much in the consolidation perspective but of pooling resources and moving workloads from one system to another," IBM's Lechner says. "This can eliminate a hot spot or identify a system that's underutilized for long periods of time, so you move the remaining workload off that system and shut it down completely and save the energy associated with it."
Then there are internal processes that conserve energy. Vyas' data center has a separate room where staff members prepare and test server applications, limiting the number of times anyone has to enter the actual server room. Emerson Network Power advises CIOs to keep data center doors closed and use a vapor seal to control humidity levels.
Creative Methods: From Cows to Roof Plants
Creative thinking about energy conservation is fast becoming part of a CIO's job. And there's no shortage of ideas, from tapping methane gas from cow manure as a cheap energy source to natural cooling via retractable roofs. Today, CIOs can tap winter air for "free cooling" through air economizer systems. Liquid cooling is also getting a second look. And then there are solar panels that tap the sun for "free" energy.
For Web hosting company AISO.Net of Romoland, Calif., some 120 solar panels power water-cooled servers in its data center. AISO.Net plans to grow drought-resistant plants on its roof to reduce cooling and heating requirements by 50%. "We've always been green-minded and felt that this would be the right thing to do," says Phil Nail, CTO and co-owner along with his wife, Sherry.
Besides embracing virtual servers, VistaPrint also decided to build a new data center in Windsor, Canada, where hydroelectric power and square footage costs are 60% cheaper than in, say, Lexington, Mass., a location VistaPrint considered. "Green was a factor in choosing that location," Cebula says.
Tech giants are also building data centers near cheap power. Microsoft and Cisco are reportedly looking at Iceland to erect mega data centers, tapping into the country's geo-thermal and hydroelectric power. Google has built a data center on the banks of the Columbia River in Oregon. Another technique is to move workloads to data centers in different time zones to take advantage of lower utility rates throughout the day.
For midmarket CIOs considering a remote location to build a new data center, Aaron Branham, vice president of technology and operations at VistaPrint, has some advice: "It's cheaper to do it yourself if you have large enough economies of scale and a solid growth plan. Startups tend to have less money than more mature companies [so] it might not be possible to pay for a data center at this time. At the same time, picking a very small ISP to co-lo with you could get you in trouble if you are growing rapidly. In general, if you know you are going to need more than 20 racks, it might make sense to do it yourself." A new data center costs around $2,500 per square foot, depending on the location, says Paul Perez, vice president of scalable data center infrastructure at HP.
The EPA Weighs In
Late last year, Congress asked the EPA to study energy consumption in data centers. This summer the EPA delivered its 133-page report, "Report to Congress on Server and Data Center Energy Efficiency, Public Law 109-431," authored by researchers from the Lawrence Berkeley National Lab.
The report contains guidelines, not regulations or recommendations for legislation. Industry watchers believe the EPA has little interest in regulating data centers. The EPA's next step is to develop metrics that help CIOs benchmark their energy consumption and policies.
The report suggests that CIOs can reduce a typical server's energy usage by 25% or more through existing technologies and design strategies, such as server virtualization. The EPA recommends voluntary tax incentive programs to drive adoption of best practices and calls for the federal government to challenge CEOs to conduct energy-efficiency assessments, implement improvements and report energy performance in their data centers on a voluntary basis. "Every data center should have a meter on it," says Andrew Fanara, Energy Star product development team leader at the EPA. "We feel that all we need is a little dose of competition, not regulation."
The report has received mixed reactions: Some think it went too far, others not far enough.
Robert McFarlane, data center consultant and president of Interport, a division of Shen, Milsom & Wilke Inc., a technology consultancy, is an outspoken critic of Congress' legislative track record on technology issues and believes the report gives politicians an easy target. "Since we have no real energy policy in this country, this gives Congress and the president an opportunity to push policy that looks good, especially in an election year, and blame the helpless, namely the people running the data centers," he says.
Yet Rakesh Kumar, analyst at market research firm Gartner, says, "We were looking for a stronger carrot and much bigger stick. ... The tax incentives are marginal and there should be, in our opinion, some threat of legislation."
Peter Boergermann of Citizens & Northern Bank in Wellsboro, Pa., is wary of seeing yellow Energy Star stickers slapped on servers. He believes the tech industry, particularly software vendors, should take the lead in reducing power consumption. Many of his issues are inside the box: Just scrolling through a window causes one of his applications to use 100% of its server capacity, for instance. The critical application, which manages property appraisals, also can't be virtualized lest it get bogged down fighting for resources with other virtualized applications.
"Let's make applications more efficient," Boergermann says. "It needs to go back to the development stage so that applications are written that are green."
While a company can attain substantial energy cost savings from locating a data center near cheap power sources, a CIO needs to consider other factors, such as telecom costs, the impact on the local economy and the availability of IT talent, which is likely to be scarce in remote areas. There can also be latency issues; computer-based financial trading institutions tend to shy away from remote data centers because of their latency. "Microseconds count," says James Houghton, vice president and head of utility product management at Wachovia Corp. VistaPrint also weighed a laundry list of international concerns before settling on Canada, from tech support to the potential for natural and geopolitical disasters.
Remote data centers, though, might not even be "green" at all. Critics claim that building a data center in an area offering cheap hydroelectric power isn't completely environmentally friendly. For instance, dams on major rivers harm salmon fisheries. The EPA has taken measures to remove dams on the Snake River. The bottom line: Taking advantage of cheaper energy isn't the same as being energy efficient or green.
Regardless, most CIOs don't make green-based decisions -- they make greenback ones. In a recent Forrester Research Inc. survey, 78% of 124 IT procurement and operations respondents throughout North America and Europe said they don't write green IT into their evaluation and selection criteria for IT systems and devices. Of course, a CIO's traditional goals are to help a company make money and cut costs in the process. Many green technologies just don't yet have the needed ROI. "We can't build a data center that is completely energy efficient, because we have a business to run," Vyas says.
Case in point: A facilities manager in the San Francisco Bay Area said that he recently looked into installing solar panels at the behest of his CIO, but the high cost of retrofitting roofs to handle the weight of the proposed panels quashed the idea. It would've taken years for lower energy costs to offset the retrofit, he says. Now the company will consider solar panels only when building a new facility.
A recent survey of 374 data center managers conducted by SearchDataCenter.com found that lack of space was the most significant limiting factor for a company's data center growth, followed by power capacity, network bandwidth and cooling capacity. On the energy-saving front, the results were as follows:
50%+ said they have saved energy through server virtualization.
32% had made efforts to improve under-floor air conditioning efficiency.
17.5% had implemented power-down features on servers not in use.
11% had tried direct current power in the data center.
7.7% had tried liquid cooling for increased data center cooling efficiency.
27% hadn't taken any measures to minimize their data center power usage.
Water Cooling Redux
Credit Suisse takes a unique approach to water cooling: freezing ice at night, when utility rates are lower, to use for cooling during the day. The project was discussed during a panel on green projects at the Next Generation Data Center event this summer -- and audience members scoffed that saving on energy costs is not the same as energy efficiency. "All kinds of people are saying all kinds of things, but they have to face facts," says Kfir Godrich, CTO of EYP Mission Critical Facilities, which worked on the Credit Suisse project. "Believe me, all these financial [institutions] are leaders in what they are doing. If they are doing these kinds of things, it's something the market should look at and learn from in a positive way."
Either way, most CIOs wouldn't go back to water cooling for servers, given the risk of flooding. In a survey by SearchDataCenter.com, 65% of respondents said they would never use liquid cooling in their data centers.
Fred Stack, vice president of marketing at Liebert Precision Cooling, a part of Emerson Network Power, says most CIOs wrongly equate liquid cooling with water cooling. The liquid can instead be a refrigerant such as Freon, which Stack says has 380 times the heat-transfer capacity of air and 40 times that of water -- and if Freon leaks, it simply turns into a gas. "When it comes to liquid cooling, there must be a mind change," Stack says.
Another innovative approach to data center efficiency: a data center in a trailer. Sun Microsystems' Blackbox and Rackable Systems' Concentro have already hit the market. While a trailer can be hauled to remote areas offering cheap energy prices, the primary energy efficiency comes from an optimized density and cooling design. Known in the industry as a "closed loop," the data center's airflow is circulated rather than released outside. Rackable claims that its self-contained cooling technology saves 80% in cooling costs over traditional methods. (Vyas considered Concentro but felt the solution was too proprietary.)
Many forces are coming together to drive energy efficiencies in the data center, from cost savings to virtualization and blade technology to the green movement that's sweeping the nation. "Just as data center managers today manage their data center for application availability, security and end-to-end transaction performance, we think that moving forward they are going to have to manage energy optimization," IBM's Lechner says.
CIOs agree. Two out of five IT managers polled by IDC said that power and cooling was their top data center concern this year, while availability and redundancy came in second. The utility bill, once solely the province of the facilities manager, is now sometimes in the CIO's domain, or at the least, the CIO is accountable for its expense. "Our IT is considered a business unit. Energy cost is in the IT budget, not in facilities," says Vyas. For good or ill, CIOs are now in the energy management business.
Tom Kaneshige was a senior features editor at CIO Decisions. To comment on this story, email email@example.com.