As power, cooling and floor space become more of a premium in data centers, CIOs need a new partner: the facilities manager.
Many CIOs may recognize this, but some just can't bring themselves to make nice.
Analyst Rakesh Kumar, of Stamford, Conn.-based Gartner Inc., said there has been a dearth of data center build-out for about six years, dating to the market crash near the turn of the century. As a result, dialogs between CIOs and facilities managers have been rare.
Kumar said collaboration between CIOs and facilities will be a top issue over the next few years, as more and more companies plan to build or expand data centers. He said global data center floor space will see a compound annual growth rate of 5-7% over the next five to 10 years.
IT will decide what sort of computing systems will go into that data center, but it will usually be the facilities organization that finds a location for that data center and hires and manages the architects, engineers and contractors who build and outfit the site.
When IT and facilities organizations sit down to discuss a data center project, they might find they don't speak the same language. And there is rarely an organizational structure within a company that would bring them together, Kumar said.
"There hasn't been a need for them to talk," Kumar said. "The budget for the CIO is often separate from the facilities management budget. There has been no real need to have that dialog, until this problem with electrical power and cooling came up."
Steven Olson, IT infrastructure manager at the newspaper Las Vegas Review-Journal, recently built a new data center. He worked well with his facilities department during the project, but before that he and the department didn't always see eye-to-eye on how to manage his old, overburdened data center.
"We had some disagreements on how to handle cooling," Olson said. "A lot of the cooling we were using in the old environment was primarily industrial-grade cooling devices, not necessarily computer-grade cooling devices. We would have just more and more cooling thrown at it. The people who were doing the duct work just didn't know how to work in that environment. They were used to cooling office environments, which is different from a computing environment. It was just a level of misunderstanding."
Steve Yellen, vice president of marketing at Aperture Technologies Inc., a Stamford, Conn.-based maker of data center management software, said there has long been a "wall" between IT and facilities.
"IT would fling equipment over the wall, and facilities would install this and install this and install this," Yellen said. "They didn't have a bad relationship, but they weren't integrated. Now these blade servers are so hot, the facilities guy is saying, 'Wait a second, I can't handle this stuff.'"
Steve Sams, vice president of site and facilities services at IBM, said complex power and cooling requirements of high-density blade servers are forcing CIOs and facilities managers to find a way to talk, whether the CIO is planning to build a new data center or maximize the efficiency of his existing one.
Sams said many data centers were built to accommodate old rack-based servers. Those racks would each generate between 1 and 3 kilowatts of heat. The formula was simple and homogeneous: CIOs knew they could fill up a rack with servers, and facilities managers knew the data center would run out of floor space before it ran out of cooling and power capacity.
"In the past the IT executive would say we're on a technology growth curve of 10 or 20% every few years, and he would provide that input the same way human resources would say we're on a trajectory to hire 20% more people per year," Sams said. "So facilities would look for more office space at the same time they would be looking for more data center space."
The advent of high-density blade servers changed all that.
Sams said fire suppression is a good example of how CIOs and facilities managers speak different languages. He said a data center could have a $5,000, $100,000 or $1 million fire suppression system in place.
"But the impact of that fire prevention is significant between those solutions," Sams said. "The cheapest would just have sprinkler heads everywhere, but that will destroy millions of dollars of technology. The next solution puts out the fire, but people have to physically get out of the data center in a certain amount of time. The last solution doesn't put people at risk. Those tradeoffs in data center design need to be made clear."
CIOs may recognize the value of collaboration with facilities management, but when they open a dialog, they might find a facilities team that lacks the resources to deal with the sophisticated challenges of cooling and powering high-density servers.
Kumar, of Gartner, said facilities management organizations are often underresourced. "They're seen as much of a cost and overhead organization, and not as adding real value to the business. In some cases, they might even be outsourced. And when they're running on a thin margin, they don't have the motivation to perform above expectations," he said.
Kumar said the structure of a facilities management organization is also sometimes a barrier to collaboration with the CIO.
"Geography is one of the biggest problems we see happening, particularly with global organizations with operations around the world," Kumar said. "They may have a corporate facilities group, but they often have a federated structure. Most IT organizations have moved to a more unified, streamlined structure. Often a streamlined IT organization does not mirror itself well with the federated facilities management organization. It creates a lack of communication and a lack of transparency."
Arledge, of WakeMed Health and Hospitals, recently worked with IBM to build a new data center. "We were in a data center that was completely maxed out in terms of square footage, cooling capacity and power capacity. We had no more room for additional racks. If we put a server in, we had to take a server out."
Arledge said his relationship with facilities management was solid before the project started because his organization had been opening new locations. Whenever a new clinic opened, he would work with facilities to coordinate the installation of data and telecommunications infrastructure.
"If there were no good communication, there would have been a lot of finger-pointing, a lot of anger, a lot of unmet expectations," Arledge said. "But we were nowhere near that, because we did have good communication. When issues arose, we were able to fix them."
Let us know what you think about the story; email: Shamus McGillicuddy, News Writer