"The biggest challenge is building something that will last for years," said Kathleen Agnew, who recently worked on a data center consolidation and virtualization project as ITS operations manager at King County Library System (KCLS) Service Center in Issaquah, Wash. King County spans 2,134 square miles on Puget Sound (including Seattle), nearly twice the geographic size of the average U.S. county. Home to more than 1.9 million people, it's also the 14th most populous county in the nation.
During the KCLS consolidation project, the IT and facilities departments worked together to bring in an electrical engineer to help define the request for proposal, particularly with regard to the UPS system; an architect to assist with pulling permits; and a mechanical engineer to ensure that the consolidated data center is structurally sound, Agnew said. "In this day and age, you have to call them all in," she said.
IT may own the blueprint of the future, but facilities owns the blueprint of the building, and that usually determines where pipes and cables are laid, as well as where vents and access control points are located. IT's involvement at the beginning of a data center consolidation and virtualization project helps eliminate the need for expensive retrofitting later.
Likewise, facilities staffers need to be aware of IT issues such as network security, said experts such as Richard Donaldson, a board member of DataCenterPulse.org, a nonprofit group of data center owners and operators in 68 countries. "There is still a very distinct wall between IT and facilities that will need to be broken down as all things will live 'on net,'" he said. "This implies basic network security."
If they haven't done so already, CIOs need to reach out to facilities and begin the discussion about how business processes will invariably change as automation begins, Donaldson said.
IT and facilities must look through the same lens
Part of the problem has been that IT and facilities departments traditionally use different monitoring tools, even for something as simple as temperature. Facilities departments typically use a building management system that employs its own software and communication protocols and isn't open to TCP/IP or an open systems view. This creates a situation in which IT has little visibility into the day-to-day performance characteristics of the facility. When the power and cooling are insufficient for the IT hardware -- as may become the case in a densely populated virtualized environment using blade servers -- the result is a system outage.
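One common workaround for that visibility gap is a small protocol gateway: poll the building management system's sensor registers and re-publish the readings in a TCP/IP-friendly format that IT's monitoring tools can consume. The sketch below assumes a hypothetical BMS that reports temperatures as tenths of a degree in 16-bit registers; real systems use vendor-specific protocols such as BACnet or Modbus, each with its own encoding, so this is illustrative only.

```python
import json

# Assumed register encoding: temperature in tenths of a degree
# Celsius, stored in a signed 16-bit register. Real BMS protocols
# (BACnet, Modbus, proprietary) vary by vendor.
REGISTER_SCALE = 0.1

def decode_temperature(raw_register: int) -> float:
    """Convert a raw 16-bit BMS register value to degrees Celsius."""
    # Interpret the register as signed so sub-zero plant-room
    # temperatures decode correctly.
    if raw_register >= 0x8000:
        raw_register -= 0x10000
    return round(raw_register * REGISTER_SCALE, 1)

def to_json_payload(sensor_readings: dict) -> str:
    """Re-publish decoded readings as JSON, a format any TCP/IP-based
    IT monitoring system can poll over the network."""
    decoded = {name: decode_temperature(raw)
               for name, raw in sensor_readings.items()}
    return json.dumps(decoded, sort_keys=True)

# Simulated poll of two computer-room air conditioner (CRAC) sensors.
payload = to_json_payload({"crac_supply": 181, "crac_return": 243})
print(payload)  # {"crac_return": 24.3, "crac_supply": 18.1}
```

In a real deployment the decoded payload would be served over HTTP or pushed to the IT monitoring platform on a polling interval, giving IT the same view of facility conditions that facilities staff already have.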
One solution is to introduce an advanced monitoring/management platform that treats IT and facilities as a single entity; a converged view emerges that includes everything from servers, switches and storage equipment to power generation, distribution, cooling and environmental systems. Rakesh Kumar, an analyst at Gartner Inc. in Stamford, Conn., said he expects the deployment of such proactive energy monitoring systems to become a standard business practice over the next three years. The best of these systems can scale to collect performance metrics in real time, regardless of the number of devices or the distance between them. Active monitoring can also be used to redistribute power loads across multiple UPS devices.
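The UPS load-redistribution idea can be sketched with a simple balancing policy: spread the total load across units in proportion to each unit's capacity. This is a hypothetical policy for illustration; the commercial platforms described above apply their own vendor-specific transfer logic.

```python
def rebalance_ups_loads(loads_kw, capacities_kw):
    """Redistribute the total power load across UPS units in
    proportion to each unit's capacity (a simple illustrative
    policy, not any specific vendor's algorithm)."""
    total_load = sum(loads_kw)
    total_capacity = sum(capacities_kw)
    if total_load > total_capacity:
        raise ValueError("aggregate load exceeds aggregate UPS capacity")
    # Each unit's target load is its share of total capacity.
    return [total_load * cap / total_capacity for cap in capacities_kw]

# One overloaded unit and two underused ones: after rebalancing,
# the 200 kW unit carries twice the share of each 100 kW unit.
print(rebalance_ups_loads([150.0, 40.0, 10.0], [100.0, 100.0, 200.0]))
# [50.0, 50.0, 100.0]
```

The point of active monitoring is that this calculation can be re-run continuously as loads shift, rather than discovered only after a unit trips on overload.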
Experts even recommend that people in the IT and facilities departments sit together to develop relationships that will benefit the business through innovation and quick response to potential crises.
Let us know what you think about the story; email Laura Smith, Features Writer.