SMITHFIELD, R.I. -- The problem at Bryant University wasn't that the data center was too big or too small. The problem was the school didn't have a data center.
Six years ago, servers were like forgotten Easter eggs from hunts past. They showed up in classrooms, supply closets, under desks. They were maintained by a patchwork of staff and students, who would leave for other endeavors and cast servers into unsupervised limbo.
In short, it was a mess that worked only because it had to.
"There were closets in classrooms that grew into what some people would call a data center," said Arthur Gloster, vice president of information services at the roughly 3,700 student university, which just four years ago was Bryant College. "Everybody had their own sandbox to play in."
Today there is only one sandbox, with everything in one 500-square-foot data center, of which only about 300 square feet is filled. Virtualization has dropped the actual number of physical servers from around 78 to about half that.
Now tucked away in a basement room on the fringes of a small, surprisingly rural campus, Bryant's data center is clean, small and doing its job by staying out of the way.
"It's been almost a year now and there hasn't been one hiccup. Not one," said Richard Siedzik, just shouting to be heard over fans that, it being a Friday on a college campus and all, were whirring along lower than they might be on, say, the night before finals.
Siedzik, director of computer and telecommunications services at Bryant, has been showing off the school's data center success at IT conferences for most of the past year. Anybody who asks will hear about the ease of modular implementation -- server cabinets literally carried inside in chunks and up and running within a day. They'll hear about the beauty of in-line, or direct, cooling, where high-heat blade servers and others get their own personal fans pointed right at them.
They may even hear the word green. For now, Siedzik can only assume he has reduced IT's power consumption -- thanks to actually building a data center, the school can accurately measure its energy use for the first time.
Not that green was the point.
"We were not even looking at energy efficiency, to be honest with you," Gloster said. No, that success was just a byproduct of working with IBM, which has emphasized the green aspect of its work in recent years. But, in the midst of a data center tear-down and rebuild, why not take the energy savings anyway?
The big building closest to the parking lot at Bryant is the George E. Bello Center for Information and Technology. It is an apt title for a school so concerned with being a wired campus.
When Gloster arrived on campus six years ago, with a string of IT work at academic institutions behind him, he was given oversight of both IT and the school's library.
It was a natural fit, considering academic libraries are less about books and more about technology every year.
Today, Bryant gives each new student a laptop and gives them new ones as they enter their junior year. Classrooms, on a campus built in 1971 when the school moved from nearby Providence, are all outfitted with in-ceiling projectors. From there, professors can pull up a wealth of information, including the library's films, and view it on a wall. Dorms are wireless and student phones are Voice over Internet Protocol. Local first responders are linked to each other through Bryant's Cisco IP Interoperability and Collaboration System (IPICS) network.
All that, of course, depends on a reliable data center -- which Bryant didn't have.
"It just gets to be a nightmare in terms of the processes, to manage that server environment," said Gerald Baseel, the IBM client executive who helped the university design and build a new data center.
For Gloster and Siedzik, the toughest part of getting the new data center was making their case to the school.
"This was a campus need and not an IT want," Siedzik said. "It was to protect our information assets.
"We had to bring our business case to senior management," he said.
Approval, bidding, design and implementation of the data center took just over a year. Planning began in late 2005. IBM signed on in 2006, and design and planning were complete within the year. The data center room was cleared out and prepared in January 2007. The crew waited until the end of the spring semester to bring the data center in.
Siedzik repeats the next fact as if he's still a bit surprised it went so smoothly: IBM techs had the data center moved in and online within one-and-a-half days. No more closets. Bryant University had a data center.
In the room
There are a few things about Bryant's otherwise nondescript data center that deserve some description.
First, there is no raised flooring. Siedzik gives two reasons for that.
One: It wouldn't fit. Standing in his data center, Siedzik pointed to the top of the server cabinets, which nearly touched the ceiling.
Two: Cooling could be more precise, eschewing fans that work overtime to cool an entire room. As it is, the servers are packed tightly together, with sets of IBM BladeCenter servers ready to overheat if they don't get specialized attention.
So the school opted for piped-in refrigerant feeding small fans pointed directly at the servers. As each server ramps up, the fans accelerate to match its load. If a pipe bursts, large drip pans just above the server cabinets divert the runoff onto the floor and away from the equipment. The school used equipment from American Power Conversion Corp. for this in-line cooling technique.
At the end of the row of cabinets are pipe fixtures for another set of cabinets. That's the "modular" part of the room. The school has space to expand quickly, if necessary. The work, Siedzik said, is as simple as carrying another cabinet in and hooking it up.
Also worth noticing is what's not there: all of the 70-something servers the school started with. Using products from VMware Inc., Bryant has managed to cut the number of physical servers nearly in half and is looking to lower the number further.
"Every day we're looking at where we can virtualize other servers throughout the campus," Siedzik said.
Gloster began virtualization work at the school not long after he arrived. The technology hadn't yet seen broad adoption in the midmarket, but Gloster said he was willing to roll the dice a few times in exchange for the value of consolidation.
"I've never been risk-averse, I guess," he mused. "Maybe I should be."
There is still a bit of work to be done. A security card reader has yet to be installed on the data center door, which is secured with standard locks. As it stands, though, few from the roughly 25-person IT staff ever enter the room, Siedzik said.
But with nearly a year of operation complete, Gloster and Siedzik are feeling good about the new data center. Total cost on the project came in at less than $1 million. And they've begun to receive a handful of awards for the implementation of the IPICS system.
In the meantime, the university will see savings begin to accumulate. Siedzik said the combination of consolidation, virtualization and new equipment should bring drops of about 40% in software subscription, support and maintenance costs; 30% in administrative staff time; and as much as 20% in data center occupancy.