Iowa floods put Mercy Medical IT backup plan to test
By Linda Tucci, Senior News Writer
22 Jul 2008 | SearchCIO.com
For purposes of full disclosure, says CIO Jeff Cash, "I was in Europe the night this all happened."
Cash is responsible for IT, telecommunications and some supply chain functions at Mercy Medical Center, a 370-bed regional medical center 10 blocks from the Cedar River in Cedar Rapids, Iowa. "This" refers to the floodwaters that last month inundated the city's 100-year flood plain, then whooshed over the 500-year flood plain to the doorstep of Mercy Medical and forced the evacuation of the hospital's 180 patients.
A scary scenario, but it wouldn't have much mattered if Cash were in Timbuktu. In the three and a half years since he joined Mercy Medical, Cash and his team have methodically prepared for the forces of nature that have taken a toll on Iowans this year, from winter ice storms and spring tornadoes to the flooding Cedar River, which crested at 32 feet, more than five times its normal 6-foot level.
Information is a lifeline for medical centers. The hospital houses two data centers, one above ground and one below at opposite ends of the center's large campus. Critical services, electronic medical records, storage area networks (SANs) and data services are all fully redundant between the two.
"We've built them with the expectation that we may have to make them mobile and have been working toward that in the last couple of years," Cash said.
Mercy Medical had many business functions online when Cash arrived -- but not its medical records. Moving to electronic records meant that "infrastructure needed to be an afterthought," not a stumbling block, Cash said. So before he started rolling out any new services, IT rebuilt the infrastructure in the hospital from the ground up.
IT replaced the entire Cisco Systems Inc. network. All of the cable plant was redone and the fiber network expanded. The hospital's traditional servers were replaced with Hewlett-Packard Co. blade servers. Spares are "on the shelf," Cash said, so if one blade goes down, another can be popped in. Mercy Medical uses VMware Inc. tools to stretch the use of each blade even further. All data lives on both SANs, and backup tapes are couriered daily to a large offsite storage vault.
Between the two data centers and standardized technology, IT is "pretty fault tolerant," Cash said, but "we never let anything get much older than three years old, so if we had to bring in new equipment for the data centers, we wouldn't have to worry about figuring out the hardware."
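The failover behavior this design implies -- every critical service live at both sites, with the surviving site picking up the load -- can be sketched as a simple health-check routing rule. This is a hypothetical illustration, not Mercy Medical's actual code; the site names, addresses and probe interface are invented for the example.

```python
# Hypothetical sketch of two-site redundancy: a request is routed to the
# primary data center while it answers its health probe, and fails over
# to the secondary when it does not. All names here are assumptions.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Site:
    name: str
    address: str

def pick_site(primary: Site, secondary: Site,
              is_healthy: Callable[[Site], bool]) -> Site:
    """Return the site a service request should be routed to.

    Prefers the primary; fails over to the secondary only when the
    primary's health probe fails. Raises if both sites are down.
    """
    if is_healthy(primary):
        return primary
    if is_healthy(secondary):
        return secondary
    raise RuntimeError("both data centers unreachable")

# Example: the basement data center floods and stops answering probes,
# so traffic falls back to the above-ground site.
above = Site("above-ground DC", "10.0.1.10")
below = Site("basement DC", "10.0.2.10")
down = {"basement DC"}  # sites currently failing their health probe

chosen = pick_site(below, above, lambda s: s.name not in down)
print(chosen.name)  # -> above-ground DC
```

In practice this decision would live in a load balancer or in DNS rather than in application code, but the routing rule is the same: prefer the primary, fail over on a failed probe.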
With a robust network in place, Cash moved the hospital off its PBX telephone exchange to Voice over Internet Protocol (VoIP) supported by servers in its two beefed-up data centers. The Internet phone system proved its worth during last month's emergency.
"The massive amount of relocations we did was made considerably easier by being able to unplug a phone, move it to a new location and plug it back in," Cash said.
In addition to the VoIP phones, Mercy Medical nurses wear a wirelessly connected communication device around their necks that uses voice recognition to contact medical personnel in other parts of the hospital. "During all the chaos, the Vocera system was a tremendous help," Cash said.
IT redeployed for flood work
Not every contingency was planned for, Cash said. On the night of June 13, five feet of water surrounded about half of the hospital. A crew of 1,000 volunteers sandbagged through the evening until lightning started and the hospital pulled them inside.
About two inches of water lapped over the sandbags. The hospital popped open escalator shafts, installed temporary sump pumps and instructed volunteers to squeegee the seeping water into the shafts. Water also came through ceilings, so crews started taking the ceilings out. Wastewater became a threat when some fixtures burst.
The hospital was running on generators, but the decision was made to evacuate patients shortly after the water seeped in, because it was still unclear how high the river would go, Cash said. Emergency crews, including the National Guard, were soon on hand to assist with moving patients and evacuating the entire first and basement levels.
The 220,000-square-foot area is packed with vital services, including the pharmacy, radiology, lab services and EKG equipment, as well as physician offices, maintenance, linens and the data centers, Cash said. Everything from four feet and below needed to be ripped out.
IT staff was reorganized from top to bottom, with people dispersed to help with patients, move medications out of the pharmacy and carry equipment from the basement data center to clean and dirty sites set up in the hospital parking garage. Thanks to the blade architecture, fiber-connected network switches and overhead power, moving the basement data center was a matter of pulling network cables out of the backs of cabinets and hooking up to power from the ceiling.
Mercy uses Qwest iQ Networking services to connect voice and data communications across the hospital's 20-location network. The system from Qwest Communications International Inc. enters the hospital's basement at two points and terminates in a Qwest cabinet there. As the water came in, a call was put in to Qwest.
"We'd been talking to them for a couple of years about moving that cabinet, so we called and said, 'Today's the day, and we need it done in about 10 minutes,'" Cash recounted.
Qwest showed up quickly. Two technicians walked through the water to the cabinet, unbolted it and moved it to the primary data center above ground. The fiber was rerouted and plugged in, and the hospital was up and running on Qwest's SONET rings in about 90 minutes. Internal communications never went down. IT also keeps a suite of emergency cell phones, labeled for specific areas, charged and ready as a backup.
Cash stresses that his business continuity strategy was not completely watertight, but the holes and the "on the fly" fixes are not exactly damning. The hospital hosts its own website, so there was some scrambling to get a backup of the current website to a website developer. The developer, which happened to be local, was able to bring the site up in a couple of hours, so the hospital didn't have to worry about losing touch with its community.
All told, the hospital was down for just one day over the weekend for repairs. By Monday the cancer center was up and running, followed by the women's center. A new temporary radiology suite was built on mobile trailers. By the end of two weeks, full services were restored.
Had the hospital been completely shut down, patients would still have needed to be treated and their medical records accessed. Cash had deployed PatientKeeper, a separate Web-based system with its own redundant databases that works in concert with the hospital's system from Medical Information Technology Inc., updating medical records in real time. Doctors can access five years' worth of data in PatientKeeper with a personal digital assistant, downloading records, if need be, to a smartphone.
"We took a snapshot of the PatientKeeper and sent it to the vendor in Boston. They were able to bring up an online copy of the electronic record for us to test, so we knew it was available within an hour," Cash said. If the hospital were forced to turn off all systems, patients and physicians could still access records. The system's single sign-on, however, also meant some last-minute scrambling to come up with temporary passwords for the docs, Cash said.
The exercise, hard as it was, was a reassuring trial run for IT, Cash said. The primary data center occupies prime real estate on the first floor. "If we get a new, nice data center at some point, we knew we wanted a mobile data center so we could pick up and move."