In the IT department at Austin Radiological Association (ARA), CIO Todd Thomas focuses on a system monitoring tool that provides an end-to-end view of the company's environment, drilling past the core elements of storage, network, server and desktop to the operating system and application.
ARA, which provides outpatient imaging services and professional services to hospitals in central Texas, used to run a classic element manager to monitor its network "for up-and-down status," Thomas said. With the software contract up for renewal in 2010, the time was right to add visibility through the hypervisor, he said. ARA has virtualized about 30% of its production servers. "We wanted an end-to-end capability," he added, but the trick was monitoring server and network performance across multiple business services and tying metrics to those services.
ARA chose AccelOps Inc.'s system monitoring tool, which takes in data points from various domains and presents them in a dashboard, to manage the virtual infrastructure. "Because they provide it as a virtual appliance, it installs very easily to provide a holistic view," Thomas said.
Thomas plans to use the monitoring platform to educate the business about IT services, service levels and responsibilities. "We'll present a dashboard to desktop support analysts to show the services we support," he said. "The goal is to resolve problems more quickly, and to manage end user expectations more effectively."
System monitoring tools define 'normal' behavior throughout the day
At Baron Capital Inc. in New York, Henry Mayorga relies on Reston, Va.-based Netuitive Inc.'s system monitoring tool for visibility into a virtualized environment that's more than two years old and spans 95% of the company's production servers. Mayorga, Baron Capital's manager of network technologies, began experimenting with VMware 3.0 on a couple of noncritical servers, then moved email and production for the back office into the environment, "and before we knew it, the print servers were there, and the domain controllers," he said. "We found it was very effective for us to deploy and manage the lifecycle."
Although he enjoyed the benefits of centralized management and improved business continuity and disaster recovery capabilities, Mayorga discovered that "the problem with a product like VMware is that it adds complexity." Virtualization shields the hardware, storage subsystem, operating systems and applications -- layers that have different performance characteristics, he said. It's possible to get information -- through the Simple Network Management Protocol, or SNMP, for storage, for example -- but "the problem is, there's a lot of information coming at you," he added. All these measurements are relative to the overall performance of the system. If the entire machine is affected, it's even more complex, he said.
Moreover, traditional monitoring platforms from such companies as Hewlett-Packard Co. and Quest Software Inc. are based on IT staff setting threshold performance, Mayorga said. "Maybe those parameters are not a problem, [but] this is a huge issue when you are trying to monitor a virtualized environment," he said. "At the end of the month, it might be fine to have 90% utilization; in the middle of the month, not so much."
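Mayorga's 90%-utilization example can be made concrete. The sketch below is a hypothetical illustration (not any vendor's actual logic): a static threshold fires on any reading over 90%, while a calendar-aware check compares the reading against what is typical for that day of the month. The `typical_by_day` table stands in for baselines that would be learned from history.

```python
def static_alert(utilization):
    """Classic element-manager check: alert whenever a fixed line is crossed."""
    return utilization > 90.0

def calendar_aware_alert(utilization, day_of_month, typical_by_day):
    """Alert only when the reading is well above the norm for that day.

    typical_by_day maps day-of-month -> expected utilization (assumed
    here; a real tool would learn it from historical data).
    """
    return utilization > typical_by_day.get(day_of_month, 50.0) + 15.0

# Illustrative baselines: month-end close is busy; mid-month is quiet.
typical = {28: 85.0, 15: 40.0}

static_alert(92.0)                       # fires regardless of the date
calendar_aware_alert(92.0, 28, typical)  # quiet: 92% is normal at month-end
calendar_aware_alert(92.0, 15, typical)  # fires: 92% is abnormal mid-month
```

The same 92% reading produces a false positive from the static check at month-end but a meaningful alert mid-month from the calendar-aware one, which is exactly the distinction Mayorga describes.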
Mayorga chose Netuitive's monitoring platform because it "gives me a performance envelope that is meaningful, and understands what performance measurements I should look at, over time intervals." The solution isn't perfect: Normalizing data from multiple sources is very difficult, he added. "It's not about the graphics. It's about how is it performing now -- how is it performing over a period of time -- and over the longer span of time."
Virtual system monitoring tools required
ARA and Baron Capital are on the front line of a trend being driven by virtualization: new system monitoring tools that look beyond the core elements in a virtualized environment to ferret out problems and even anticipate them. These tools free IT departments from reactive mode and enable proactive decision making, experts said.
"Most enterprises now have an infrastructure that is less static, and will be looking to new monitoring capabilities," said David Williams, research vice president at Gartner Inc., an analyst firm based in Stamford, Conn. "Technology that was designed in the client/server era is not appropriate today," he said. The mid-'90s monitoring infrastructure was static and cannot provide the "behavior learning" required in a virtual environment to detect subtle deviations from normal activity before they have a potentially negative impact on IT services, he added.
Unfortunately, most enterprises with internal data centers still focus on the elements, Williams said. These companies struggle to look at services that are unbound from the hardware. "Elemental monitoring involved resources and set policies," he said. "With virtualization, the opportunity to deploy appliances, and for the end user to allocate resources, removes the foundation for elemental monitoring. When you move a workload from A to Z, the underlying infrastructure might change. If you move the virtual components, the physical and virtualization environments both need to be monitored."
Being able to see patterns of data at a certain time of day enables analysis that is closer to real time; knowing what happened previously helps the monitor decide whether current behavior is normal or abnormal. VMware Inc. recently purchased Integrien Corp. to gain this capability, and companies like Netuitive also look at data behavior based on historical trends, Williams said. Such virtual system monitoring tools allow IT departments in large enterprises to meet service levels, increase efficiency and capitalize on new service delivery opportunities.
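The "behavior learning" idea Williams describes can be sketched in a few lines. The per-hour baseline below is our illustration of the concept, not Integrien's or Netuitive's actual algorithm: record what a metric normally looks like at each hour of the day, then flag readings that deviate sharply from that hour's learned pattern.

```python
import statistics

class HourlyBaseline:
    """Learn a metric's normal range per hour of day; flag large deviations."""

    def __init__(self):
        self.history = {h: [] for h in range(24)}  # hour -> past readings

    def observe(self, hour, value):
        self.history[hour].append(value)

    def is_abnormal(self, hour, value, z_limit=3.0):
        samples = self.history[hour]
        if len(samples) < 2:
            return False  # not enough history yet to judge
        mean = statistics.mean(samples)
        stdev = statistics.stdev(samples) or 1e-9  # guard against zero spread
        return abs(value - mean) / stdev > z_limit

baseline = HourlyBaseline()
# A week of quiet overnight readings and busy morning-peak readings.
for day in range(7):
    baseline.observe(3, 10.0 + day * 0.5)   # ~10-13% utilization at 3 a.m.
    baseline.observe(9, 80.0 + day * 0.5)   # ~80-83% utilization at 9 a.m.

baseline.is_abnormal(9, 84.0)  # within the learned morning pattern
baseline.is_abnormal(3, 84.0)  # far outside the learned overnight pattern
```

The same 84% reading is normal at the morning peak but anomalous overnight; a static threshold would treat both identically, while the learned baseline catches the subtle deviation before it escalates.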
Let us know what you think about the story; email Laura Smith, Features Writer.