In a data-driven economy, the stodgy storage marketplace has become a hive of innovation as business demand for data storage and data access grows. Megavendors are practically tripping over themselves to offer up enterprise solutions -- with technologies such as solid-state drives, virtualization-optimized storage and the cloud making their mark on CIO storage strategies.
Amid all the new storage trends and opportunities, however, one aspect of enterprise storage has not changed for CIOs: It's a tough nut to crack. "When you are planning and looking out to the horizon, you're trying to pick a point that is always in motion," said Pete Graves, CIO at Independent Bank Corp. in Ionia, Mich. "That is difficult when not only the storage needs are changing and increasing, but the technology around that is also changing."
'Workload-centric' storage the wave of the future?
In a report published in February 2012 about their top 10 storage predictions, Forrester Research Inc. analysts Andrew Reichman and Vanessa Alvarez dissect the challenges that come with crafting a business-centric storage strategy, starting with the overall growth in data and data sets. Newer products incorporating solid-state drives and automated tiering tools promise greater efficiency, albeit at a price. Even these technologies, however, will not get CIOs to an optimal storage strategy, according to Reichman's latest research.
"That stuff is all important. But I see that now as a kind of traditional way of thinking," he said. One big trend is the push to use the storage capabilities and tools of each application, rather than the agnostic software provided by, for example, a traditional storage vendor. As an indication of just how fast the IT infrastructure environment is changing, in new research published Sept. 7, Reichman advocates a "workload-centric" approach in which infrastructure teams of network, storage, server and application professionals coalesce around a major workload to optimize the service from top to bottom.
"I am seeing a trend now for people to divide their environment into the major workload categories like VMware or Oracle, or SQL Server from Microsoft if it isn't in VMware, and to start to think about what is the best fit for those major categories," he said.
The trick will be figuring out when this serves the business and when the dispersed systems slow it down. "If you have 20 different major workload categories, then you're probably better served putting the storage in the storage array, because you have too many variations," Reichman said.
Another challenge CIOs must face head-on? "The immediacy. Today the expectation is that you won't go down, and if you do, it will be for a matter of minutes," Graves said. "Your storage strategy has to match the business expectations." Data security, which in the banking industry is shaped as much by customer perception as by regulations, layers yet more complexity into a storage strategy, he said. And money (as in "never enough") is always a factor.
Such competing demands often require a mosaic of solutions. For its core banking data, including loans and deposits, Independent Bank, for example, uses global banking application service provider Fidelity National Information Services Inc., a hosting service that also stores the data generated by that platform. "But we have a lot of ancillary platforms in-house that require a storage system," Graves said. Probably the biggest impact on the bank's storage strategy has been IT's aggressive push to virtualize operations. Between servers and desktops (and more than 300 virtualized applications), the bank's data center is on track to be 90% virtualized by the end of 2012, a transformation that has added to storage capacity needs "because all of these virtualized systems are also stored," he said.
Independent Bank is a longtime storage area network (SAN) customer of EMC Corp., and Graves decided last year that it was time to upgrade. The process involved months of vendor product vetting (and testing of assumptions), as well as a meticulous business case. Because of the cost (at well north of $1 million, the SAN is one of the bank's most expensive pieces of equipment), the investment required board approval.
"Investing in that architecture at a time when we would prefer to be spending the money on getting new customers in the door" is a tough sell, said Graves, who was a commercial lender for many years before he became Independent Bank's CIO 15 years ago. "It takes effort, communication and patience to explain why it needs to be done now." Thirteen months later, new SAN technology from EMC with solid-state drives (SSDs) and dynamic tiered storage is now up and running, relying on algorithms built into the software to maximize efficiency.
Looking to the cloud for enterprise storage
Graves' new SAN comes with another important feature: appliances that allow integration with cloud storage. Independent Bank, with $2.2 billion in assets and more than 1,000 employees, likes to keep its data close, Graves said, but to ensure its future, IT cannot afford not to prepare for cloud storage. "We're getting our infrastructure ready and storage systems ready, so if we needed to, or had a reason to park data in the cloud, we could do it," he said. An acquisition or a move to a new platform are just two reasons. "I may not be able to anticipate every business need out there, but I need to have the 'componentry,' so if they decide to go down that route, I can deliver."
For Ravi Ravishanker, CIO at Wellesley College in Massachusetts, the commitment to the cloud is well under way. In the year since he took the IT helm at the women's college, Ravishanker has been chipping away methodically at local storage (the college's strategy by default). The IT department is a NetApp shop exclusively. "Our strategy is going to be increasingly moving to the cloud, while realizing that we cannot give up local storage entirely because we still need it to run certain applications, like ERP," he said. He pegs local storage in the 50- to 60-terabyte range. "One of the things I did immediately in order to take advantage of cloud storage was to change from 160 megabits of connectivity to 1 gigabit," he added.
Ravishanker's mosaic of storage solutions is impressive. He uses Code 42 Software Inc.'s CrashPlan cloud product to back up faculty and staff desktops and laptops; Sakai's open source software for course management; Luna Imaging Inc.'s software for image storage; Kaltura Inc.'s software for multimedia; Google Apps; and the Digital Commons, an institutional repository for faculty research. Local storage is "extraordinarily expensive" and "cloud storage is cheap," he said. Still, the decisions are not made in a vacuum, he stressed, but in consultation with the top officers of the college, as well as a committee of six faculty, two students and five staff.
"The overarching strategy is that we support the academic mission, and underneath that is how we actually accomplish that. One of the key pieces of our strategy is to exploit cloud-based and open source solutions, but very carefully. We are not interested in moving to the cloud 'just because,'" Ravishanker said. "The worst thing you can do is introduce something and ask users to change it two years down the road."