Data lifecycle management no panacea

By Elisabeth Horwitt

29 Jul 2008 | SearchCIO.com

Enterprise CIOs are starting to think outside the storage box when it comes to data lifecycle management.

Until recently, most real-life deployments of data lifecycle management (DLM) have taken the form of Hierarchical Storage Management (HSM) or multi-tiered storage: moving data off high-end storage subsystems onto cheaper media, particularly tape, as it ages.
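To make that age-based rule concrete, here is a minimal sketch of the kind of migration job an HSM tool automates. The tier paths and the 90-day cutoff are illustrative assumptions, not details from the article or any particular product:

```python
import shutil
import time
from pathlib import Path

# Illustrative tier locations and threshold -- assumptions for this sketch.
TIER1 = Path("/mnt/tier1")        # high-end storage subsystem
ARCHIVE = Path("/mnt/archive")    # cheaper media (e.g., a nearline or tape-backed volume)
MAX_AGE_DAYS = 90                 # migrate anything not touched in 90 days

def migrate_cold_files() -> None:
    """Move files whose last access is older than MAX_AGE_DAYS to the archive tier."""
    cutoff = time.time() - MAX_AGE_DAYS * 86400
    for path in TIER1.rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            dest = ARCHIVE / path.relative_to(TIER1)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), dest)   # data "ages off" the expensive tier

if __name__ == "__main__":
    migrate_cold_files()
```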

However, HSM alone won't address exploding information costs and regulatory demands that enterprise CIOs now face, said Stewart Buchanan, a research director at Stamford, Conn.-based Gartner Inc. They need to deploy an "overarching enterprisewide information architecture that integrates storage needs and costs with business applications and processes," he said. That is what DLM is, or should be, about.

DLM architectures would typically include, in addition to HSM, an archiving system that indexes all critical and compliance-related information, backs it up, and stores it where it cannot be tampered with yet can be discovered and accessed reliably and quickly. Deduplication and compression of all files ensure efficient use of the available storage space.
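As a rough sketch of how deduplication and compression can work together in such an archive, the snippet below stores one compressed copy per unique content hash and keeps an index for later discovery; the store and index paths are assumptions for illustration, not the behavior of any vendor's product:

```python
import gzip
import hashlib
import json
from pathlib import Path

# Illustrative locations -- assumptions for this sketch only.
STORE = Path("/mnt/archive/objects")     # one compressed copy per unique content hash
INDEX = Path("/mnt/archive/index.json")  # maps original path -> content hash for discovery

def archive_file(path: Path, index: dict) -> None:
    """Deduplicate by content hash, compress on first sight, and record in the index."""
    data = path.read_bytes()
    digest = hashlib.sha256(data).hexdigest()
    blob = STORE / digest
    if not blob.exists():                       # duplicate content is stored only once
        STORE.mkdir(parents=True, exist_ok=True)
        blob.write_bytes(gzip.compress(data))
    index[str(path)] = digest                   # e-discovery looks files up here

def archive_tree(root: Path) -> None:
    """Archive every file under root and persist the discovery index."""
    index = json.loads(INDEX.read_text()) if INDEX.exists() else {}
    for path in root.rglob("*"):
        if path.is_file():
            archive_file(path, index)
    INDEX.write_text(json.dumps(index, indent=2))
```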

To get a sense of DLM's potential paybacks, consider the case of ING Investment Management Americas. Five years ago, the firm's data was spread across direct-attached storage on some 575 servers. Storage capacity demands and administrative costs were skyrocketing, with email storage doubling every year since 2001. There was no comprehensive business continuity strategy.

In 2004, the company began rolling out an enterprisewide information architecture based on NetApp Inc. storage area networks (SANs) and Symantec Corp.'s Enterprise Vault HSM and archiving platform.

According to a recently published analysis by The Alchemy Solutions Group, jointly sponsored by NetApp and Symantec, ING Investment's new information architecture has provided the following paybacks over a four-year period (numbers are approximate):

  • $2.31 million in cost savings from storing data more efficiently and recapturing storage space.
  • $10 million saved by consolidating direct-attached storage onto a SAN, and adopting a multi-tiered storage strategy.
  • $677,000 in cost savings from compression of email, file and print data and adoption of a tiered storage strategy.
  • $238,000 in reduced costs for more efficient handling of regulatory e-discovery requests. "As a result of the Enterprise Vault implementation, we've reduced the time required for responding to a typical discovery request from as much as three weeks to two or three days," Mark Kolodzej, vice president of IT at ING, said.

It's a matter of policy

The past year or two have seen a spate of DLM products from major storage players like EMC Corp., Symantec and Hewlett-Packard Co., as well as a bevy of smaller players. According to a report by Framingham, Mass.-based IDC, archiving and HSM software posted a healthy year-over-year growth rate of 14.9% in the second quarter of 2007.

And yet, many business DLM deployments have stalled, primarily because companies have failed to define "an effective, comprehensive set of archiving and data migration policies," Buchanan said. This is no easy task: Since policies need to reflect business and regulatory priorities, they need to be a collaborative effort involving not just IT and storage administrators, but also "all the stakeholders, including email administrators, compliance officers and business managers," Buchanan said.

"We've reduced the time required for responding to a typical discovery request from as much as three weeks to two or three days."
Mark Kolodzej
vice president of ITING Investment Management Americas

The easiest criterion for migrating information to cheaper storage is age. However, companies in highly regulated industries often want to go further, classifying data by how quickly or how often it needs to be accessed, by who sent or received it, or by keywords or number strings. Kazeon Systems Inc. and Njini Inc. are two vendors whose archiving platforms support such complex data management policies. The real challenge, however, is defining those policies and ensuring that they remain viable over time, with little or no human intervention.
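As a rough illustration of what such classification policies look like when expressed declaratively, the sketch below assigns a record to a storage tier based on sender, keywords and last access; the field names, keyword list and thresholds are made-up assumptions, not any vendor's rule language:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Toy record and rule set -- names and thresholds are illustrative assumptions.
@dataclass
class Record:
    sender: str
    body: str
    last_accessed: datetime
    recipients: list = field(default_factory=list)

COMPLIANCE_KEYWORDS = {"account number", "ssn", "trade confirmation"}
REGULATED_SENDERS = {"compliance@example.com"}

def storage_tier(rec: Record, now: datetime) -> str:
    """Return the tier a record belongs on under simple, declarative rules."""
    text = rec.body.lower()
    if rec.sender in REGULATED_SENDERS or any(k in text for k in COMPLIANCE_KEYWORDS):
        return "compliance-archive"        # keyword- and sender-based retention
    if now - rec.last_accessed > timedelta(days=365):
        return "nearline"                  # age- and access-frequency-based tiering
    return "tier1"
```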

Data deletion policies in particular are a major sticking point for many companies. IT departments are petrified that the data they trash will be required by a user or lawyer two days or a year later. It's much safer to stick it on tape. As a result, most companies end up paying someone to store an ever-growing collection of backup tapes.

"Playing data shuffle between different storage tiers" isn't enough to get information costs under control, Buchanan said. Enterprise CIOs need to tie data retention policies to business needs, and design policies that determine when information has reached the end of its lifecycle. Like any living organism, an enterprise information architecture requires constant recycling in order to grow and remain healthy.

Elisabeth Horwitt is a contributing writer based in Waban, Mass. Write to her at editor@searchcio.com.