The proper deployment of storage and server virtualization can help companies reduce capital investment, minimize ongoing support and maintenance costs, and improve utilization of technology resources.
Contemporary enterprise storage systems are typically integrated by way of storage networks. Some storage networks are block-based (storage area networks, or SANs) and others are file-based (network-attached storage, or NAS). Irrespective of how the storage system is integrated, data of varying types is interleaved and funneled to a common set of storage controllers. Server virtualization compounds this mixing by interleaving varying data types along a common storage network connection.
When integrating virtualized server or desktop environments with virtualized storage systems, finding the proper balance becomes paramount. Enterprises need to consider, at the most basic levels, balancing the following attributes:
- Performance
- Capacity and scale
- Cost efficiency
Networked enterprise storage systems (SAN, NAS and unified storage systems) are designed to address the above requirements across all data types. However, simply investing in an enterprise storage system is insufficient to address all of the virtualized data needs of any given company.
When considering the balance among network performance, capacity and scale, enterprises must consider multiple storage media, such as low-cost, high-capacity SATA drives, which are optimal for large, sequential input/output (I/O); solid-state media, which are ideal for small, random I/O; and server-based solid-state devices, which can help tremendously for those servers that require specific performance characteristics.
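The media-to-workload matching described above can be sketched as a simple decision rule. This is a hypothetical illustration only; the thresholds, tier names and parameters are assumptions for the example, not vendor guidance.

```python
# Hypothetical sketch: choose a storage tier from a coarse I/O profile.
# Thresholds and tier names are illustrative assumptions.

def choose_tier(avg_io_size_kb: float, random_fraction: float,
                latency_sensitive: bool = False) -> str:
    """Map an I/O profile to a storage tier.

    avg_io_size_kb    -- average request size in KB
    random_fraction   -- share of requests that are random (0.0 to 1.0)
    latency_sensitive -- True for servers needing dedicated performance
    """
    if latency_sensitive:
        return "server-side flash"   # per-server solid-state device
    if random_fraction > 0.5 and avg_io_size_kb <= 16:
        return "shared SSD"          # small, random I/O
    return "high-capacity SATA"      # large, sequential I/O

# A backup stream (large, sequential) vs. a database (small, random)
print(choose_tier(512, 0.05))   # high-capacity SATA
print(choose_tier(8, 0.9))      # shared SSD
```

In practice, of course, the profile would come from measured I/O statistics rather than hand-supplied numbers, but the mapping logic is the same.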
Avoiding performance bottlenecks
Enterprises must ultimately take into consideration the applications that will be deployed in a virtualized environment. Some applications have random, small reads and writes. Transaction-based systems, such as relational databases, email and other collaborative applications, are examples of this. Other applications read and write in large blocks and are sequential, the most prominent example of which is a backup or archiving application.
Some of the most common applications and use cases deployed in enterprises include the following:
- Oracle's relational database management system (RDBMS), Microsoft SQL Server, SAP and other transaction-based systems
- Home directories
- Virtualized servers
- Virtualized desktop infrastructure
- Business intelligence and business analytics
Each application has different requirements. Some, such as RDBMS and transaction-based systems, have high-performance and large-capacity requirements and can read and write data relatively randomly across the storage capacity. Home directories tend to write data sequentially.
Virtualized environments have the added challenge of mixing various types of applications and data together on a single system. This means that despite typically having a limited number of physical connections to a storage system, the data that travels along these connections is highly varied. Database data and home directory-like data may be required to travel along the same physical paths.
Therefore, enterprises must not view an enterprise storage system as a panacea to data storage. Simply integrating an enterprise storage system, particularly in a virtualized server and desktop environment, can often result in creating performance bottlenecks within the storage network.
Instead, enterprises need to consider how data will need to be preserved, accelerated and consumed. With these considerations in mind, IT can design and implement storage strategies that enhance each of these functions. In all cases, enterprises should look to centralized, networked storage systems that can address the aggregate of peak performance needs.
From there, enterprises should seek point solutions to achieve performance requirements, such as flash acceleration devices on a per-server basis to satisfy specific application requirements (e.g., transactional databases). Enterprises should also look at how the data and performance requirements of virtual machines are distributed across physical servers, so that no single server is overburdened.
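Distributing VM demand so that no single server is overburdened is, at its simplest, a load-balancing problem. The sketch below uses a greedy placement heuristic (heaviest demand first, onto the lightest host); the VM names, host names and demand units (think IOPS) are invented for illustration, not a description of any hypervisor's actual scheduler.

```python
# Hypothetical sketch: spread VM I/O demand across physical hosts so no
# single server is overburdened. Names and demand figures are invented.

def place_vms(vm_demands: dict[str, int], hosts: list[str]) -> dict[str, str]:
    """Greedy placement: heaviest VMs first, each onto the lightest host."""
    load = {h: 0 for h in hosts}
    placement = {}
    for vm, demand in sorted(vm_demands.items(), key=lambda kv: -kv[1]):
        host = min(load, key=load.get)   # currently least-loaded host
        placement[vm] = host
        load[host] += demand
    return placement

vms = {"db1": 800, "mail": 300, "vdi": 250, "web": 150}
print(place_vms(vms, ["esx1", "esx2"]))
```

Real placement engines weigh CPU, memory and storage-path contention together, but the underlying idea of evening out demand across hosts is the same.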
Finding the balance between performance and cost efficiency represents an even more complex reality. In a fiscal sense, the concepts are diametrically opposed: performance requires an increase in investment, while cost efficiency (as the name implies) seeks to lower costs. A virtualized environment concentrates computing resources, and their associated I/O, onto fewer physical resources, which demands higher performance. Yet most organizations view virtualization chiefly as consolidation, which it also is, and look only at the decrease in investment outlays.
The need for instrumentation software
Fiscally, virtualization should be considered an evolution from an older approach to a more efficient and effective one. In making this transition, organizations are likely to reduce server investments, though the storage investment may increase; the total investment is still likely to be lower than with the traditional approach. In subsequent years, investments can fall dramatically even as demands and capacity grow.
Ultimately, one of the biggest challenges for most organizations is a lack of instrumentation. Instrumentation is software (and sometimes hardware) that monitors and reports on specific elements of an infrastructure. The complexity of a virtualized server infrastructure augmented with virtualized storage means that there is rarely a direct correlation between the virtual server or desktop and a physical disk drive. Instrumentation solutions for virtual environments and servers can greatly improve the manageability, administration and optimization of the virtualized data center.
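A minimal instrumentation sketch, under stated assumptions: the collector, VM names, datastore identifiers and latency samples below are all invented for the example. The point is the correlation step the paragraph describes: rolling per-VM measurements up to the physical storage they actually map to.

```python
# Hypothetical sketch: correlate per-VM latency samples with the physical
# datastore each VM maps to. All names and values here are invented.

from collections import defaultdict
from statistics import mean

# (vm, datastore, read-latency-ms) tuples a collector might report
samples = [
    ("vm-db1",  "lun-ssd-01",  1.2),
    ("vm-db1",  "lun-ssd-01",  1.8),
    ("vm-mail", "lun-sata-07", 9.5),
    ("vm-vdi3", "lun-sata-07", 12.0),
]

def latency_by_datastore(samples):
    """Average the per-VM samples for each underlying datastore."""
    buckets = defaultdict(list)
    for _vm, datastore, ms in samples:
        buckets[datastore].append(ms)
    return {ds: mean(vals) for ds, vals in buckets.items()}

print(latency_by_datastore(samples))
```

A real instrumentation product would pull these samples from hypervisor and array APIs continuously; the value it adds is exactly this virtual-to-physical mapping, which is rarely visible from either side alone.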
The bottom line: Virtualization is complex. It requires consideration of both cost and performance factors and needs to take application demands into account. The best virtualization implementation will employ instrumentation to refine the initial deployment over time. All that said, virtualization of servers and storage will save you money.
Benjamin S. Woo is founder and managing director at Neuralytix Inc., a consultancy based in New York, N.Y. He was previously the program vice president at IDC's Worldwide Storage Systems Research. Write to him at firstname.lastname@example.org.