This is the second in a two-part series about enterprise desktop virtualization adoption strategies. This part digs into how enterprises are deploying virtual desktop technologies and the challenges they are encountering along the way. In the first part, IT executives and experts discussed the trends that are making it easier for IT to pitch desktop virtualization as the answer to a host of business problems.
For Neal Kaderabek, CIO at Hallmark Services Corp., desktop virtualization was but one piece of a larger business transformation project to update IT systems and business processes and position his company for growth.
Desktop virtualization, however, was not an easy piece. Kaderabek's team converted 400 of the health insurance company's physical devices to virtual desktops using VMware Inc.'s desktop and server virtualization technologies.
"It's not as easy as it looks," Kaderabek said. "The hard part is getting the images created for every [user and department] role of the organization." This step alone took his team a year to complete.
When Hallmark Services began the project in early 2008, the Naperville, Ill.-based company was a trailblazer of sorts. "It required a lot of learning by the school of hard knocks," Kaderabek said. "My advice to anyone else choosing to virtualize desktops is to make sure you have some talent that has been there, done that."
Fortunately, these days enterprise organizations increasingly are sharing their desktop virtualization stories, along with lessons they've learned along the way. The main technical recommendations from CIOs who have been there and done that: Take a phased approach to desktop virtualization, and be prepared to invest in multiple desktop virtualization technologies, as well as nonvirtualized ones, to get the job done. Job No. 1 from a user perspective is creating a consistent application and data-service delivery experience for employees, regardless of the platform they're on.
"Think about [desktop virtualization] as a composition of multiple application delivery vehicles that make the experience transparent to users," said Chris Wolf, analyst at Gartner Inc., during the research firm's recent Gartner Catalyst Conference in San Diego. That might mean using a combination of technologies (not necessarily virtualized ones) to give users access to a portal of applications. "This really is a brokering technology, where virtualization can be one enabler. Software as a Service is another enabler; in fact, it is the No. 1 stop for the app onboarding workflow before [our clients] look to deploy apps internally," he said.
A phased approach to desktop virtualization
For Elio Benincasa, assistant vice president for infrastructure management at Manulife Financial Corp., desktop virtualization began with server virtualization, progressed to application virtualization and has matured in the last couple of years into a technology stable enough for enterprise organizations the size of Toronto-based Manulife.
Starting with the U.S. Life Insurance division of Manulife, the justification for a virtual desktop infrastructure (VDI) was the need to support process workers, such as call center users; in-house and offshore application developers; and some administrative and operations staff. At the same time, Benincasa was called on to give call center and application developers a consistent look and feel when they access applications.
The challenge was that some development work was being done on Windows 2000 Server and some on Windows Server 2003, while users were on Windows XP Service Pack 1 and Windows XP SP2. Benincasa had to find a way to simplify the management and maintenance of multiple systems, and give users a consistent experience when they use applications, regardless of the platform they are on. A VDI allowed him to do both.
"VDI gave us a consistent look and feel for many platforms and for many users -- from offshore developers to call center users," Benincasa said in his Catalyst conference presentation. "Another benefit was that training was simplified, and deploying a new [operating system] and new apps to the platforms became much easier."
Shift your mind-set to not care about the end device, who owns it or where it's coming from.
Dustin Fennell, vice president and CIO, Scottsdale Community College
In the beginning, the project's infrastructure was based on VMware and consisted of a server-hosted desktop and thin clients that took advantage of existing hardware platforms. Connection brokers identified users and provisioned virtual desktops for users based on their roles and access privileges in the company.
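The brokering step described above amounts to mapping an authenticated user's role to a desktop pool with the right resources and restrictions. A minimal sketch of that logic, in Python; the role names and pool mappings are invented for illustration and are not Manulife's actual configuration:

```python
# Hypothetical sketch of a connection broker's role-based pool assignment.
# Role names and pool identifiers are illustrative only.

ROLE_POOLS = {
    "call_center": "pool-standard",     # locked-down, non-persistent desktops
    "developer": "pool-dev",            # more CPU/RAM, persistent image
    "offshore_dba": "pool-restricted",  # extra compliance controls
}

def assign_desktop(user_role: str) -> str:
    """Map an authenticated user's role to a virtual desktop pool."""
    pool = ROLE_POOLS.get(user_role)
    if pool is None:
        raise ValueError(f"No desktop pool defined for role: {user_role}")
    return pool

print(assign_desktop("developer"))  # pool-dev
```

A production broker would, of course, pull roles and access privileges from a directory service rather than a hard-coded table.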
Version 1 (as Benincasa calls it) for U.S. Life Insurance began in 2007. Today he is working on a VDI plan for Manulife's U.S. Investment division with the same use case scenarios: application developers, operations support staff and offshore database administrators. VMware's ESX hypervisor and VMware View were the base technologies for the first desktop virtualization project. For this go-around, his team is using Citrix Systems Inc.'s VDI infrastructure, with the ESX hypervisor as the back end.
"Our global infrastructure folks started developing standards during this first iteration and decided to go with the Citrix [brokering] platform with ESX on the back end this time," Benincasa said. A primary reason for the change is that VMware View uses the PC-over-IP (PCoIP) remote display protocol, while the Independent Computing Architecture (ICA) protocol used by Citrix is a global standard at Manulife. "It's easier for us to run one protocol globally across all firewalls and regions," he said.
With version 2 under way, Benincasa has some advice to share:
- Understand why you are implementing VDI, who the users are and what kind of work they do: "User acceptance is a key thing. If you give users a solution that won't work with all of their apps, you're going to have a major problem."
- Some applications are still going to need a physical footprint. This was the case with Manulife's Bloomberg system that its traders use.
- The desktop virtualization landscape keeps changing. Keep an eye on it as you place your bets, and realize that acquisitions might affect your licensing terms.
- Sit down with your procurement team to define licensing terms with vendors under the VDI model. Is the per-user fee based on North American users? If it is, third-party offshore users will not be covered.
- You will be making significant infrastructure investments: "On the server side, you need to understand sizing. You don't want to implement and find out there's not enough memory running on the disks."
- Persistent images multiply costs: each user's dedicated image carries its own storage and management overhead, so those expenses scale with every image you add.
- Have a provisioning mechanism in place that meets compliance requirements and prevents offshore developers from printing documents or walking away with information, for example, on thumb drives.
Head of the class with desktop virtualization
Dustin Fennell, vice president and CIO at Scottsdale Community College in Arizona, is also a big advocate of a phased-in approach to desktop virtualization. His team moved the college's IT systems to a fully operational VDI, with 250 applications virtualized to date.
Their first goal was to make application delivery independent of platforms and devices. "Get out of the business of managing end devices. Shift your mind-set to not care about the end device, who owns it or where it's coming from," Fennell said during a presentation at the Gartner conference. To make this happen, start with applications, he advised. By virtualizing applications first, you can begin to deliver features to users immediately without having to overhaul your current infrastructure. Then, move on to virtual desktops to tackle applications that are graphics-intensive, such as Autodesk Inc.'s AutoCAD and Adobe Systems Inc.'s Creative Suite 5.
In Fennell's view, another key element is a Web portal with a provisioning system that's based on Active Directory user access to deliver apps to users located anywhere and using any device. "We are leveraging desktop virtualization and a whole host of other virtualization technologies to create on-demand access to services that our users need to consume via whatever device they happen to be using, from wherever they happen to be," he said.
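The portal logic Fennell describes boils down to deriving a user's application list from directory group membership. A minimal sketch of that mapping, with invented group and application names; a real deployment would query Active Directory over LDAP rather than use an in-memory table:

```python
# Illustrative sketch: derive a user's entitled applications from
# directory group membership. Group and app names are invented;
# a real portal would query Active Directory over LDAP.

GROUP_APPS = {
    "Students-CAD": ["AutoCAD"],
    "Students-Design": ["Creative Suite 5"],
    "Staff-Office": ["Office", "Email"],
}

def entitled_apps(user_groups: list[str]) -> list[str]:
    """Return the deduplicated, sorted list of apps a user may launch."""
    apps: set[str] = set()
    for group in user_groups:
        apps.update(GROUP_APPS.get(group, []))
    return sorted(apps)

print(entitled_apps(["Students-CAD", "Students-Design"]))
```

Because entitlement is computed from the directory at request time, the portal stays device-agnostic: any client that can authenticate sees the same application list.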
The college's desktop infrastructure includes local PCs with the operating system still installed, server-hosted applications, server-hosted desktops, server blade computing, application and OS streaming to local devices, and bare-metal hypervisors. "Using bare-metal machines, someone can come in and steal [one] and we don't care. It's a throwaway machine and no data resides on it," Fennell said.
Desktop virtualization saves Scottsdale Community College about $250,000 a year as a result of its being able to pool hardware and software resources, extend the life of hardware, and decrease support -- and reduce IT headcount as well, Fennell said. Of the money saved, a portion -- $50,000 -- is being redirected to fund IT innovation grants for ideas culled from the college's faculty and staff.
The savings and improved service to students prompted the rest of Maricopa County's community college system to adopt Scottsdale's desktop virtualization strategy. This project will cover the needs of about 260,000 students annually and 12,000 employees. In the meantime, Scottsdale Community College continues to reap the rewards of desktop virtualization.
"We gained a competitive advantage over the 15 other colleges in the area competing for students, because [students] no longer have to make an educational decision based on the age of hardware or the software they can afford. We provide free access to the software they need, to any type of device," Fennell said.
Let us know what you think about the story; email Christina Torode, News Director.