
Post-PC era: Is 100% remote desktop virtualization even smart?

Last August, I sat at VMworld 2011 with 19,000 other IT literati and heard CEO Paul Maritz assure us that we were in the post-PC era. But are we really?

Certainly there isn’t much argument that remote desktop virtualization increases worker productivity by giving employees access across multiple platforms. Furthermore, desktop virtualization reduces company costs and upkeep. CIOs are definitely paying attention to these benefits.

The main problem with this description is that it discounts the PC entirely and smacks of absolutism. CIOs live in the real world, and every business has differing use cases. With judicious application, it’s possible to reap the benefits of virtualization without damaging the productivity of some workers, especially around specialty-use cases and ergonomics.

Specialty-use cases come up when you have highly specialized software, as engineering or science professionals do. These highly technical workers have applications that simply don’t work, or work poorly, in a virtualized desktop environment. An overzealous remote virtual desktop rollout that aims for 100% coverage creates exactly that kind of absolute, one that works against the company’s overall productivity. Crippling your engineers in order to fulfill an unrealistic goal? Not good for the business. CIOs need to remember that achieving 90% to 98% coverage is fantastic, and that those stubborn last percentage points represent desktop users who might actually be better off left alone, post-PC era or not.

Smart CIOs are mindful of the physical well-being of their workforce. Everybody loves a lighter, thinner laptop, and everybody loves a sleek tablet device. However, for workers who are not highly mobile, smaller screens and slick, tiny keyboards are a recipe for eyestrain, backaches and carpal tunnel syndrome. CIOs need to make sure that the IT departments they run are aware of use cases and are not making one-size-fits-all decisions. Certainly, use cases cannot be allowed to get too complicated, but a few standard configurations based on role will make employees more productive and prevent long-term problems for their health and happiness.

What do you think? Is this really the “post-PC era”? Is 100% remote desktop virtualization adoption even feasible? Is it pointless to cut the cord for each and every worker? Is it possible to extract some benefits of virtualization without going all the way? I’d love to hear your thoughts in the comments.

Join the conversation



I don't think VMware has ever said that 100% desktop virtualization should be the goal. After several years of "this is the year of VDI," it's pretty clear now that desktop virtualization is never going to take off in the same way server virtualization did. VMware's "post-PC era" catchphrase really just refers to decreasing reliance on PCs in the business world, which is happening for a number of reasons far beyond desktop virtualization, such as cloud computing, Web and mobile apps, smartphones and tablets and even social networking. The PC and the traditional enterprise software it runs will still be important, but it's not the only way to do things anymore. I think that's what VMware truly means by the "post-PC era."
Wendy, I totally agree with you. There were nearly 300 million desktop and laptop PCs sold in 2011, and Forrester is predicting that more than 2 billion PCs will be in use worldwide by 2015. Sure, tablets and smartphones are all the rage, but just like television didn't kill the radio, these new mobile devices and PCs will co-exist and support their appropriate use cases. What isn't a stretch (at least, I don't think so) is that we're entering the post-native-PC era. Desktop virtualization, like server virtualization, simply makes too much business sense. It provides better endpoint security and manageability for IT, and end users gain flexibility and avoid costly downtime (due to viruses, malware and other PC issues). I doubt many CIOs -- particularly at this point in time -- are striving for 100% remote desktop virtualization adoption. Those that are subscribe to Henry Ford's view on color choice for consumers. Finally, the way to extract the benefits of virtualization without going all the way is to: a) avoid the costly server-centric approach to desktop virtualization (aka VDI) and b) ditch the thought of having to replace PCs with thin clients. Intelligent desktop virtualization (IDV) is gaining traction in the market because it involves local execution and centralized image management...using desktops and laptops organizations have already invested in (or must continue purchasing due to certain use cases).
Not all IT realms work well in a virtual environment currently. For instance, in the backup realm, virtualizing a backup server can be troublesome. A better way to use a virtual-only backup infrastructure might be to run multiple virtual backup servers, each backing up different machines on the network. I have not seen this in production, but hypothetically, a coordinated set of virtual backup machines backing up multiple machines (physical and virtual) and then storing that data in a coordinated system might be an answer to the scalability problems of traditional backup in a virtual environment. To bring the point back to this topic, I think the virtual environment becomes a matter of recognizing the limitations of virtualization and then finding a way to scale within those limitations, as there are certainly benefits to having a virtual-only infrastructure. Full disclosure: I have a tech role at backup software vendor Arkeia Software.