If two technology trends were ever made for each other, at least in vendor marketing materials and generically simple diagrams of IT infrastructure, they are the consumerization of IT and desktop virtualization.
Analysts who study desktop virtualization say many of its use cases fit neatly into problem areas that their client companies face, such as the consumerization of IT. End-users who insist on using nonstandard or unapproved computing devices, such as tablets and iOS or Android smartphones, make demands on the IT department, the remote-access infrastructure and the IT budget, according to Ian Song, research analyst at IDC. When the same user wants to use two, three, or four computing devices for different reasons, the situation can quickly get out of hand.
"You're not going to give everyone two or three computers or try to set up your applications and infrastructure to support every device everywhere, no matter what your resources," Song says.
The clearest solution is to create a virtual desktop that runs on a server in the data center but that can be launched, viewed, and used as easily by an end-user at a traditional computer in the office as by a worker logging in from a PC in a public kiosk or a smartphone connected via open Wi-Fi.
That setup -- a full-blown virtual desktop infrastructure (VDI) implementation -- is becoming far more common but will probably never make up the majority of virtual desktops, let alone outnumber traditional physical desktops, according to Chris Wolf, a research vice president at Gartner.
It's by far the most expensive form of virtual desktop, especially compared to streamed or Web-based applications that can be used from tablets, smartphones, or traditional devices that support VPNs or other encrypted connections, Wolf says.
"People tend to talk about desktop virtualization as if it's one solution, or even a set of solutions, but there has always been a range of implementations," he says.
Traditionally, virtual desktops consisted of dumb-terminal, shared-server/shared-application setups used in call centers, banks, and other transaction-heavy environments in which employees work in shifts and several may use the same machine on the same day.
That's still the most popular implementation and the least expensive. Other implementations let IT match the functions required by the user with the complexity and cost IT can afford, Wolf says.
Some users might stream to the desktop one application they use only occasionally; others may be set up to choose several applications to be streamed from an internal corporate "app store" or even work full-time on a "desktop" that is actually a virtual machine running on a data center server -- which requires the resource-intensive VDI server setup.
"A certain percentage of people -- maybe 20 percent -- will be appropriate for full VDI, where most others will use either mostly streaming apps or stream the apps and OS onto their own laptops, and some will just have traditional installed apps."
It's not hard to put a Web front end on an application and make it available through an internal server to users on a range of client hardware, but that's not the most secure or manageable arrangement, according to James Staten, vice president and principal analyst at Forrester Research.
Applications are more secure and easier to control and support if IT is able to put a hypervisor on every device an end-user wants to use, Staten says. Not every device needs a hypervisor designed expressly for its hardware or operating system -- which Citrix sells and VMware is developing -- but native hypervisors perform at much higher levels than those installed later, no matter which vendor makes them.
Not only is the connection more secure than with streamed apps or plain SSL, but the hypervisor lets IT create a whole environment in which it can apply the same security, applications, and policies it does on a company-owned computer that never leaves the building.
"That makes it a lot easier to enforce policies on antivirus and security updates and keeps anything you might install on the 'home' part of a device from sneaking over into the 'work' section and corrupting drivers or transmitting a virus," Song says. "Just because of the number of end-users and client devices, there's a real scale issue especially compared to virtual servers."
Trying to scale virtual infrastructures to keep up with virtualized, consumerized hardware is a nightmare for IT operations people, who are often already struggling to move past the barriers and plateaus many companies hit during large-scale physical-to-virtual server migrations, Staten says.
"When you're doing a lot of P2V, server virtualization looks like a massive cost savings. But when you get past the point where you're taking out a lot of hardware and you start to see a lot of proliferating VMs and you start consuming a lot of virtualization, it can look like the costs and sprawl are out of control," Staten says.
Virtual desktop costs are far greater than virtual server costs, simply because there are so many more desktop machines, Staten says. Even among a fully virtualized workforce, every employee needs some type of hardware at the desk, which can range from a normal PC to a zero-client terminal from Wyse, Pano Logic, or other thin-client hardware vendors.
Every additional user means more load on the data center for authentication, storage, and most expensively, to run the virtual machines, virtual applications, and streaming services that are actually running the apps, Staten says.
Demand is even greater for companies with lots of power users who run particularly resource-intensive applications that require a high level of security along with data-center-grade availability and backup, Song says.
IT can mitigate the growth in cost by giving different users different amounts of virtual "power," which translates into space and computing-resource use in the data center. They can cut data center costs drastically by using hypervisors to create two separate virtual environments on every laptop: one available to the worker for personal use, the other for work that requires a secure login and keeps the virtual machine dedicated to the "work" environment clean of viruses, driver conflicts, and other problems unofficial use can introduce, Song says.
In that case workers do use a virtual desktop, but it lives on their physical laptop or smartphone, rather than on a back-end server.
While Citrix sells native hypervisors for most smartphones and VMware for one or two, that two-phones-in-one approach is growing only slowly because of the cost and complexity of maintaining virtual machines on many types of devices within the same workforce, Song says.
The sheer variety of machines is a problem because it is so important for the hypervisor to be as close to the processor and under as many layers of system software as possible, Song says.
Type II hypervisors run on top of the operating system, even on resource-constrained devices like smartphones, eroding performance.
Native "bare metal" Type I hypervisors would be better, but aren't available yet, Song says. Citrix is due to ship its version of native hypervisor code this summer. VMware has announced it is working on bare-metal hypervisors, probably for release early next year.
"More than anything else the thing that can kill a project or save it is hardware compatibility. IT can't go tweak all its templates and rewrite all its software to support 100 different form factors," he says. "Expanding the HCL [hardware compatibility list] is the key to a lot of this. It doesn't solve all the problems, but it gets you past the first bunch of them."
This story, "How desktop virtualization can help IT manage consumer devices" was originally published by CIO.