Mike Williams considered his virtualization project a success after consolidating 17 U.S. datacenters into three. But then the traffic jams started.
As CIO of the U.S. Defense Contract Management Agency (DCMA), which monitors work on military contracts, Williams had a problem on his hands. Consolidating all those datacenters without reconfiguring the WAN was like consolidating 17 cities into three without widening the freeways.
“It is important to make sure to optimize the WAN. We actually didn’t,” Williams says. “All of a sudden the speed of light is not so fast anymore.”
Williams’ story is one of many cautionary tales surrounding early virtualization efforts. Although virtualization promises cost-saving optimization of datacenter resources, the path to that payoff is littered with hazards.
Network configuration is not the only pitfall: software licensing, security, and systems management can all trip up a virtualization project, say industry experts and enterprises that have gone virtual. And people issues can be more troublesome than technical ones if the corporate culture resists virtualization.
When the city of Charlotte, N.C., began virtualization, some departments hesitated, says Philip Borneman, assistant director of information technology. “You’ll always have some early adopters and others who want to wait and see.”
Elsewhere, some IT fiefdoms simply won’t share. “I have heard of companies that have gotten a lot of pushback from departments who don’t want to give up their own hardware or applications,” says Charles King, principal analyst with Pund-IT, a technology analysis firm.
The organization chart can complicate other projects, adds Nick van der Zweep, vice president for virtualization at Hewlett-Packard. One unidentified insurance company, notes van der Zweep, maintained separate IT resources for group insurance, individual insurance, financial investments, and other departments. “When you decide to bring them together, you get turf wars,” van der Zweep says. “You’ve got to convince a lot of people.”
Virtualization also upends the software licensing model. Typically, software is licensed to run on a single server; having to license it for each of 50 virtual servers limits the potential cost savings, van der Zweep explains.
HP faced that problem when deploying BEA WebLogic software on 400 virtual servers. Its solution was a shared application server utility, a cluster of five server nodes. HP paid for five licenses even though each cluster feeds up to 60 virtual machines.
Some software companies are revising their licenses to accommodate virtualization; others withhold support entirely if their software runs in a virtual environment.
Virtualization presents security issues, too, says Michelle Bailey, an IDC research director.
If security software runs on a physical server but one of its virtual servers is migrated to another physical machine that lacks it, “that could be a problem,” says Bailey. “The security policy has to live somewhere else, such as on the network layer.”
Using the right management tool is critical to making virtualization work, she says, and maintaining security is just one of its functions.
Companies assigning virtual workloads to physical servers need to make sure those servers are properly configured, have up-to-date patches, and don’t harbor rogue software that could cause problems, says Erik Josowitz, vice president of product strategy at Surgient, a provider of virtualization for software development and testing.
The rush to virtualization poses a danger that some companies will practice the equivalent of “finding a server by the side of the road and plugging it in,” says Josowitz. “There will be some breach this year that will be a lesson to all,” he predicts.