What are the most common mistakes IT managers make when doing a server virtualization project? There are many, and they’re easily avoidable, said a gaggle of Gartner analysts at the Gartner Data Center Conference in Las Vegas this week.
Most virtualization goofs center on a lack of planning and forethought, they said.
For example, on the storage front, perhaps the biggest mistake is to start a virtualization project without already having a SAN or shared storage in place, said Robert Passmore, a Gartner research vice president. “They get 75% of the way through a virtualization project and they realize they need a SAN, but by that point they don’t have the budget,” he said. “It’s a real disaster.”
Similarly, most virtualization newbies don’t anticipate the increase in demand that virtualization will place on their systems, nor the speed at which growth will occur.
“Virtualization introduces speed, and most processes aren’t ready for speed,” said Tom Bittman, another Gartner research VP. And because virtualization tends to remove artificial obstacles to provisioning new workloads, demand for services tends to double in highly virtualized environments, he said. To avoid being overwhelmed, newcomers to virtualization should consider implementing lifecycle management, as well as chargeback or showback.
Other rookie mistakes involve failing to treat the virtualization layer with enough gravitas.
In their enthusiasm for VMware, many IT managers make the mistake of moving too quickly to the latest and greatest release of the platform, said Passmore, oftentimes before ecosystem products such as backup and management software are available.
And security of the virtualization layer is often overlooked, added Neil MacDonald, vice president and distinguished analyst. “These issues are overlooked because people say ‘nothing’s different,’ when really, a lot is different,” he said.
To avoid security problems, MacDonald suggested IT managers elevate virtualization to the same layer as the operating system. “Treat it like an OS,” he said, with all the attendant hardening, patch management and compliance processes.
Finally, on the desktop front, an overarching mistake is failing to fundamentally rethink desktop support, said Mark Margevicius, vice president and research director. When it comes to the desktop, most organizations are built around a distributed device, he said, but “everything changes when you virtualize the desktop” — support processes, refresh rates and budget allocation, to name a few. Without thinking desktop virtualization through up front, “you almost have more problems than the problems you solved with it.”