Server Farming

Dec 10 2010   10:22PM GMT

Gartner: rookie virtualization mistakes to avoid

Alex Barrett

What are the most common mistakes IT managers make when doing a server virtualization project? There are many, and they’re easily avoidable, said a gaggle of Gartner analysts at the Gartner Data Center Conference in Las Vegas this week.

Most virtualization goofs center on a lack of planning and forethought, they said.

For example, on the storage front, perhaps the biggest mistake is to start a virtualization project without already having a SAN or shared storage in place, said Robert Passmore, a Gartner research vice president. “They get 75% of the way through a virtualization project and they realize they need a SAN, but by that point they don’t have the budget,” he said. “It’s a real disaster.”

Similarly, most virtualization newbies don’t anticipate the increase in demand that virtualization will place on their systems, nor the speed at which growth will occur.

“Virtualization introduces speed, and most processes aren’t ready for speed,” said Tom Bittman, another Gartner research VP. And because virtualization tends to remove artificial obstacles to provisioning new workloads, demand for services tends to double in highly virtualized environments, he said. To avoid being overwhelmed, newcomers to virtualization should consider implementing lifecycle management, as well as chargeback or showback.

Other rookie mistakes involve failing to treat the virtualization layer with enough gravitas.

In their enthusiasm for VMware, many IT managers make the mistake of moving too quickly to the latest and greatest release of the platform, said Passmore, often before ecosystem components such as backup and management software are available.

And security of the virtualization layer is often overlooked, added Neil MacDonald, vice president and distinguished analyst. “These issues are overlooked because people say ‘nothing’s different,’ when really, a lot is different,” he said.

To avoid security problems, MacDonald suggested IT managers elevate virtualization to the same layer as the operating system. “Treat it like an OS,” he said, with all the attendant hardening, patch management and compliance processes.

Finally, on the desktop front, an overarching mistake is failing to fundamentally rethink desktop support, said Mark Margevicius, vice president and research director. When it comes to the desktop, most organizations are built around a distributed device, he said, but “everything changes when you virtualize the desktop” — support processes, refresh cycles and budget allocation, to name a few. Without thinking desktop virtualization through up front, “you almost have more problems than the problems you solved with it.”
