Virtualization giant VMware closed out 2010 with a bang, reporting a 37.4 percent increase in revenue, to $835.7M, and a 212.5 percent increase in net income, to $119.9M.
Desktop products such as VMware’s View and the Workstation hypervisor accounted for 10 percent of full-year revenue in the United States, a possibly deceptive figure given the prevailing hope that 2011 will be a big year for desktop virtualization, particularly virtual desktop infrastructure. VMware’s chief financial officer, Mark Peek, didn’t make any predictions in that arena, but TechTarget’s own virtual desktop experts Brian Madden and Gabe Knuth took a stab at foretelling desktop virtualization trends in the coming year, marking VMware’s Horizon as a major competitor to Citrix’s OpenCloud Access.
VMware as a whole is projecting modest growth in 2011, with Q1 revenue expected between $800M and $820M. Acquisitions were a major part of VMware’s 2010 strategy: seven in total, adding upwards of 600 employees. VMware finished the year with around 9,000 employees worldwide, up from 7,000 at the start of the year.
With pre-installation of vSphere on servers from big name hardware makers such as HP, Dell, and IBM, VMware is in good standing for a strong 2011.
What are your predictions for desktop virtualization in 2011? Do you think VMware will continue to reap the benefits?
Today’s guest post comes from Michael Fox, author of DeMystifying the Virtual Desktop. Michael Fox is a practicing expert working in the field of desktop virtualization. He has worked in IT for more than a decade, holds numerous technical certifications, speaks regularly at technical conferences and currently works as a Senior Architect in virtualization for EMC Consulting, the professional services division of EMC.
My hope for 2011 is that I have fewer meetings introducing people to the basics of desktop virtualization and more meetings about advanced topics in the field. I want to have more meetings on the kinds of desktop virtualization topics that allow us consultants to flex our muscles, bringing the considerable weight of our practice’s talent to solve an organization’s challenges. I want to have CIOs ask me how we can baseline the existing compute experience and then, using desktop virtualization, improve productivity for employee workflows. I want to have VPs considering acquisitions asking me questions about how desktop virtualization can accelerate the time it takes to complete integration.
If everyone read my book, that might happen. I wrote DeMystifying the Virtual Desktop to be a resource that gives decision makers an introduction to the use and application of the technology. It covers the basics of desktop virtualization like putting the user first, communicating across teams, understanding foundational technologies, doing assessments and analyzing designs to help understand costs. It was peer-reviewed and written in a completely agnostic way that does not mention any specific technology or vendor.
According to several studies, many organizations have only just begun looking at this powerful technology. The good news for those of you just beginning your virtual desktop journey is that the products and reference architectures are getting better all the time. Density inside the data center is increasing, and the competition between platforms is quite healthy. There were so many improvements in 2010 that they are difficult to count. Protocol improvements enabled better access to virtual desktops inside the data center. A number of new features, both hardware and software, are eliminating the storage challenges of centralized deployments. Lastly, the promise of the client-side hypervisor, with its centralized administration and decentralized execution, is beginning to come true.
Watch for a number of exciting new desktop virtualization developments in the coming quarters. I can’t say anything specific because of those pesky non-disclosure agreements, but there are a few real gems out there that I am really looking forward to bringing up with customers, the kind of developments that enhance the usability and value of desktop virtualization technologies for both end users and administrators. Density will continue to do what it does best: increase. Application performance will continue to improve significantly; architectures that support sub-second application startups, for example, will become more and more common. There is even progress on desktop certifications: AppSense has two new certification tests coming to a testing center near you in February.
What we all know for certain is that 2011 will be a year of conquering virtual desktop implementation challenges. Enterprises that implemented the technology early on will be considering ways to take advantage of new storage and application virtualization techniques. Companies performing Windows 7 migrations will continue to consider desktop virtualization technologies at the same time. A change from 2010 will be the slew of tablets invading our enterprises. This may in fact be a boon to the entire desktop virtualization space: the tablet and the virtual desktop may be a match made in heaven, giving users the ability to access any kind of application, or an entire corporate desktop, from these convenient devices.
For a chance to win a signed copy of Michael Fox’s book, share your own desktop virtualization wishlist for 2011 in the forums!
In the hubbub of new year’s predictions, Security Week expects virtualization to take a step forward in 2011 despite general security concerns:
Many companies have cited security concerns as the main blocker to virtualization and private cloud adoption. Paradoxically, virtual machines can be more secure than the physical servers they replace. Because virtual machines are purpose-built, virtualization security software can offer levels of dynamic and automated security that are unequalled in the physical security realm. As organizations become more familiar with hypervisor-based security and VM Introspection, the apprehension that may have stymied virtualization of critical workloads will be appeased.
Rather than having numerous endpoint devices that increase the risk posed by a rogue user or machine, aspects of desktop virtualization such as zero clients allow for a major decrease in the threat of human error. As security fears surrounding the technology ease, IDC analysts predict revenue of $1.7 billion by 2011 in the desktop virtualization market. Gartner, similarly, predicts the number of virtualized desktops will grow from 500,000 to 49 million by 2013. Now that desktop virtualization is becoming a more accepted technology, you should know your options by asking yourself some questions.
A while back, Michael Morisy posed the question: Are iPads desktop virtualization’s greatest threat? When we posed the same question to the Cloud Computing, VMware, Virtualization and Enterprise 2.0 Group on LinkedIn, we got a slew of great responses. Here’s what we learned is going on in the minds of IT decision-makers when it comes to the future of virtualization:
- It’s not necessarily an either/or scenario. Because tablet and mobile computing are still in their infancy stages, the enterprise should take this opportunity to prepare its mobile strategy now.
- With many vendors offering access to VDI via RDP, VNC, or ICA, these services should be seen as complementary rather than competitive.
- Tablets are more of a threat to PCs, laptops, and the Windows OS than to the practice and development of virtualization technology. On the other hand, without aggressive development of virtualization-friendly hardware, the status quo will hold for a while, with the PC industry remaining healthy for the time being.
- One member of the group with experience in IT management said that the “security concern” is another major reason tablets are complementary to VDI rather than competitive. He posed the question: “Are companies prepared to accept VDI as a replacement for traditional desktop computing (regardless of client access device) and does the value add of VDI equate to real ROI?”
- A VMware technical trainer in the group cares about the application doing the work for him (e.g. email, documents), not the device running the app. His ideal endpoint is one that allows access to those applications without worrying about the in-betweens.
One topic that got a lot of attention in the discussion was phone virtualization. Will this emerging technology improve or detract from existing enterprise mobile policies? Perhaps the evolution in the device market will push innovation and customization in the desktop virtualization market as well. Either way, it seems IT pros are looking forward to the changes coming in 2011.
How do you think tablets will affect desktop virtualization? Let us know in the comments section or send me an email.
QR codes are finally coming to America with a caffeinated jolt, thanks to Starbucks’ new mobile payment system that lets you scan and pay for your drinks with an iPhone or BlackBerry pre-loaded with Starbucks Rewards account information.
And with Starbucks’ incredible brand loyalty stats, the program has a huge opportunity for success. As the Seattle Times reports:
One in five Starbucks transactions is now made with the store cards, and mobile payments “will extend the way our customers experience and use their Starbucks Card,” Brady Brewer, vice president of card and brand loyalty, said in a release. “With mobile payment, the Starbucks Card platform further elevates the customer experience by delivering convenience, rewarding loyalty and continuing to build an emotional connection with our customers.”
But as Starbucks paves the way into a brave new world (for the US, at least) of QR payments, I get the sinking feeling that we’re bound to run into a “teachable moment” security lapse very soon.
We received a great response to our call to the community for advice on ensuring that a blizzard doesn’t become a perfect storm for your IT department. New member KFaganJr used the question as an opportunity to document everything his department did to prepare, producing a fantastic guide for others facing a similar problem:
Most of the tasks were verification that existing systems were fully functional before the storm hit. These included ensuring that all off-site backups ran successfully, that a live copy of any important documentation was hosted off-site and up to date, and that temperature sensors and reporting tools were functional and tweaked to allow more time for action, given the extra travel time needed.
Being a small department we were able to just discuss things such as who can be where and do what if a major problem hits, but in a larger organization I would have documented it.
It’s important to walk through the process of redirecting traffic if a location goes down. We have four main sites that usually funnel all traffic through the main office; if one goes down, that traffic has to be sent elsewhere to keep everything functional. Things to note: how much time do you have to complete the job if need be? Is the alternate connection you rely on functioning properly? Are touch services needed to make the switch?
Also, for anyone working with end users: employees with passwords about to expire were sent additional emails to prevent extra work later, reminders of phone system features that would help users work from home seamlessly were sent out, and users were reminded to speak to IT before the storm hit if they had VPN issues. Additional laptops were available to lend out to critical personnel.
The most important task, in my mind, is making sure everything is running smoothly before the storm. You don’t want to be so focused on preparing for disaster that you neglect the critical server that has been crashing, have an issue with that server when touch services aren’t available, and watch all your other disaster planning come to nothing because of an unrelated issue.
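The pre-storm verifications KFaganJr describes lend themselves to a simple automated checklist. A minimal sketch of that idea, where every path, host, and threshold is a hypothetical placeholder rather than anything from his actual setup:

```python
import os
import time

def backup_is_fresh(path, max_age_hours=24):
    """Verify the latest off-site backup file exists and was written recently."""
    if not os.path.exists(path):
        return False
    age_seconds = time.time() - os.path.getmtime(path)
    return age_seconds <= max_age_hours * 3600

def temperature_ok(reading_celsius, limit=27.0):
    """Flag high server-room temperatures early: extra travel time in a
    storm means less time to react once an alert fires."""
    return reading_celsius <= limit

def run_checks(checks):
    """Return the names of any checks that need attention before the storm."""
    return [name for name, passed in checks if not passed]

if __name__ == "__main__":
    checks = [
        ("off-site backup", backup_is_fresh("/backups/offsite/latest.tar.gz")),
        ("server-room temperature", temperature_ok(24.5)),
    ]
    failures = run_checks(checks)
    print("All clear" if not failures else "Needs attention: " + ", ".join(failures))
```

Running something like this daily, and eyeballing the result before a storm is forecast, matches his point that the real work is verifying existing systems rather than inventing new ones.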
Spadasoe suggested that while virtualization could help absorb some of the effects of a sudden jump in remote users, “Our fabric is our fabric so we are limited in what resources we can add.” Planning ahead and mapping out worst-case scenarios were his best tips for staying ahead of outages.
MrDenny shared that view, adding that using multiple sites and duplicated data was a worthwhile investment:
Having user data replicated between sites means that when users VPN into another site because the network link at their office is down, they can still access their data. The setup and recurring costs for a little extra bandwidth are minimal compared to the loss of work from one bad 2+ day snow/ice storm.
One commenting wag suggested simply going with the flow, “cutting Internet access to the office? Then blame it on the provider.” While we can’t officially endorse that approach, we can understand the temptation. Any more tips? Leave them in the comments below or, better yet, add them to the community forum!
In an interview over at SearchVirtualDesktop.com, virtualization expert Mike Nelson highlighted some of the top stumbling blocks for new VDI deployments. Among them: not understanding or being in tune with your users (i.e. not fully understanding what your users do); lack of application functionality in virtualized environments; and failure to allocate resources and invest in planning. Brian Madden agrees.
Just like with any new technology or infrastructure, there are bound to be stumbling blocks, and desktop virtualization isn’t something users want to jump into head-first. Enter VDI-in-a-box.
And if you actually look at what people are saying, it gets worse.
**Warning: Disturbing images ahead for the security conscious.**
So what’s an IT department to do? Well, for one thing, prepare early and often!
- Negotiate license agreements so you can handle occasional “spikes” in the use of remote-access software, like VPNs or web clients.
- Send out e-mail reminders to staff that, on days when you’re likely to have a large jump in remote workers, resources will be strained, and outline strategies workers can use to minimize their impact.
- Have a recovery strategy in place, with a timeline of how long it will take to bring critical and non-critical systems back online after a local or regional power outage.
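The last point, a recovery timeline for critical and non-critical systems, can be sketched as a small script. Everything here is hypothetical (the system names, tiers, and restore-time estimates are placeholders for illustration, not recommendations):

```python
# Hypothetical recovery plan: order systems by criticality tier, then by
# restore time, and estimate cumulative time-to-restore after an outage,
# assuming systems are brought back one at a time.

SYSTEMS = [
    # (name, criticality tier [1 = most critical], estimated restore minutes)
    ("authentication/VPN", 1, 30),
    ("email", 1, 45),
    ("file shares", 2, 60),
    ("intranet wiki", 3, 20),
]

def recovery_timeline(systems):
    """Return (name, minutes from start of recovery) in restore order."""
    ordered = sorted(systems, key=lambda s: (s[1], s[2]))
    timeline, elapsed = [], 0
    for name, _tier, minutes in ordered:
        elapsed += minutes
        timeline.append((name, elapsed))
    return timeline

for name, eta in recovery_timeline(SYSTEMS):
    print(f"{name}: back online ~{eta} min after recovery begins")
```

Even a toy model like this makes the planning conversation concrete: if the cumulative total for the critical tier exceeds what the business can tolerate, you know before the outage that you need parallel recovery paths or standby capacity.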
Get your daily, 140-character dose of desktop virtualization with these evangelists, practitioners, and experts alike, compiled below as well as in our Virtualization Pros Twitter list. Don’t see your favorite name on the list? Add it in the comments or send me an email at Melanie@ITKnowledgeExchange.com! Not a Twitter person? No problem! Check out our list of top desktop virtualization blogs or SearchVirtualDesktop.com for meatier doses of the information you need.
When it comes to virtualization, be sure to read the fine print: Licenses can be surprisingly restrictive, even from vendors who are otherwise on the vanguard of virtualization. Take featured desktop virtualization blogger Brian Madden’s explanation of Microsoft’s licensing rules:
VECD stands for “Virtual Enterprise Centralized Desktop.” It’s the license that Microsoft requires to use its desktop virtualization. VECD must be purchased in addition to the base Windows operating system license. So if you want to virtualize Windows, you have to buy this VECD license as a second license. If you don’t like it — too bad. Don’t use Windows then. (Ah, the joys of a monopoly.)
And it gets worse, because VECD is a subscription license, not a perpetual one, and signing up for VECD generally requires Microsoft’s annual Software Assurance program. As if things weren’t confounding enough, the “V” in VECD used to stand for “Vista,” and it is documented as such in much of Microsoft’s documentation.