Posted by: mschlack
Application virtualization, Desktop virtualization, Microsoft, Microsoft Virtual Server, Servers
I’ve recently been through a number of operating system upgrades on the small network I maintain to learn about new technologies, and the experience has really made me hunger for an all-virtual future. I just installed Windows Server 2008 in a VM (VMware Server), and contrasting that with the pain of upgrading several machines to Vista, let me just say: I have seen the future, and it’s vastly superior to the present.
The ideal scenario, IMO:
Every machine ships with a virtualization layer, or at least that’s what you install on bare metal.
Every operating system comes as a VM, in a file, with a script to customize it as needed.
Every application comes in a virtual application machine — I’ll call it a virtual application component (VAC) to avoid confusion. Adding it into a VM CANNOT kill the base OS or paralyze other apps in the odd ways that apps sometimes do.
Every OS or app installation is instantly reversible and recoverable.
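To make the VAC idea concrete, here’s what a manifest for one might look like. Every name and field below is invented; it’s just a sketch of the metadata such a component would need to carry so the virtualization layer could wire it up and tear it down cleanly:

```
# Hypothetical VAC manifest -- all names and fields here are invented
name: office-suite
version: 2.1
base-os: windows-nt6          # declared compatibility, not a hard binding
disk: office-suite.vhd        # the app's private, snapshot-able disk image
integration:
  shortcuts: [winword.exe, excel.exe]
  file-types: [.doc, .xls]
rollback: automatic           # uninstall = detach and delete the image
```

The rollback field is the reversibility promise in point four: because the app never writes into the base OS, removing it is just detaching a file.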
We’re not very far from that, eh? You can almost see Nirvana on the horizon. I’m sure I’m missing some key ingredient that will stop it all from happening (greed, maybe?). What do you think? Can we get there?
It does make me wonder what Microsoft’s Hyper-V strategy will be. Will Hyper-V be a candidate for that bare-metal layer? Or will that role go to ESX or Xen? Not that there’s anything wrong with them, though ESX is a bit much for desktops. And will Microsoft, in a short-sighted attempt to make sure you buy at least one copy of WS08, force you to install Hyper-V and the OS together?
I say short-sighted because, among other reasons, if today’s computers had a Hyper-V layer, Vista would be having a different life, or at least the possibility of one. There are many reasons why Vista hasn’t taken off, but at least one is the difficulty of upgrading. Here are some problems I’ve experienced:
1. If you had partitioned your hard drive for XP and created a boot partition with, say, 20GB, you couldn’t directly upgrade to Vista. Vista wants something like 19GB of free space to do an upgrade even though it doesn’t take up nearly that much space. If you had unallocated space on the drive, you probably couldn’t expand the boot volume into it, because for reasons I don’t understand, most XP boot volumes can’t be grown. Nor does the Vista install routine have any special mojo to accomplish that. So you had to blow away XP and do a clean install, and then reinstall all your apps. Such fun!
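For what it’s worth, the obvious workaround, diskpart, is exactly what fails here. A script like the one below (fed to diskpart with `diskpart /s extend.txt`) will grow an ordinary data volume into adjacent free space, but XP’s diskpart refuses to extend the system/boot volume:

```
rem extend.txt -- run with: diskpart /s extend.txt
list volume
rem pick the volume adjacent to the unallocated space
select volume 1
rem works for data volumes; on XP this is rejected for the system/boot volume
extend
```

So the free space sits there, unusable, while the Vista installer demands room the boot volume can’t give it.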
2. Then there are the strange upgrade policies for Vista itself. I admit I haven’t tracked down all the ramifications of enterprise licensing, but at the retail level it sure is screwy. Say you have a machine with an OEM copy of a lower edition of Vista and you discover you need some of the features of Ultimate (oh yeah, I can’t live without BitLocker). You might think that buying the retail version of Ultimate would let you upgrade. Not a chance! Only a special upgrade version will do that; the full retail version can only be used for a clean install, once again forcing a reinstall of all your apps. Who thought that was a good idea?
In both cases, wouldn’t it have been so much nicer to snapshot your VACs to a network drive, install the new VM appliance (assigning it disk resources from anywhere), reconnect your VACs, and go? Maybe in the near future your VACs would live on the network, public or private, and the distinction between locally and remotely hosted apps would be somewhat transparent to the user.
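To picture that workflow with tools that exist today: VMware’s vmrun utility already covers most of the pieces. Treat the following as pseudocode rather than a working script; the paths and snapshot names are hypothetical, and vmrun’s exact flags vary by product and version:

```
# snapshot the machine before touching anything (paths are hypothetical)
vmrun snapshot /vms/desktop/desktop.vmx pre-upgrade
# park a copy of the whole VM directory on the network share
cp -r /vms/desktop /net/backups/desktop-pre-upgrade
# bring up the new OS appliance alongside it
vmrun start /vms/new-os/new-os.vmx
# and if the upgrade goes sideways, undoing it is one command
vmrun revertToSnapshot /vms/desktop/desktop.vmx pre-upgrade
```

That’s the whole pitch: the upgrade becomes a file operation, and the rollback is free.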
Yes, I know there are all these nasty little issues of hardware compatibility and so on. And then there’s the occasional “change in the driver model” that renders half your hardware obsolete. But if these problems are sorted and isolated into the right layers, I have to believe life will be easier. Call me a cockeyed optimist.