Now that I’ve been living with Vista SP2 for three whole days, I’m getting some experience with the new environment on some production machines. I did encounter a situation where SP2 froze 48% of the way through Phase 1 of its three-phase installation (Phases 1 and 2 occur before the restart, Phase 3 afterward), but a reset on the machine caused it to start over with Phase 1, after which the entire remaining process completed successfully. Having made a complete backup before starting, and wondering if the admonition not to power off the PC while in process was as dire as stated, I was both surprised and pleased that the SP application proceeded and succeeded on a second try without my having to restore the backup and start over. Is it possible MS has improved its SP application tools?
At any rate, with the SP now in place I’m watching my systems closely for stability and reliability. I’ve also dug into the System and Application logs in Event Viewer to see if some of my chronic and persistent errors have been addressed by the new service pack. Without conducting a complete exhaustive analysis, my observation is that some have been addressed, while some have not.
Here are some details. Prior to the SP2 application, I had a decent-sized laundry list of recurring errors for which I could find no fix, but which also didn’t seem to adversely affect system stability and usability. Here’s a summary table for these items:
| Level | Source | Event ID | Description |
|---|---|---|---|
| Critical | DriverFrameworks-UserMode | 10110 | A UFD has a flaky driver |
| Warning | Servicing | 4374 | KB955430 not applicable to my system |
| Warning | Time Service | 36 | No synchronization occurred in last 24 hrs |
| Warning | Tcpip | 4226 | Limit of concurrent TCP connect attempts reached |
| Error | HttpEvent | 15016 | Unable to initialize Kerberos for server side authentication |
Of these items, I see some have disappeared, and others have morphed slightly (and more informatively) into altered forms. The time service error remains unchanged (but synchronization always works when I run it manually, so I’m not worried about it). The UFD error code persists, but now gets a new companion code, 10111, that explicitly identifies the offending device by name. Because the device always works when I plug it in, my workaround here is just to remove it whenever I finish using it. 4374 (update not applicable) has gone away completely, and 15016 (Kerberos not initializing) shows up only once (it used to appear daily). 4226 (TCP connect attempts) hasn’t shown up either, but this error usually occurs when I’m using FTP, and I haven’t done so since applying SP2. That means I give SP2 a 20-40% improvement score on those “pre-existing conditions.”
As you might expect, however, I also see some new recurring items in the Event Viewer that I didn’t see before applying SP2. I summarize these in the next table:
| Level | Source | Event ID | Description |
|---|---|---|---|
| Error | Service Control Manager | 7000 | Windows search service failed to start in timely fashion |
| Error | DistributedCOM | 10005 | Error 1053 when attempting to start WSearch |
| Error | BitLocker-Driver | 24620 | Volume information on N cannot be read |
The DCOM error is one I’ve seen before: it relates to Windows Search attempts to index items that are no longer present (hence the empty search target in the error message detail), and ties of course into error 7000 as well. Likewise, Volume N relates to the UFD with the driver problems. All of these are items I can live with, and maybe even do away with, if I can figure out my search target issue for Windows Search.
My final analysis on SP2 for chronic errors: “So far, so good!”
I’m helping to revise a book on Windows 7 right now. By well-known Windows mavens Brian Knittel and Bob Cowart, it’s to be called Windows 7 in Depth (Pearson, 2009, ISBN: 0789741997). In the course of writing the chapter on hard disks, formats, basic and dynamic volumes, and so forth, I was forcibly reminded that there are many different formats that work on USB Flash Drives, aka UFDs. And as is usual for good questions of all kinds, the answer to the inevitable question “Which format should I use on my UFD?” starts with the famous qualification phrase: “That depends…”
I’ll list the formats that work on UFDs in Windows Vista and Windows 7, and explain why you might use each one in the ensuing explanation (for more great information on this topic see GUIDE: FAT16, FAT32, NTFS, or exFAT on USB Flash Drives? in the forums at the excellent NoteBookReview.com Website):
- FAT16 (aka simply “FAT”; there was also a FAT12 format once upon a time, but it’s not supported in Vista or Windows 7, though I think these systems can probably still read it) offers the best overall cross-platform support for non-Windows or old Windows OSes, and also delivers the best overall performance. That said, it’s limited to 2 GB volume size (up to 4 GB on some OSes, if a 64 KB cluster size is used), with a maximum file size equal to maximum volume size minus file/directory overhead (usually no more than a few hundred bytes). FAT16 works best for UFDs of 2 GB and smaller.
- FAT32 also offers good cross-platform support, especially on non-Windows OSes (though it won’t work on older DOS versions or on Windows versions prior to Win95 OSR2), and is not subject to the 2/4 GB volume size limit (the Windows format tools top out at 32 GB, more than big enough for all but the largest UFDs on today’s market). It offers only moderate to slower overall performance, however, and supports a maximum file size of 4 GB. FAT32 works pretty well on all but the largest (over 32 GB) UFDs.
- NTFS offers relatively poor cross-platform support: it doesn’t work on non-NT-based Windows versions, and not at all on DOS (though the NTFSDOS utility from SysInternals can mitigate this to some extent, for reading the contents of NTFS volumes only). It’s very fast for single files, but doesn’t do as well as FAT16 or FAT32 for write activities involving multiple files. It does support ACL-based access controls and works with all kinds of encryption technologies, and is far more secure than FAT (Windows 7 supports a portable BitLocker implementation that lets you encrypt UFD contents for safe storage and transport). Unless you use the Optimize for Quick Removal option on an NTFS-formatted UFD, you risk losing data whenever you fail to use the “Safely Remove Hardware” applet to dismount it from your PC. NTFS or exFAT is required for UFDs larger than 32 GB, and both work well for those who want to use large UFDs for backups.
- exFAT, introduced with Vista SP1, is basically FAT64. It supports file and volume sizes of up to 2^64−1 bytes (16 exabytes), and raises the ceiling on maximum file entries per directory as well. Vista can’t use exFAT for ReadyBoost, but Windows 7 can. exFAT currently works only in Vista, Windows Server 2008, and Windows 7, and it also suffers from slow write speed on multiple files.
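To pull that “That depends…” answer together, here’s a minimal Python sketch of the decision logic in the list above. The thresholds come straight from the list; the function name and parameters are my own invention:

```python
def recommend_format(capacity_gb, largest_file_gb=0, needs_old_os=False):
    """Suggest a UFD file system using the rules of thumb above."""
    if needs_old_os and capacity_gb <= 2:
        return "FAT16"          # best cross-platform support, 2 GB volume limit
    if capacity_gb <= 32 and largest_file_gb < 4:
        return "FAT32"          # good compatibility, 4 GB max file size
    # Over 32 GB, or files of 4 GB and up: NTFS or exFAT
    return "NTFS or exFAT"

print(recommend_format(2, needs_old_os=True))   # FAT16
print(recommend_format(16))                     # FAT32
print(recommend_format(64, largest_file_gb=8))  # NTFS or exFAT
```

Of course, real-world choices also hinge on performance and security needs, which a three-line rule of thumb can’t capture.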
To format a UFD with any of these file systems, insert it into a Vista or Windows 7 PC, right-click the drive icon in Windows Explorer, then select the Format entry in the resulting pop-up menu to see a display something like this one (it shows the exFAT format selected in Windows Vista).
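You can do the same thing from the command line with the built-in format tool, using its /FS and /Q switches. Here’s a small Python sketch that just builds the command line (the drive letter is a placeholder; on Windows you’d run the result as an administrator, after triple-checking the drive letter):

```python
def build_format_command(drive_letter, fs="exFAT", quick=True):
    """Build a Windows format command line for a UFD.

    Sketch only: run the resulting command on Windows, as an
    administrator, and only against the RIGHT drive letter.
    """
    cmd = ["format", f"{drive_letter}:", f"/FS:{fs}"]
    if quick:
        cmd.append("/Q")   # quick format
    return cmd

cmd = build_format_command("N", fs="exFAT")
print(" ".join(cmd))  # format N: /FS:exFAT /Q
# On Windows, you would then hand this to the shell, e.g.:
#   subprocess.run(cmd, shell=True)
```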
In my frenzy to test and evaluate Vista SP2 for stability I built myself a test system around the slipstreamed version of Vista x64 SP2 RC available from MSDN for download to subscribers. And while my results were overall positive–FWIW, SP2 seems to add significantly to Vista stability and uptime–it appears that I’m now faced with a painful dilemma about when to scrub the machine and rebuild with the just-released production version.
So, off I go to find the expiration date for the RC. It turns out that whenever one exists, it appears in the “About Windows” information for any Windows release. I scratch my head a bit to figure out how to do this, then run winver at the command line: it does the trick nicely.
Upon seeing the details, I heave a sigh of relief for numerous reasons. First, there’s no compelling reason to act soon, with just under 11 months to go before this OS turns into a pumpkin. Second, there’s every hope that the commercial release of Windows 7 will be out before that date (and I may even be able to perform an in-place upgrade to x64 Windows 7 Professional from this version if the stars smile on me). Third, this being a test machine after all, circumstances (read: new work) may compel me to rebuild this machine for any number of reasons well before the expiration date arrives.
All this said, however, there is a moral to this story: When MS says you shouldn’t install an RC version on production gear, they’re not kidding around. You do have to be willing to bite the bullet sometime after installing one of these versions, and replace the install with something else (that is, a so-called “clean install” that wipes out all the work involved in setting up the machine with the RC and its accoutrements). I’m glad that my primary work on this machine consists of building various VMs to use for testing inside Virtual PC 2007. Presumably I need only copy those constituent files to a backup drive, rebuild the underlying system, then replace them in the “My Virtual Machines” folder to keep on using them no matter which version or kind of host OS is running to support them. More on this later, I’m sure…
Way back in February I blogged here about Microsoft Enterprise Desktop Virtualization, aka MED-V. In the past few weeks, Microsoft has announced that it will offer a free download to buyers of Windows 7 in the Business, Ultimate, and Enterprise versions called Windows 7 XP Mode. Essentially what this provides is a copy of Virtual PC with a Windows XP SP3 license and install image, so that users can easily build and add an XP-based virtual machine to their toolset, primarily as a platform for legacy applications that won’t work with Vista or Windows 7.
I now understand that XP Mode is a kind of do-it-yourself or roll-your-own version of what MED-V provides as an administrator-handled and centrally managed capability for businesses at all scales (though it started with an enterprise target specifically in mind). Microsoft Product Manager Ron Oelgiesser told me yesterday that “even businesses with 100 or 200 users who want to run virtualization” can benefit from MED-V technology. Simply put, it’s designed to allow trained IT professionals (administrators) to design, build, and maintain standard VM images, and to make delivering those images to end users as simple as opening a utility and picking a virtual machine by some readily intelligible name (for example “accounts payable” or “call center”). Behind the scenes, the admins are responsible for putting those VMs together, and for updating them as new drivers, updates, and other changes come down the road. Users simply load them and use them as needed, a technique for making good use of virtualization that’s just about as easy as it gets.
From the admin side, things aren’t too shabby either. Microsoft provides a QuickStart guide that shows admins how to put VMs together and test them to make sure they work as desired, then make them available for general access and use with the MED-V client components on end-user desktops. MED-V comes as part of the Microsoft Desktop Optimization Pack, which is available only to customers who sign up for Microsoft Software Assurance. Best of all, according to Oelgiesser, the incremental cost of adding a MED-V component to an existing Assurance subscription is “less than $10 per seat per year.” Considering that this includes an XP SP3 license on which to run legacy apps, as well as a nifty set of tools for packaging, distributing, and managing VMs, this is a fantastic value.
Thus, even though MS will be giving away the XP Mode components with higher-end Windows 7 licenses, I predict that MED-V will also enjoy considerable adoption and use, even from SMBs. Oelgiesser confirms that MS feels bullish on MED-V as well, and indicates that some adoption for Vista has already begun among existing Assurance program participants. Should be interesting to see how this all turns out, once Windows 7 goes commercial.
With Vista SP2 now available via MSDN and other restricted download venues (not to mention BitTorrent servers, which already offer access to all 36 languages that MS plans to support in its first public release of SP2), it’s time to start thinking about migrating to this latest version in your environments–or not, as the case may be. That means it’s high time to go off and grab your own copy for use in the test lab, to see if what it fixes meets or exceeds what it breaks. Notebook users, in particular, will benefit from this release if platform vendors haven’t already pushed Bluetooth and Wi-Fi updates to those machines through their own update/maintenance programs.
In any case, SP2 should become publicly available on demand through Windows Update, possibly before the end of May, and perhaps by the middle of that month. In my own testing with the beta, I found updated machines were significantly more stable than SP1 machines, particularly in environments where users are prone to installing and/or uninstalling programs on a regular basis.
Certainly, you’ll want to test SP2 for yourself in your production environments to see what’s what. You should also put GPOs in place to block SP2 installation by end users until such testing is complete, and send out e-mail to educate users on the risks attached to SP updates, because many of them will want to jump on this release for their home or family PCs. Be sure to tell them to back up their PCs completely before applying the SP, to avoid in-process errors that might leave a machine unbootable, or unable to roll back from SP2 to previous levels. You’ll also want to warn them about the potential for application conflicts or errors, particularly for any legacy software that may belong to your production environment.
Except for the potential for breaking some software, though, the prognosis for Vista SP2 seems pretty positive overall. Be sure to check it out for yourself at the earliest opportunity!
Yesterday, April 30, Microsoft announced the availability of Windows 7 Release Candidate (RC) and also the final version of Windows Vista SP2 for download at MSDN and other restricted access Microsoft developer and beta test sites. The world took notice, and started hammering Microsoft’s servers mercilessly and ceaselessly.
How can I say this? Because I’ve been working with the Microsoft File Transfer Manager for some time now to download ISOs for various OSes, all kinds of tools (WAIK, debuggers, devtools, and more), service packs, and other stuff. On most days, it’s not unusual for me to experience download speeds of 1.5 – 2.0 MBps (12-16 Mbps), and my normal speeds are usually in excess of 800 KBps (6.4 Mbps) at any and all times.
But that hasn’t been true since the news about these items hit the net yesterday. I’ve been lucky to get a maximum of 350 KBps (2.8 Mbps) on downloads, and values keep dipping to 150 – 200 KBps (1.2-1.6 Mbps) on a pretty regular basis. This translates into the kinds of download times reported in the following table.
| Download | Size | Time (h:mm) |
|---|---|---|
| Win 7 RC x86 | 2.357 GB | 4:37 |
| Win 7 RC x64 | 3.046 GB | 5:19 |
| Vista x86 SP2 final | 1.345 GB | 2:07 |
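Those times are simple arithmetic: size divided by transfer rate. Here’s a quick Python sanity check (the helper function is my own) of what the 2.357 GB x86 RC looks like at the low end of the rates I was seeing; actual rates fluctuated, which is why the table’s figures run a bit longer:

```python
def download_time(size_gb, speed_kbps):
    """Estimate transfer time for size_gb gigabytes at speed_kbps
    kilobytes per second; returns (hours, minutes)."""
    seconds = size_gb * 1_000_000_000 / (speed_kbps * 1_000)
    hours, rem = divmod(round(seconds), 3600)
    return hours, rem // 60

h, m = download_time(2.357, 150)   # the x86 RC at a congested ~150 KBps
print(f"{h}:{m:02d}")              # roughly 4:21
```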
What does this mean to those with access to MSDN or other restricted download pages on MS Web sites? If you can wait for the initial rush to die down, you’ll probably be a much happier camper. As for me, I couldn’t wait because I have to write about all of this stuff right away, starting today. Fortunately, I was able to queue this stuff up overnight before crashing last night (by which time I’d already downloaded the complete x64 Win7 RC anyway) and find the complete downloads waiting for me on disk when I got up this morning. For the time being, however, MSDN (and other restricted) downloads can only be described as “slow going!”
BitTorrent is turning into an unofficial prerelease mechanism for Microsoft software these days. It’s certainly become a nonpareil source for Windows 7 builds that’s right in synch with Microsoft’s own internal releases. Yesterday Vista SP2 was already available online in Torrent form hours before Microsoft’s own 6:49 PM Tuesday announcement “Windows Vista SP2 RTM” appeared in The Windows Blog. This release also includes a common installer and code base for both Windows Vista and Windows Server 2008. For details on the changes that come with SP2, check out the associated “Notable Changes” document on TechNet (also dated 4/28/09).
I’ve been working with the 64-bit Vista RC version of SP2 for nearly three weeks and so far, it’s been remarkably stable. If my own experience is any guide, Vista admins can look forward not just to some nice boosts to functionality (Feature Pack for Wireless, improved Wi-Fi performance after sleep mode resumption, improved RSS feeds sidebar, and built-in support for burning Blu-ray media) but also to some increased stability as well. Enterprise versions will also benefit from improved power management for both servers and desktops, as well as better backward compatibility for Terminal Server License keys.
It’s not yet clear whether Microsoft will indeed get the downloads posted before April 30 comes and goes, as they had originally promised. If not, given that the bits are already available and in circulation on BitTorrent, I’d have to guess that they won’t show up much later than early next week in any case. Based on recent precedent, this means that SP2 should become available on demand from Windows Update some time in late May or early June, and will become a mandatory download later this year. A tool to block the SP2 download should also become available very soon, if not at the same time as the SP2 download, then some time soon thereafter.
OK, so I’m climbing a learning curve with x64 Vista and Microsoft Virtual PC that is at times frustrating, at other times just plain weird, but always interesting and even sometimes moderately entertaining. I’m starting to get the hang of this whole Virtual Machine thing at long, long last and have learned some interesting lessons that may help those who have themselves yet to venture down this path.
The old aphorism: “When your only tool is a hammer, every problem looks like a nail” resonates with the first of my recent Lessons Learned with Virtual PC 2007. I’ve recently set up a baker’s rack in my office, and now have all of my test and experimental machines racked up next to my desk. Ordinarily, I use Remote Desktop Connection to access and work with those other machines from the comfort and convenience of my dual-screen-outfitted desktop. One of the first lessons I learned with Virtual PC 2007 is that the number of levels of indirection for mouse and keyboard when installing an OS into a new virtual machine is limited. That is, I actually have to use the mouse and screen on the Virtual PC 2007 host machine to install a guest OS onto that machine. I can open and load an existing VM via a remote session, but no joy in performing installation tasks. Good to know.
At this point, the biggest benefit to using VMs is that I can create a reference machine for some target environment, install all the patches and updates, add whatever other scaffolding I want (antivirus, antispyware, necessary apps, and so forth), then save that machine for re-use. I just need to remember to load that VM from time to time to update it, then save it again so it becomes my point of reference for continued/continuing reuse. I’ve also learned to be very specific in naming the virtual drives I create for such machines, so I can tell them apart, and to copy the “reference versions” (for later reuse) to another hard disk, so I can always get back to a pristine state by copying over the backup version from that drive to its primary location as needed.
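That copy-aside workflow is simple enough to script. Here’s a minimal Python sketch of the idea (the function names and the assumption that each Virtual PC machine is a .vmc config plus .vhd disk sharing a base name are mine):

```python
import shutil
from pathlib import Path

def backup_vm(name, vm_dir, backup_dir):
    """Copy a reference VM's files (e.g. name.vmc and name.vhd)
    from vm_dir to backup_dir, so a pristine copy is always on hand."""
    vm_dir, backup_dir = Path(vm_dir), Path(backup_dir)
    backup_dir.mkdir(parents=True, exist_ok=True)
    for f in vm_dir.glob(f"{name}.*"):
        shutil.copy2(f, backup_dir / f.name)

def restore_vm(name, vm_dir, backup_dir):
    """Overwrite the working copy with the pristine backup."""
    for f in Path(backup_dir).glob(f"{name}.*"):
        shutil.copy2(f, Path(vm_dir) / f.name)
```

With something like this, getting back to a known-good state after a messy software test is a single restore call rather than a reinstall.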
This approach makes it much easier and safer for me to install and test software to write about it, and then to rid myself entirely of it after the work is done. I’m still running only one VM at a time and figuring out how to make things work, while discovering a whole new set of virtues for shared or networked drives (they’re easily accessible to both host and guest systems, and thus provide a perfect means of file/information transfer between the two otherwise distinct and independent systems). As I learn more about how to make this environment stand up and bark, particularly while working with Windows 7 (I’ll be installing Build 7001 shortly) I’ll keep reporting back with new observations and lessons learned.
For a long time now — perhaps even too long, if recent experience is any indication — I’ve avoided x64 Vista on my production and test machines. Myths and rumors about lack of drivers, stability issues, software compatibility problems, and more, had dissuaded me from using the product on my production or test machines.
But no sooner did I find myself in a situation where 64-bit Vista was an absolute necessity for setting up a virtual machine host platform that could host both 32- and 64-bit test environments for a book on Windows 7 I’m working on, than I also learned that “news” to the contrary notwithstanding, 64-bit Vista is both workable and pretty robust. To jump to the end of the story before returning to the middle for more details, I’ve now installed 64-bit Vista on a couple of notebook PCs (each with 4 GB of RAM) and a desktop PC (with 8 GB) without too much difficulty and with very good results. To me the biggest thrill of all comes from seeing this kind of display in Task Manager on a machine with 8192 MB of RAM:
I don’t know where the “missing 2 MB” of RAM went, that being the difference between the amount installed (8192) and the amount showing here (8190), and I’m not sure I really care. What I do appreciate is access to nearly all the RAM (99.975%, in fact) that I installed in the machine.
Why did I do this? Because I had to be able to install both 32- and 64-bit versions of Windows 7 on a platform that would let me shoot screenshots of the installation process at work. Today, this means one of three approaches to obtaining the needed screencaps:
- Photographing the actual screen itself (doable, but tricky because of lighting and reflections, especially when using a flash)
- Using special hardware to pipe graphics output from the target PC (where the install is underway) to a second PC (where an OS is already running, and can operate screen capture software; complex to set up and extra costs for graphics cards one must use to do this)
- Installing inside a virtual machine, so that the virtual machine window can be captured on the desktop of an operating and fully-functional OS (to make this scenario even more compelling, MS is still giving away its Virtual PC 2007 software, which worked like a charm for me)
Needless to say, I opted for the latter, and have now set up and run virtual machines (.vhd files) for both 32- and 64-bit Premium editions of the new Windows 7 beta. Throughout, the 64-bit Vista Business software I’m using has been stable, accommodating, and workable. I haven’t yet figured out how to give a VM more than 4 GB of RAM, so I’m thinking I may need to build a configuration with 12 or more GB of RAM to make that possible. Other than that, I’m a pretty happy guy right now.
In installing the 64-bit version on MSI and HP notebooks with 4 GB of RAM I encountered exactly zero driver issues: everything came up with a working driver immediately upon install, and I was able to use DriverAgent to get the default drivers updated to the most current versions without too much difficulty. I did hit a snag on my Asus P5K motherboard, primarily because the built-in GbE interface identifies itself as Attansic, but the most recent drivers now come from the renamed builder’s Website at www.atheros.com. Once I figured out that the L1 GbE Ethernet interface I wanted was now an Atheros product, I was able to find, download, and install the right drivers pretty quickly (though I was lucky to have access to other machines, so I could download those drivers and read them from an easily-inserted UFD).
I’m working with the RC version of Windows Vista Business with SP2 slipstreamed, and I must also observe that I’m impressed with the stability and capability of the upcoming Service Pack, scheduled for release some time in May. Looks like this will be a worthwhile upgrade!
In case you’ve been hiding out lately, you may not be aware that Oracle has made an offer to buy Sun, and that Sun’s board of directors has accepted that offer. All that’s left to do now to consummate the merger is to get past any government objections (none are expected), consolidate operations (and presumably lay off redundant staff), and bring the two parties under one umbrella. I leave this to upper management and the M&A teams at both outfits. What I really want to know is what happens to the cert programs from both companies in the aftermath?
As company cert programs go, both Oracle and Sun have pretty substantial sets of offerings. I’ve worked with and around the Sun programs more than I have the Oracle ones, but my impression is that both organizations offer an interesting mix of credentials, supported by serious and capable training and certification teams. I do see some big differences in philosophy and approach between the two, however, especially where Open Source and standard vs. proprietary tools, languages, and platforms are concerned. But given that so many analysts and observers are drawing attention to the importance of Java in the overall mix of what Sun brings to this party, I have to guess that there will be some intense “cussin’ and discussin'” going on behind the scenes as these two outfits begin to coalesce and decide what to do with their respective certification and training programs.
Does that mean I’m brave enough to guess who’s going to come out on top? Probably not: I’ll simply observe that the combined mass of Sun certified professionals outnumbers the corresponding population of Oracle developers by two or more to one. Given that both organizations are strongly motivated to hang onto and grow their user bases, I’ll hazard the idea that numbers and perception will play a key role in keeping Sun credentials and programs alive, and perhaps in driving Oracle to change its certs and related infrastructure to be more like Sun’s rather than vice-versa.
Watching how all this plays out should be interesting, particularly for those of us (like me) who are distant enough from any associated carnage to not be harmed by it. I’m guessing it will take 1-2 years for all this to unfold, and should provide plenty of fodder for blogs and musings to come.