In researching a story for SearchWindowsServer late last week, I came face-to-face with some horrifying numbers. The first is a set of statistics from various sources that indicate more than 10 million servers are still running Windows Server 2003 in production mode in companies and organizations around the world (see, for example, these discussions in Redmond Magazine and in Greg O’Connor’s AppZero blog). The second is the looming date for end of life for that same software on July 15, 2015, where EOL is defined as the “day after Microsoft terminates extended support” for that OS version. You can look this up for yourself at the Microsoft Product Lifecycle Search page, where keying in “Windows Server 2003 R2 Standard x64 Edition” produces the following results (remember: the Extended Support End Date precedes EOL by one day):
I deliberately focused on the most popular WS03 version to produce a tiny table. Searching on “Windows Server 2003 R2” instead will cover the whole product family.
Turns out there’s lots of work to do to prepare for a server migration — some of the most important aspects of which I’ll document in my upcoming SearchWindowsServer article for TechTarget (I’ll provide a link to that story right here as soon as it goes live) — so I’m simply stunned to realize that somewhere around 10 million servers in need of migration are still up and running some version of Windows Server 2003 right now.
If this applies to you or your organization, it’s past time to get going on migration planning. Even with the end-of-year holidays almost upon us, somebody needs to get to work immediately on planning for this effort. The biggest stumbling block is likely to be application compatibility, according to the companies, organizations and enterprises that have already been through the exercise. With seven months left to go before the EOL date hits, that doesn’t leave much time to analyze compatibility issues and implement changes, workarounds, or replacements for the applications that so often provide the rationale for using servers in the first place.
If there’s a silver lining to this story, it’s that the necessity for change comes with two powerful potential improvements. First, it makes sense to rationalize and consolidate physical Windows Server 2003 installations in some kind of virtualized form (which means a hypervisor-based virtual server environment, or some kind of virtual container for same). Second, it may also make sense to move those virtualized (and consolidated) servers that survive the migration process into the cloud. This will involve considerable work, certain expense, and solving numerous interesting and perhaps even challenging technical problems. But with the end of Windows Server 2003 now clearly in view, hopefully migration will also provide the opportunity to improve and strengthen IT operations along the way.
When I rebuilt my production PC at the beginning of November, I came face-to-face with a new incarnation of Intel’s Driver Update Utility, aka DUU. Now out in version 2.0, the tool no longer depends on ActiveX running inside a Web browser (based on software from a company charmingly named “Husdawg”) to do its thing. Instead, there’s a full-fledged Windows executable to handle the tasks involved in checking Intel drivers on a target PC. And given the scope of devices that Intel provides for most modern PCs nowadays — namely, processors, chipsets, USB controllers, display and audio controllers, and a whole lot more — there’s a lot for this utility to do when it’s run on a typical host.
Sure, the DUU still scans for and grabs drivers, but it now runs standalone and offers all kinds of other info, too.
In addition to scanning your PC and checking the drivers it finds installed, comparing those to the ones it knows about on the Intel site and that are judged to be compatible with your hardware, and providing links to new drivers available for download — a process the program calls “Auto detect and recommend drivers” — the new DUU also offers:
- Links to “the latest intel driver news and updates:” a page in the Intel download center with links to a user survey input form for the newly-rebuilt tool, plus links to versions 1 and 2 of this tool.
- Tracking of your downloads and installs using the tool, along with your download history, and a variety of program settings you can control, including the ability to target some other directory to receive file downloads, besides the default Downloads directory associated with the current logged-in account.
- Easy access to the Intel download center to search for drivers manually.
- Access to online help files, primarily through a series of FAQs on the Intel website.
Admins in charge of maintaining standard system images who have to keep drivers up-to-date will find this tool both useful and informative. But because Intel categorically refuses to recommend its latest driver versions until they’ve been tested and vetted with specific hardware configurations (including yours, perhaps?), this utility is not the be-all and end-all for obtaining and applying Intel driver updates.
In fact, it’s also good to note that the French website Station Drivers tracks the latest Intel drivers carefully and closely (and those of many other vendors besides) and is thus a great alternate source for the most current drivers available for Windows PCs, both WHQL and experimental or beta. Where they get this stuff and how they come up with it is a mystery, but it’s been a treasure trove for the latest drivers for me for years, and should also be on your favorites or bookmarks for those times when you know a new driver fixes some problem you’ve got, but you just can’t find the darned thing. Station Drivers may not always have what you’re seeking, but many times, it will!
As I work with a new Windows version, I have a strong tendency to go haring off after internals info to better understand how things work. Recently, I’ve had great fun digging into the volume shadow copy files and behavior under Win10TP. Somewhat less fun have been my attempts to figure out why the “Deployment Image Servicing and Management” tool, aka DISM, reports that both Windows 10 Technical Preview plain-vanilla and enterprise versions suffer from repairable corruption in their component stores, but neither are fixable using the command’s various /RestoreHealth options. The following screenshot tells the story in a single capture, which I’ll explain further below:
The symptoms look simple, but the fix proves impossible to perform.
Here’s what’s involved in the two commands shown, and what else I tried in my attempts to remedy the apparent defect reported:
1. The first DISM command simply inspects the running Windows image (that’s why /online is included), and checks its overall health and well-being.
2. The second DISM command attempts to repair any outstanding issues it finds in the component store (which resides in %windir%\winsxs, that often impenetrable repository for large and mysterious Windows OS component files).
3. To perform its repairs, DISM must have access to a known good working Windows image file (usually in the .wim format) so as to grab replacements for any damaged or corrupted items it might find. Alas, all my /RestoreHealth attempts produced the same error message, indicating that the program can’t find the image file it needs to make repairs.
4. DISM supports a /LimitAccess option that prevents the utility from turning to Windows Update to find a working image (this is useful in environments where the Internet is not available, where access is purposely restricted, or when problems appear and difficulties in accessing WU must be ruled out, as may be the case here). Turning this option on (or leaving it off, as by default) made no difference.
5. DISM supports a /Source option that enables the command to target a specific set of source files. I downloaded the only Win10 TP ISO for build 9879 that’s currently available — namely, for the Enterprise version, through the MS TechNet Evaluation Center — mounted the ISO as a drive on my test system, and pointed the /Source option at the \Sources directory in that collection of files. No joy there, either, alas.
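For reference, here’s the full sequence of DISM commands involved, much as I ran them in an administrative command prompt. (The E: drive letter and image index 1 reflect my own mounted Enterprise ISO, so adjust both to match your setup.)

```bat
:: Read-only scan of the component store for corruption
Dism /Online /Cleanup-Image /ScanHealth

:: Attempted repair, allowing DISM to fall back to Windows Update for sources
Dism /Online /Cleanup-Image /RestoreHealth

:: Attempted repair against a locally mounted ISO, with Windows Update
:: blocked so that only the named source gets consulted
Dism /Online /Cleanup-Image /RestoreHealth /Source:wim:E:\Sources\install.wim:1 /LimitAccess
```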
In short, nothing I tried let me fix the problem that /ScanHealth was reporting, despite access to all of what should have been the right stuff for either version of the Technical Preview, Build 9879. I am reluctantly forced to conclude that something in this Build is sufficiently wonky to prevent the utility from working as it should, and can only look forward to finding it fixed in some future build. I’ve reported the issue to MS via the “Windows Feedback” app, in hopes that their superior investigative talents and image analysis tools will help them to figure out how to link up the right sources to the repair utility to put the fix in. As is so often the case in a preview situation, it is interesting to see and learn from how such things unfold.
In working on this blog over the years, I’ve written repeatedly about a great open source file management project named WinDirStat. Short for Windows Directory Statistics, this SourceForge project provides a nice compact tool for investigating and visualizing Windows volume layouts and contents. In working with Windows 10, however, I’ve noticed that Windows’ Volume Shadow Copies have more or less disappeared from view in that utility. From Vista forward, WinDirStat has shown these files as Unknown when listing the contents of a boot/system drive, and has proved decent at keeping tabs on how much of a disk’s storage space is consumed by the volume snapshots that the Volume Shadow Copy Service (which I’ll abbreviate as VSCS in future mentions, for brevity’s sake) maintains for Windows.
But as newer versions of Windows have appeared, the amount of disk space allocated for the VSCS has dropped, most likely in response to the increasing tendency for systems to incorporate faster but usually smaller SSDs for their boot/system drives. Whereas Vista allocated 15% of a disk’s overall storage for VSCS to use, Windows 7 took 5%, and Windows 8 just 2%. Thus, I was a little surprised to see the default Windows 10 VSCS allocation increase to 4% (though this could easily reflect increasing sizes for most newer SSDs, which are now typically 256 GB or larger). I was also surprised to see that the Unknown file bucket in WinDirStat no longer appears in Win10 displays in the %Homedrive% listing.
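Those default percentages are easy to turn into concrete numbers. Here’s a quick sketch in plain shell arithmetic (the 256 GB drive size is just an example, per the typical SSD sizes mentioned above):

```shell
# Default VSCS allocation, by Windows version, on a 256 GB drive.
# Percentages are the defaults quoted above.
drive_gb=256

for entry in "Vista:15" "Win7:5" "Win8:2" "Win10:4"; do
  version=${entry%%:*}
  pct=${entry##*:}
  # awk handles the fractional gigabytes
  alloc=$(awk -v d="$drive_gb" -v p="$pct" 'BEGIN { printf "%.2f", d * p / 100 }')
  echo "$version: $alloc GB"
done
```

Note that 4% of 256 GB works out to roughly 10 GB, which matches the allocation vssadmin reported on my own Build 9879 system.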
Shades of D Rumsfeld: In Windows 10 the Unknown bucket is no longer known to WinDirStat.
Given that WinDirStat could no longer give me the goods on VSCS storage consumption, I turned to the Windows vssadmin command inside an administrative command prompt window. Readers unfamiliar or out of practice with this command will find the Vssadmin Command Reference useful in putting its many capabilities to work. I made use of the list shadows and list shadowstorage subcommands to determine that the VSCS was still active on my boot/system drive, and that volume shadow copies were indeed still consuming disk space, as this screenshot illustrates:
Note: 2.16 GB of a 10 GB allocation is already in use for the two snapshots on my Build 9879 C: drive.
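For anyone who wants to poke at their own machine, here are the vssadmin subcommands I used (run from an administrative command prompt), plus a resize example I didn’t need here but that comes in handy when shadow copies eat too much space:

```bat
:: Enumerate existing volume shadow copies (snapshots)
vssadmin list shadows

:: Show shadow storage per volume: used, allocated, and maximum space
vssadmin list shadowstorage

:: Cap shadow storage for C: at a given percentage (or absolute size)
vssadmin resize shadowstorage /for=C: /on=C: /maxsize=5%
```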
I’ve been a big fan of the various Windows Forums sites for years. This tight-knit but small group of Windows mavens punches well above its weight class, and has had great success with similar sites for Windows 7 and Windows 8 pretty much since those OSes first appeared. Thus, it came as no surprise to me that the same intrepid band has already put a Windows TenForums site together. What did come as a pleasant surprise, however, was how much useful content the group has already amassed on this still-new and emerging desktop OS. In particular, I want to give a special shout-out to their excellent Tutorials collection, whose total number is already closing in on 60, and which appears to be active and growing quickly as well.
The TenForums logo on top, and the tutorial categories underneath, just hint at the trove of good stuff available here.
Checking over the available tutorials can be a little challenging, because there are so many screens’ worth of information to peruse. That’s why the “Tutorial Index” that’s also available there can be helpful in getting a good sense of what’s available (though I wish they’d also produce a compact, downloadable text-only version, because perusing that list is itself time-consuming). I’ve already found a number of useful items there myself, so I’ll provide links below to save others some intensive scanning, blithely assuming that what’s of interest to me might be of interest to them as well. Here’s a list:
ESD to ISO – Create Bootable ISO from Windows ESD File (adds lots of useful information to my 10/24 post on this subject, “New Win10 user? Build 9860 ISO for clean install”)
Hyper-V – Create and Use VHD of Windows 10 with Disk2VHD (lots of other Hyper-V tutorials as well)
Reinstall Windows Technical Preview with this media (new option in Boot Options menu: quick reinstall for Win10 TP that keeps only existing user accounts and associated personal files)
System Image – Create Hardware Independent System Image (lets admins create a Win10 image that includes a default user profile and additional post-install applications generalized to be hardware independent)
Upgrade to Windows 10 Technical Preview from Windows 7 or 8 (what you can bring forward from the earlier OS to the TP depends on which version of Windows you’re running: get those details here)
WinDBG – Install & Configure (how to install and configure the Windows Debugger, WinDBG, to perform “blue screen of death” or BSOD analysis)
If you want to learn about the Windows 10 Technical Preview, or would simply like to explore the information that’s available, check out the Windows TenForums today. And don’t forget the forums themselves, either: they’re active and you’ll find occasional gems among the forum traffic as well as in the tutorials.
For those who don’t want to run a tool like the great Rufus, MS offers another option via the Windows 8 Web pages. Prosaically named mediacreationtool.exe, this utility is available as a free download to all interested parties. Because it black-boxes access to the OS binaries used to create the UFD, and running the OS requires a valid key, this tool lumps the ISO download together with media creation and hides all the behind-the-scenes details from those who take advantage of its capabilities. Just for grins, I downloaded and ran the tool to see what it looked like, how it behaved, and how long it took to do its thing. I’ll recount my experiences in a series of screen captures numbered 1 through 7.
Before using the tool, one must first download it. Those who wish to skip the aforelinked free download page can go straight to the download link to grab it immediately. Move the file to a target directory from whence you’ll run it, and plug a USB flash drive into the host system (the contents of that drive will be obliterated as part of the image creation process, so back up anything you want to keep).
1. What kind of installation file do you want to create?
This is where you’ll enter a language choice (mine was “English – en-us”), the Windows edition you wish to install (mine was “Windows 8.1 Pro”), and the target architecture (32 or 64 bit; I chose “64bit (x64)”).
2. Choose where to save the installation file
You can either create an ISO file (for later transfer to optical media, like a DVD) or set up a USB flash drive (I chose the latter).
3. Choose a USB Flash drive
Pick an option from the local file system as the target for UFD creation (I chose the Mushkin 8 GB Atom drive named “Transfer” I’d mounted on my production PC for this test; you can use any UFD of 4 GB or larger for this task).
4. Downloading installation file
This is the phase of the process where the program finds and downloads the appropriate ISO file from MS servers to your local machine. It doesn’t access the UFD at this point, so presumably it’s writing to a temp file somewhere on the target PC’s %SystemDrive%. That file is around 3 GB in size, so you’ll want to make sure you’ve got sufficient disk space to accommodate it while the program is running. This was the longest part of the exercise: it took about 15.5 minutes on my production PC to grab the material needed to create a 3.18 GB image on the UFD. Had the download matched the 3.18 GB that landed on the UFD, average throughput would have been about 3.4 MBps/27 Mbps across my nominal 110 Mbps RoadRunner Internet link. But the Network Meter gadget showed data rates in the 16-20 Mbps range during most of the transfer period, so the downloaded file was probably smaller than the final image.
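As a back-of-the-envelope check (these are my own numbers, not anything the tool reports), the observed line rate and elapsed time put a rough bound on how big the download actually was:

```shell
# Estimate download size from observed throughput and elapsed time.
# Assumes a 17.5 Mbps average (midpoint of the 16-20 Mbps I observed)
# sustained over the 15.5-minute transfer.
mbps=17.5
minutes=15.5

est_gb=$(awk -v r="$mbps" -v m="$minutes" \
  'BEGIN { printf "%.2f", (r / 8) * (m * 60) / 1000 }')
echo "Estimated download size: $est_gb GB"
```

That comes to about 2 GB, comfortably under the 3.18 GB written to the UFD, which supports the guess that the downloaded file is smaller than the final image.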
5. Checking the download
The program performs an integrity check on the download once the file transfer finishes; this took about a minute on my machine.
6. Creating the USB flash drive
If the download checks out OK, the process of formatting and building the bootable install image on the USB flash drive gets underway. I plugged the Mushkin unit into a USB 2 port on my production PC to get a worst case idea of how long that might take. With slower IO (data rates seldom exceeded 30 MBps during the process, and sometimes dipped below 10 MBps) this took just over 15 minutes to complete. Based on prior comparisons, this tells me that using USB 3 would cut that time to 5 minutes or just under.
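That projection is just the measured USB 2 time scaled by a write-speed ratio; the roughly 3x figure is my own rough number from prior USB 2 versus USB 3 comparisons, not a measurement from this run:

```shell
# Project USB 3 creation time from the measured USB 2 time,
# assuming sequential writes run roughly 3x faster over USB 3.
usb2_minutes=15
speedup=3

usb3_minutes=$(awk -v t="$usb2_minutes" -v s="$speedup" 'BEGIN { printf "%.0f", t / s }')
echo "Projected USB 3 creation time: ~$usb3_minutes minutes"
```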
7. Your USB flash drive is ready
If all goes well, you’ll get a final screen that tells you the process has completed. You must then click the “Finish” button to exit the program.
Once I had the final UFD to inspect, I observed that the file layout and contents are identical to what Rufus builds from the Windows 8.1 Pro ISO and its own capabilities. Thus, it appears that this tool should work for both UEFI and conventional BIOS PCs for installing Windows 8.1. Because I have easy access to all the current, supported Windows ISOs through MSDN, this tool doesn’t appeal to me as much as it will to other readers who lack such access. But this tool is worth knowing about and using, especially if one must build a bootable Windows 8.1 install device on the road or when otherwise separated from one’s usual admin toolkit. Overall time required to run it appears to involve something between 30 and 40 minutes over a medium-speed Internet link, so budget your time accordingly.
Having just rebuilt my production desktop one week ago, I’m still in the process of tweaking and tuning that system to bring it up to max performance. Over the weekend, I installed Samsung Magician 4.4, the latest version of the SSD utility for the 500 GB 840 EVO drive installed as the boot/system drive on that machine. Then, I ran a pair of tests to see what impact this had on system performance. By at least one measure, the difference is astounding, as the following before and after screenshots will attest:
BEFORE: CrystalDiskMark shows that the mSATA port on the Gigabyte Z77X-UD3H is running only at 3 Gbps (half-speed, in other words).
AFTER: CrystalDiskMark shows an improvement of 1.5 orders of magnitude, WEI shows no change. What gives?
Turns out that the massive performance boosts on sequential read and write shown in the first two blocks of CrystalDiskMark measurements, while dramatic, simply don’t reflect much file-system activity in the real world (except perhaps when transferring files larger than the 1 MB default block size shown). The second two blocks (4K normal access, and 4K with a queue depth of 32) are closer to real life, except that queue depths on most Windows desktops seldom exceed a range of 6-8, even under heavy read/write I/O loads (see Samsung’s informative discussion of “Benchmarking Utilities: What You Should Know” for more good information on what’s going on here).
Thus, the results that stay more or less the same for Sergey Tkachenko’s implementation of the old Windows Experience Index (WEI) for Windows 8.* (and the Windows 10 Technical Preview) really reflect the overall impact of the drive optimization software on performance for Windows desktops. That’s not to say that these utilities are worthless, or that you shouldn’t use them: firmware updates, disk optimization, and over-provisioning capabilities can indeed improve performance and extend the usable life of such devices. I just don’t think anybody should expect them to offer major performance improvements simply by virtue of their ongoing presence in the runtime environment. At best, I believe that improvements they can offer are incremental (probably less than 10% on overall I/O) rather than dramatic (an order of magnitude or better, as the first two blocks of the CrystalDiskMark results might lead one to hope for).
Anybody who’s been reading this blog for a while knows that I collect – and blog about – useful Windows tools on a pretty regular basis. Here’s another one for consideration for your Windows toolbox, from developer and Windows-head Nic Bedford (whose UK-based Nic’s Blog is also worth tuning into from time to time): it’s called System Restore Explorer (SRE), and it finds all of the restore points defined on the current running Windows image, then lets you mount any one of them as a folder on the system drive for browsing with Windows (or File) Explorer, like this:
SRE finds and exposes the Volume Shadow Copies that represent Restore Points.
India’s Tech Gizmo wrote a nice (if brief) review of this tool, and developer Paul Pruitt (who has built a similar tool for more general exploration of volume shadow copies, called Shadow Explorer) also gives Nic credit for crafting a useful utility. For most readers, I need only point out that System Restore Explorer lets anyone explore the entire contents of any chosen Restore Point, and copy files from it as needed, to make clear what it’s good for. Add the observation that it provides a way to grab and restore or replace missing or damaged files for a current runtime image on Windows Vista, 7, 8, or 10, and Windows admins will know why it’s worth keeping around for those occasions when it might come in handy. Great stuff!
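Under the hood, a mounted restore point is simply a volume shadow copy exposed through the file system. Though I haven’t inspected SRE’s internals, its mount operation presumably amounts to something like the following manual sketch (the shadow copy device name comes from vssadmin list shadows on your own machine, and the trailing backslash on the mklink target is required):

```bat
:: Find the Shadow Copy Volume name for the restore point of interest
vssadmin list shadows

:: Expose that snapshot as a browsable directory (trailing backslash matters)
mklink /d C:\restorepoint \\?\GLOBALROOT\Device\HarddiskVolumeShadowCopy1\

:: Browse or copy files out as needed, then remove the link (this deletes
:: only the link, not the snapshot itself)
rmdir C:\restorepoint
```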
I’ve been running the Windows 10 Technical Preview (Win10TP, as I abbreviate it in my blog post title) for about two weeks now, and I’m feeling better about the environment and the experience of running this latest desktop OS from Microsoft than I expected to be. In fact, I’m more than just a little bit impressed with the new environment’s ease of use, stability, and its willingness to accommodate a production system’s hardware and software components. So far, the only program that has flat-out refused to install on Win10TP is Franck Delattre’s excellent and informative CPU-Z utility (currently at version 1.71, which raises an incompatibility flag for build 9860, despite the Web site’s assertion that CPU-Z 1.71 “…adds the support of Windows 10,” which probably applies to build 9841 but does not yet extend as far as 9860).
Sergey Tkachenko’s WEI runs fine in Build 9860, despite issues with 9841, to report basic system performance ratings.
From an install and setup perspective, Win10TP follows very much in Windows 8’s entirely respectable footsteps: installation is quick, painless, and pretty easy. Ditto for applying updates, though I’ve been spoiled by using Start8 on Windows 8 sufficiently that I’ve had to retrain myself to key search text straight into the start menu/start window rather than jumping right into menu navigation.
Frankly, I was amazed to see Win10TP get 99% of the drivers right on the first boot-up into the OS following the initial install. It did miss a couple of devices (which showed up as “Unknown” in Device Manager) from the MSI Z87-G45 motherboard in my primary test machine, and like Windows 8, Win10TP doesn’t recognize Killer/Atheros Ethernet devices, either. Thanks to the StarTech ASIX GbE USB 3 NIC I keep around for tablets and notebooks devoid of wired interfaces, I was able to plug that bad boy into the test machine, and gain network access with which to obtain updates and download missing drivers lickety-split. DriverAgent (my driver analysis and access tool of choice) runs fine on Win10TP, and I was able to use it to grab the elements that I needed to bring that machine up to snuff.
It took me less than 15 minutes — a new “personal best” — to bring all the Win10TP drivers up-to-date after a clean install. Amazing!
I’m still in the process of re-creating a typical production environment on my test machine, beavering away in my spare time. So far I’ve installed the following items successfully, reading from the “All Apps” menu on that PC: 7-Zip, 8GadgetPack (Core Temp even works with the CPU Usage gadget), CCleaner, Chrome, Intel Management Engine Interface and the great new Driver Update Utility v2.0, Logitech SetPoint, Microsoft Mouse and Keyboard Center, NVidia 3D Vision center and so forth, SlimImage, and WinDirStat. All appear to work correctly. I’ll be moving on to MS Office 2013 next, as soon as time permits.
Earlier this month, I sold the Fujitsu Q704 Stylistic tablet PC that I purchased last January, having learned as much from it as I could, and having also decided it didn’t present enough performance and stability for the costs involved in acquiring and maintaining that platform. Early last week, I ordered a Surface Pro 3 (i7-4650U CPU, 8 GB RAM, HD Graphics 5000, 256 GB SSD) to replace that unit, so as to give me a Windows tablet to play and work with. It arrived on Friday afternoon, about the same time my son came home from school. I was in the middle of upgrading my production PC, so the last thing I wanted to do was to unbox and set up another new PC. “That’s OK, Dad,” said Gregory, “I’ll do it.” And do it he did, all by himself (with a little help logging into my Microsoft Account) to the point where he used the system to do his homework this weekend.
The latest addition to our computing stable is already a huge hit with the younger generation.
I stayed busy through the weekend working on my production PC (which I’m writing this blog post on right now), applying updates, catching up drivers, installing MS Office and a bunch of other applications. I also decided to consolidate 4 of my older and smaller 3.5″ hard disks (ranging in size from 750 GB to 1.5 TB) onto my remaining spare Toshiba 3 TB SATA3 3.5″ HD, which supported data throughput over 100 MB/sec in its USB3 drive caddy for really big files (and probably averaged about half that overall during the entire drive copy marathon session involved).
An interesting and terrifying dilemma emerged on Sunday morning, as I was continuing my setup marathon. Suddenly, for no reason I could discern, I found myself unable to use my keyboard on any of the machines I was logged into with the shared Microsoft account I typically use. When my son “accidentally” reset the desktop theme on the Surface to High Contrast, and the same theme immediately popped up on my production PC’s screen and that of my traveling Lenovo laptop, I realized that something about the account settings made on the Surface was preventing my other machines from using their keyboards. A little poking around in the notification area showed me that my son had enabled Sticky Keys and Filter Keys on the Surface to improve use of the Type cover on that machine. Unfortunately, those settings also turned off the keyboard on the other Windows systems that shared those settings. Though it took me over half an hour to get to the bottom of the situation and find a fix (turn off both of them completely), once properly diagnosed it was relatively easy to work around. Of course, because I didn’t immediately understand what was going on, I first tried multiple keyboards on my production desktop without success. It was only when I turned to the Lenovo and found its keyboard out of commission as well, even though the keyboard drivers reported those peripherals as present and working, then saw the sudden change of desktop theme across all systems, that I figured out the shared account settings must be involved.
This is a level of synchronization that I hadn’t encountered as a problem before. I’ll use this experience to warn admins to tell their users that they should be careful with account settings, particularly when they run the same Microsoft account across multiple machines. That also raises the interesting question of how all this will play out when people start running the same account on their smartphones as well as on conventional PCs.