Windows Enterprise Desktop

February 3, 2012  3:50 PM

Trends Notwithstanding, XP Stays Atop the OS Heap

Ed Tittel

Trends don’t always translate into facts, as a quick glance at this “Desktop Top Operating System Share Trend” graph for March 2011 through January 2012 illustrates.

The NetMarketShare trends suggest imminent inflection in December then diverge in January

I must confess that I (and numerous other pundits and online panjandrums) interpreted this graph to mean that Windows 7 would surpass XP last month, but that’s not how it turned out. Instead of each line following its long-established trend, numbers for XP actually jumped a little in the period from December 2011 to January 2012, from 46.52% to 47.19%, while the numbers for Windows 7 dipped slightly from 36.99% to 36.40% in the same period. Go figure!

Thanks to the editors at Tom’s Hardware for bringing this to my attention, in their February 2, 2012, story entitled “Report: Windows XP is Still The Dominant OS.” They speculate — and I agree — that still-lagging economies (especially in Europe, sorely beset by the Greek debt crisis and the euro meltdown) can easily account for the recent stall in the XP-to-Windows-7 cut-over that’s been chugging along for the past two years or more. Likewise, they observe that the pending release of Windows 8 in the third or fourth quarter of 2012 will further complicate matters, and should probably cause the trend lines for both of those OSes to decline further and faster.

One thing’s for sure: it’s still impossible to write off Windows XP, and it’s still around in huge numbers. With the “absolute retirement date” of this venerable OS slated for April 2014, it’s amazing it’s been able to cling to life and major market share for such a long, long time!

February 1, 2012  2:59 PM

More on Chipset Drivers and the Intel Driver Update Utility

Ed Tittel
Last Friday (January 27, 2012) I posted a blog entitled “Intel Driver Update Utility and Intel Chipsets,” wherein I observed that this useful online tool — which analyzes the system on which it’s running, and reports on the currency and status of the drivers for the Intel components it contains — doesn’t always recommend the most current Intel chipset driver. This observation nagged at me sufficiently that I did some additional research to determine why this might be the case, and how to find and use the most current such drivers for those who choose to use them.

About the “latest version” of the Intel Chipset Software

Intel proffers a Web page entitled “Do I need to upgrade to the latest version” under the general head of Intel Chipset Software Installation Utility. This page suggests an “If it ain’t broke, don’t fix it” approach to updating the Intel chipset software and its associated collection of drivers. Here’s what that information says, verbatim:

Intel® Chipset Software Installation Utility
Do I need to upgrade to the latest version?
Upgrade to the latest version if you are experiencing an issue listed under Issues Resolved in the latest release notes. The latest version of the Intel® Chipset Software Installation Utility and the release notes are available in Download Center. If you upgrade to the latest version, follow the have-disk installation instructions. The utility cannot install using setup.exe if another version is already installed on your system.

Finding the “latest version” of the Intel Chipset Software

In the Intel Download Center, specify the selections “Chipsets” (Product Family), “Chipset Software” (Product Line), and “Intel Chipset Software Installation Utility” (Product Name) on the initial Find downloads by product name window that appears. This produces a Find downloads by category window where you can specify your operating system (Windows 7 (32-bit) in my case) and Download Type. I picked “Utilities, Tools, and Examples” because that’s what provides the INF update utility that automatically updates Windows drivers for those willing to bypass the preceding instruction to “follow the have-disk installation instructions” (the chipset driver itself isn’t accessible through Device Manager in any case).

This is where things start getting interesting. Intel’s “Latest” driver label applies to the file they’ve updated most recently, NOT to the file with the highest version number (which actually indicates the most current driver). Here’s a screen shot that illustrates this potential point of confusion very nicely:

Pay careful attention to the date and version columns when selecting driver downloads

Notice that the utility with the higher-numbered version also has the more current date (8/27/2011 versus 4/21/2011). It’s essential to pay close attention to all of this information if you want to override the recommendations of the Intel Driver Update Utility, and choose the right version to download and install on your PC.
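
One reason this confusion bites so easily is that dotted driver version strings can’t be compared as plain text. Here’s a small Python sketch (version numbers are made up for illustration, not taken from Intel’s site) showing why a field-by-field numeric comparison is the safe way to decide which driver is really newer:

```python
# Illustration only: dotted version strings mis-sort when compared as text,
# so "highest version" must be decided numerically, field by field.

def version_key(v: str) -> tuple:
    """Turn '9.10.0.1020' into (9, 10, 0, 1020) for numeric sorting."""
    return tuple(int(part) for part in v.split("."))

# Hypothetical version strings, standing in for rows on a download page.
a, b = "9.2.0.1030", "9.10.0.1020"

print(a < b)                            # lexical compare says b is NOT newer: False
print(version_key(a) < version_key(b))  # numeric compare says b IS newer: True
```

The same principle applies when eyeballing a downloads page: trust the parsed version number, not the order in which the files happen to be listed.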

[Note in response to reader comment: You MUST identify the chipset in your PC and search on that particular chipset model to find the drivers and installation software that match it. I don't mean to suggest that the version numbers mentioned in this or the previous posting are prescriptive for all PCs and all readers. Rather, they were meant only as illustrations of the various version numbers that I encountered for the various chipsets on different PCs. Thus, for example, the X38 chipset on my production PC gets the version mentioned elsewhere in this post, while my D620 Latitude notebook, with its Mobile Intel 945 Express chipset, gets a different version entirely. As they say on the Internet, YMMV, so be sure to grab only what's relevant for your particular chipsets.]

January 30, 2012  4:02 PM

Windows 8 and ARM processors

Ed Tittel

An interesting rumor popped up on SlashGear this morning, in an article from Chris Davies entitled “Windows 8 on ARM stable release in February tip developers.” It seems that despite earlier issues with the stability of the developer preview for Windows 8 on ARM CPUs, another, much more stable build will be released in February — again for developers — to coincide with the “customer preview” release of Windows 8 for x86 CPUs that heralds another milestone in the march to RTM (release to manufacturing) and GA (general availability).

Citing sources from CNET, Davies said “…there’s no obvious reason that [an] ARM-version of Windows 8 … should be ‘staggered’ from the traditional x86 build.” Furthermore, the same sources opined that “…ARM alternatives to Intel and AMD based Windows 8 machines could ‘undercut them by hundreds of dollars…’” Given the popularity of chips from Texas Instruments and Qualcomm, among others (these two companies showed Windows 8 prototypes at this year’s CES), for smartphones, this could be a huge coup for Microsoft in finally making a dent in that non-PC marketplace.

Should be very interesting to see how all this plays out. Stay tuned!

January 27, 2012  7:17 PM

Intel Driver Update Utility and Intel Chipsets

Ed Tittel

The Intel Driver Update Utility is handy but not always completely current

The Intel Driver Update Utility (IDUU) is a handy-dandy software tool that depends on a “Systems Requirements Lab” active widget from Husdawg to scan PCs for Intel components, and to report on their current update status. It’s pretty useful, and has become part of my normal maintenance routine for checking driver status on the PCs I manage. I also use DriverAgent (though other tools like RadarSync, Driver Detective, Drive Guide, and so forth also do pretty much the same thing) but I’ve observed that Intel is often better at keeping up with its own drivers than third parties, so I’ve come to depend on IDUU to help me keep my Intel drivers as current as can be.

But recently, I noticed something subtle about the language that the IDUU uses to report on the drivers it finds, which in turn led me to realize that Intel apparently doesn’t care whether a driver is the most current in every case. Rather, it appears to care only that some drivers are “valid,” even when newer drivers may be available. Notice this report block for the Intel Chipset Software Installation Utility:

Now, compare this to the language used for the built-in Realtek RTL8013EL Ethernet GbE adapter on that same motherboard:

In the chipset case, the version is valid, but in the Ethernet adapter’s case the driver is current. “Hmm…” I wondered, “Is there a difference between valid and current?”

The answer turns out to be “Yes.” By researching the most current Series 3 Chipset Driver through a manual search on Intel’s Download pages, I was able to determine that the highest-numbered version available was newer than the version my machine was happily running. A quick download and install took care of that issue, as the preceding chipset screen capture now attests, but this leaves me wondering why the IDUU doesn’t tell its users that a more current version is available. My best guess is that it waits until some critical feature gets introduced in a newer update, and only then instructs its users to update their drivers. That would be a reasonable approach to driver updates, where “if it ain’t broke, don’t fix it” often prevails as a guiding principle. I can only imagine that’s why Intel also labels the new version as valid, even though it’s the most current as well.

But if, like me, you want at least some machines to always be running the absolute latest and greatest Intel drivers, if only for test purposes, it’s good to know that when you see “valid” in this tool you should probably go looking for something more current, just in case it’s out there for downloading.

January 25, 2012  8:55 PM

About Sensors and Windows 8

Ed Tittel

The latest entry in the Building Windows 8 blog is called “Supporting Sensors in Windows 8,” and it comes from Gavin Gear, one of the Microsoft Product Managers on the Windows 8 Device Connectivity team. It tells a fascinating story of how (and why) the basic Windows 8 hardware requirements include so many sensors (an accelerometer, a magnetometer, and a gyroscope, among others) so that application and service developers can count on basic system capabilities when building next generation software of all kinds. The following diagram helps to illustrate how combining such hardware devices together can support smoother and more capable physical functions, not just for the usual purposes (screen rotation and orientation) but also to support lots of other interesting capabilities as well (including game controllers, smart remotes, measurement and data acquisition, and so forth):

Position and motion sensing devices work better in concert than by themselves

Mr. Gear follows up his explanation and discussion with some data traces from an accelerometer to depict why straightforward use of such data doesn’t always produce the best user experience. From there he goes on to explain how MS has worked with hardware manufacturers and device makers to create a single Microsoft driver that can “…work with all Windows-compatible [sensors] connected over USB and even lower power busses…” The upshot is that any sensor company can construct a Windows 8 compatible package by adhering to “…public standard USB-IF specifications for compliant device firmware.”

Pretty slick already, but MS also worked to control power consumption and sensor performance by enabling sensor processing to occur at the hardware level without involving the CPU, and by building filtering mechanisms to restrict data flow (and event rates) to speeds that won’t exceed the processing stack’s handling capacity (this also helps reduce consumption of CPU cycles).
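
To see why raw accelerometer traces need filtering before they drive something like screen rotation, here’s a minimal Python sketch of one common smoothing technique (an exponential moving average). The alpha value and sample data are my own illustrative choices, not anything from the Windows 8 sensor stack:

```python
# Toy low-pass filter: damp jittery accelerometer readings so downstream
# logic (e.g., rotation decisions) ignores momentary spikes.

def low_pass(samples, alpha=0.2):
    """Exponential moving average: each output blends the new sample
    with the previous filtered value; smaller alpha = heavier smoothing."""
    filtered = []
    prev = samples[0]
    for s in samples:
        prev = alpha * s + (1 - alpha) * prev
        filtered.append(prev)
    return filtered

raw = [0.0, 0.02, 0.9, 0.05, -0.03, 0.01]   # one noisy spike at index 2
smooth = low_pass(raw)
print(max(smooth) < max(raw))  # the spike is damped: True
```

A real sensor pipeline would of course do this in firmware or the driver, exactly so the CPU never sees (or wakes up for) every raw sample.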

The blog concludes with a discussion of how sensors can play into Windows 8’s new “Metro-style” apps, using a new sensor API included as part of the Windows 8 runtime environment (aka WinRT). Code examples show the APIs to be simple and straightforward, making them easy for developers to put to work. It should be interesting to see how easy access to sensors and location data helps raise the bar for Windows 8 apps over the months and years ahead. There’s even a product reference to a specialty sensor part built into last year’s Windows 8 Developer Preview slate PC from Samsung, now that developers can purchase such parts on the open market, to help get this phenomenon rolling.

January 23, 2012  2:51 PM

Interesting discussion on fairness and ease for MS cert exams

Ed Tittel

As part of my daily due diligence in writing this blog, I keep up with the postings on Microsoft’s Born to Learn blog. A recent post (1/17/2012) from Andrew Bettany entitled “Have MCTS/MCITP exams got easier?” caught my eye this morning, as much or more because of the comments to this post as for the post itself. Be sure to check it out if you’ve got a Microsoft exam in your near-term future.

But what really got my wheels turning were the concepts of “fairness” and “challenge” raised in a comment by Microsoft’s lead psychometrician, Liberty Munson, especially in light of my recent brokering of contact between herself, an exam developer, and a friend and colleague who had taken the 70-662 Microsoft Exchange Server 2010, Configuring exam no less than four times without managing to pass it on any attempt. As somebody who’s actively engaged in consulting with major Fortune 1000 companies to help them transition hundreds of thousands of email users into Exchange 2010, I have trouble accepting the notion that he simply doesn’t understand the platform and its capabilities. In that light, consider what Ms. Munson has to say in this posting:

Thanks for great post! It shouldn’t surprise anyone that people who pass exams tend to think they are easy while those that fail believe that they are too hard. The key is to make sure that those who fail also believe that they were “fair assessments” of their skills while those who pass also found them to be “challenging” (or “easy but challenging”). That’s nuance of “easiness” that is often overlooked by candidates and hiring managers. Exams should be “easy” for qualified candidates because we’re assessing skills that they have, but they should also be “challenging” and fair assessments of skills. We rarely get past the “it’s too easy” part of the conversation to understand the deeper nuance…Was it challenging? Was it fair? Does it differentiate you as someone who knows the technology? It’s possible that our exams haven’t been challenging–that this is assumed in the “it’s too easy” statement–but there’s a deeper question here that needs to be asked.

Enough with the philosophical aspects of this conversation :)…over the last year, we’ve made changes to our item writing guidelines and use of item types to increase the perceived difficulty of our exams because we understand that the conversation generally stops at “it’s too easy”…certain item types and item writing strategies can be leveraged to make exams feel more challenging without requiring more skills or knowledge. However, as we introduce new certifications, you’ll find that we’ll be requiring deeper knowledge and higher level skills to address comments that “it’s too easy.” Psychometrically, I regularly monitor the difficulty of the exams and ensure that it’s at an appropriate level for the target audience and programmatic goals/definitions of what it means to hold the credential. Changes are made when and where necessary to meet these targets.

I’d love to start hearing people talk about our exams as “challenging and fair assessments” as we continue to improve the ability of our certifications to differentiate people who really know their stuff from those who don’t or just sorta do. Andrew–Thanks for starting to get the word out!

Clearly, some of the ongoing renovation of existing MS exams and development of new ones incorporates great concepts and intentions on the part of the people who work with SMEs to put those exams together, and to measure and monitor them for statistical relevance to the populations being tested. But “fairness” is a difficult concept to get right, especially when exam questions veer into the realms of what my aggrieved friend and colleague perceived as mostly “administrivia” that real-world admins would simply look up if they had to deal with such seldom-used (or rarely-needed) features and functions. In his case, he’d readily concede that the 70-662 exam is challenging, but also vehemently claims that it’s grossly unfair.

I sympathize with both parties in this kind of encounter. On the one hand, having created hundreds of practice tests for the certification exams covered in my many Exam Cram and study guide certification books, I get how difficult it is to create sound, meaningful, and relevant exam items for candidates to ponder and learn from. On the other hand, I also really feel for my friend’s situation where his consulting company requires him to pass the 70-662 exam so he can present the “right credentials” to his many clients, but where he feels that the exam is more of a random check on obscure or little-used Exchange features and functions than a meaningful assessment of his skills and knowledge of the platform.

This is a tough situation, and one that requires not only that both sides recognize each other’s motivations and priorities, but also that the testing party provide enough information to testees to let them understand what they must learn to get past the exam, and that testees take this seriously enough to make a concerted effort to learn and master the necessary skills, knowledge, and even administrivia details to get over the hump. Usually, taking a Microsoft Official Curriculum class, reading the books, and acing numerous practice exams is enough to guarantee a good result. But when this tried-and-true method fails, it’s time to start asking why and trying to reason one’s way into the proper information set to figure out what’s wrong. My colleague also believes that MS should provide testees with more detailed feedback about questions answered incorrectly rather than simply flagging certain concept areas where the test-taker failed to meet minimum score requirements.

All this goes to show that certification exams really aren’t easy to create, and sometimes can be incredibly difficult to pass as well. It should be interesting to see how this situation plays out, and how the two parties will find some kind of rapprochement.

January 20, 2012  4:14 PM

Thurrott & Rivera on Windows 8 Logo Requirements

Ed Tittel

In his most recent monthly column for WindowsITPro entitled “The Rigor of Windows 8 Hardware Certification Requirements, …” Windowsmeister Paul Thurrott provides a snapshot of the current requirements that Microsoft intends to levy on those seeking to put the Windows 8 logo decal on their PCs, notebooks, tablets, and smartphones. It’s an interesting and detailed collection of info, based in turn on Rafael Rivera’s equally interesting WithinWindows blog entitled “Windows 8 Secrets: PC and Device Requirements.” Both are worth reading in their entirety; here I’ll summarize some of the most interesting high points.

A "mock logo" for Windows 8


  • 5-point digitizers: Windows 8 touch PCs must support at least 5 touch points (five fingers’ worth, guaranteeing support for complex multi-touch swipes and gestures).
  • NFC “touch marks”: Near field communications lets devices interact with each other when they’re so close as to be nearly touching, and are key to emerging technologies for payment transfers especially from smartphones to cash registers (or computers acting like same).
  • 5 required hardware buttons: devices must provide buttons for power, rotation lock, the Windows key, and volume up/down controls. Also, domain-joined devices that lack keyboards must support Windows Key + Power as an alternative to the Ctrl+Alt+Del key sequence (the infamous “three-fingered salute” for those who remember DOS and early Windows versions).
  • Minimum component requirements include 10 GB of free storage space after Win8 is installed, support for UEFI (see my 9/23/2011 blog for more info), WLAN, Bluetooth and LE networking, Direct3D 10 graphics with a WDDM 1.2 driver, 1366×768 screen resolution or better, 720p camera, an ambient light sensor, a magnetometer and an accelerometer, a gyroscope, at least one USB 2.0 controller and exposed port (or better), and audio speakers. Sounds like a decent smartphone or tablet, eh?
  • Graphic drivers must be upgradeable without requiring a reboot (easier to do thanks to dropping XDDM drivers and keeping only WDDM drivers for Windows 8).
  • 2-second resume (does not apply to ARM-based devices yet) from Standby (S3) to “resume complete” status. Rivera speculates “Microsoft simply doesn’t have enough data in this space” to impose the same restriction on ARM as on x86; Thurrott speculates that “it will be added in a future release, such as Windows 9.”

Veeeeeeeeeeeeeeery interesting. Be sure to check out the original posts, too!

January 18, 2012  4:18 PM

Windows 8’s “next generation file system” gets a preview

Ed Tittel

Building Windows 8 strikes again, this time with a 1/16/2012 blog from Surendra Verma, development manager for the Windows 8 Storage and File system team. It’s entitled “Building the next generation file system for Windows: ReFS.” The ReFS acronym is expanded in the FAQ that follows the blog post to mean “Resilient File System;” that same FAQ also documents some awe-inspiring maximum file system attributes.

What’s New About ReFS?

Verma takes several cuts at answering this question without necessarily addressing it completely explicitly, so I’m doing my best to read between the lines here. It appears that ReFS comes with a new engine for file access and management, and also includes enhanced verification and auto-correction facilities. In Verma’s words, ReFS is optimized for “extreme scale” to “use scalable structures for everything.” Interestingly, a ReFS volume will never be taken offline: even when corruption occurs, unaffected volume elements will remain available and accessible. And a new resiliency architecture (when used in concert with Storage Spaces) will help to protect and preserve volume contents. Verma’s list of key features also highlights plenty of new capability:

  • Metadata integrity with checksums
  • Integrity streams providing optional user data integrity
  • Allocate on write transactional model for robust disk updates (also known as copy on write)
  • Large volume, file and directory sizes
  • Storage pooling and virtualization makes file system creation and management easy
  • Data striping for performance (bandwidth can be managed) and redundancy for fault tolerance
  • Disk scrubbing for protection against latent disk errors
  • Resiliency to corruptions with “salvage” for maximum volume availability in all cases
  • Shared storage pools across machines for additional failure tolerance and load balancing
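
The first item on that list, metadata integrity with checksums, is easy to picture with a toy sketch. This Python fragment is my own illustration of the general idea (it has nothing to do with ReFS internals): store a checksum next to each block when writing, and verify it on read so corruption is detected rather than silently returned.

```python
# Toy checksum-verified block storage: detect (not repair) corruption.
import zlib

def write_block(data: bytes):
    """Return (data, checksum) as they might be laid out together on disk."""
    return data, zlib.crc32(data)

def read_block(data: bytes, checksum: int) -> bytes:
    """Verify the stored checksum before handing data back to the caller."""
    if zlib.crc32(data) != checksum:
        raise IOError("checksum mismatch: block is corrupt")
    return data

block, crc = write_block(b"file system metadata")
assert read_block(block, crc) == b"file system metadata"

corrupted = b"file system metadatA"   # one flipped byte
try:
    read_block(corrupted, crc)
except IOError as e:
    print(e)   # checksum mismatch: block is corrupt
```

In a resilient design, detection is only step one; the copy-on-write model and mirrored copies in the rest of the list are what make automatic repair possible.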

Some changes underneath, but not much change on top

What Remains in Common between ReFS and NTFS?

Not surprisingly, Microsoft has also elected to “maintain a high degree of compatibility with a subset of NTFS features that are widely adopted” although they do intend to “deprecate other … [features] … that provide limited value at the cost of system complexity and footprint.” This means that BitLocker encryption, access-control lists (ACLs), the USN journal, change notifications, symbolic links, junction, mount and reparse points, volume snapshots, file IDs, and oplocks will remain the same as those used in NTFS.

However, there will be no conversion utility to transform NTFS-formatted volumes into ReFS. Instead Microsoft advocates copying data from NTFS into ReFS to make the change, and also to allow the resulting ReFS volumes, file collections, and data sets to enjoy the benefits of the new ReFS structures, maximum sizes, and resiliency. For the time being, ReFS will not be usable for boot volumes in Windows 8 (though I’ve seen some discussions of making ReFS volumes bootable for the upcoming version of Windows Server due out late 2012 or early 2013).

What’s Gone from ReFS?

Verma does address this as “What semantics or features of NTFS are no longer supported on ReFS?”

The NTFS features we have chosen to not support in ReFS are: named streams, object IDs, short names, compression, file level encryption (EFS), user data transactions, sparse, hard-links, extended attributes, and quotas.

This is an interesting grab-bag of features, among which there are some obvious items: EFS was never that popular, and had sufficient problems to make lots of enterprise users steer clear; short names have been passé for 10 years or more; and named streams (aka “alternate streams”) have posed some clear and present security dangers. Extended attribute elements (also including object IDs) support the distributed link tracking service, which can be convenient but also had some problems, as did sparse files and hard links. It’s a little surprising to see file compression and quotas going away, and user data transaction handling was never implemented much anyway.

Once again, it’s “in with the old, and out with the new.” This should give Mark Russinovich and his colleagues something interesting to update in the next edition of Windows Internals, which should give me an opportunity to understand the impact of these changes in more detail.

January 16, 2012  11:47 PM

Compatibility Rears Its Ugly Head for Bluetooth

Ed Tittel

Ah, life can be strange when trying to get devices to play nice together. In this particular case, I decided to add a Bluetooth dongle to my son’s Acer 5552 notebook PC. With its AMD Athlon II P340 Dual-Core CPU, 4 GB RAM, and a Western Digital 5,400 RPM 250 GB hard disk, it’s a real little workhorse that matches my old reliable Dell D620 in overall performance and capability, but the killer deal I got from TigerDirect didn’t include Bluetooth in its $317 price tag.

The Acer 5552 is a modest but capable budget notebook PC

If you read my IT Career JumpStart blog, you may recall that the one dated 12/26/2011 recounted some interesting adventures in getting my son Gregory’s Lego MindStorms robot kit up and going. In the weeks that have followed, he and I have added sensors to his robot, and have spent a number of happy and busy hours programming the machine to do all kinds of interesting stuff. When I discovered that the NXT 2.0 controller in his kit supports Bluetooth, I naively assumed that it would be easy to get the notebook and the robot communicating with each other, especially when I was able to get the built-in Bluetooth on my HP dv6t traveling notebook and the NXT 2.0 controller working more or less instantly (after puzzling through the often cryptic menu navigation on the controller’s 2×3″ LCD screen, that is).

But wow, was I ever surprised by how much time, effort, and running around (with a certain amount of blue language for my Bluetooth issues along the way) was needed to get things working with a plug-in USB dongle on the Acer notebook. I started out with the TrendNet TBW-106UB USB Class 1 Bluetooth v2.0 and v2.1 with EDR dongle I’d purchased for an HP course I revised in early 2010 that required me to write about Bluetooth printer connections (which I can now confess I plugged into a Dell 968 All-in-One Printer rather than an HP unit). But alas, though the TrendNet dongle does Plug-n-Play faultlessly on the printer, I couldn’t get it to work with the Acer 5552. I messed with driver after driver, and even went down another rathole related to an update to the BlueSoleil software that TrendNet supplied with the dongle.

The TrendNet TBW-106UB

Later, in reviewing the NXT documentation I learned that its Bluetooth works only with the generic Microsoft and the Widcomm Bluetooth stacks (the BlueSoleil stuff is specifically identified as incompatible, alas). So off I went to the Fry’s website to see what kinds of USB Bluetooth dongles they had in stock at my local store, and ended up grabbing a nice Asus USB-BT211 from them for $14.99 (with a $5.00 mail-in rebate that brought the price down to a paltry $10, or about $17 less than the BlueSoleil software I purchased to update the TrendNet unit, only to learn I couldn’t use it with the NXT robot anyway…sigh). I was able to update the drivers all the way up to the most current March 2011 Atheros AR3011 version available from the DriverAgent website, too, and keep both the notebook and the NXT 2.0 controller happy with its Bluetooth capability.

The cheaper Asus USB-BT211 did the trick

But that took some doing. I quickly learned I had to uninstall old versions of drivers before installing new ones, and equally quickly learned that rolling back to the restore point created when installing an outdated driver was a better way of returning the 5552 to a pristine state than relying on the uninstall code that the driver writers furnished with their software.

And now, finally, we can use the 5552 as a wireless remote to drive the robot around (fun for the boy) and as a handy-dandy way to download programs from the notebook to the robot controller (convenient for us, and doesn’t require a relatively short 1 m tether for a USB Type A-to-Type B plug-in cable between the two devices, either). This was my first big Bluetooth fracas, though, and I must confess I learned an awful lot about the various available drivers, protocol stacks, and the Widcomm and BlueSoleil Bluetooth utility packages.

On the whole, I would’ve been just as happy, or perhaps even happier, to have just had the whole thing work plug-n-play as it did immediately on my HP notebook. But that’s not always the way things go in Windows World, is it?

January 13, 2012  5:19 PM

Waaaay Cool: Virtualizing Storage in Windows 8

Ed Tittel

A week ago, the latest post to the Windows 8 blog was entitled “Virtualizing storage for scale, resiliency, and efficiency.” And while the whole thing is definitely worth perusing, I want to zoom in on something that catches my particular fancy in this blog (it also caught Word and Windows guru Woody Leonhard’s attention in the latest Windows Secrets newsletter, which is what initially brought it to my attention; his piece is called “Storage Spaces might be Win8’s best feature yet”).

But first, I have to explain why I think the Storage Spaces feature is so awesome, by hearkening back to another hobby horse of mine — namely, Windows Home Server, aka WHS (I’ve blogged five times on this platform right here: once in 2008, and twice each in 2009 [1,2] and 2011 [1,2]). Turns out that Windows Home Server used to support a feature called Drive Extender that essentially allowed you to shrink or grow a multiple-disk logical volume simply by adding or removing hard disks on the system. No need for migration, recopying, partition re-sizing, Disk Management, and so forth: just a few commands to tell the file system what’s going on, and smooth sailing throughout.

I was incredibly bummed when it became clear that WHS 2011 (the latest and greatest version, out last year) did not continue earlier versions’ support for this feature, and thought perhaps we might never see its like again in the Windows World. I was wrong! In Windows 8, Storage Spaces lets users set up logical volumes that are bigger than the combined disk space provided for them, so you can grow the physical space over time to match the logical definition (Microsoft calls this “thin provisioning”). Here’s how Rajeev Nagar, a group manager for Microsoft’s Windows 8 Storage and File System team, explains what Storage Spaces can do:

Organization of physical disks into storage pools, which can be easily expanded by simply adding disks. These disks can be connected either through USB, SATA (Serial ATA), or SAS (Serial Attached SCSI). A storage pool can be composed of heterogeneous physical disks – different sized physical disks accessible via different storage interconnects.

Usage of virtual disks (also known as spaces), which behave just like physical disks for all purposes. However, spaces also have powerful new capabilities associated with them such as thin provisioning (more about that later), as well as resiliency to failures of underlying physical media.

And if and when you exceed the original maximum allocation for this storage pool, Storage Spaces simply notifies you in a pop-up window that you need to add more capacity, which you can do by plugging in more disks, then allocating them to that pool. Even cooler, pools can be designated as mirrored, so that there will always be at least two (possibly three) copies of data on different physical disks within that pool. And should any disk fail, Storage Spaces automatically regenerates data copies for all affected holdings as long as enough alternate physical disks remain available within that pool. It’s not RAID, but it is fault-tolerant and resilient, and a great technology for all kinds of SOHO and end-user applications. The parity designation also permits data to be reconstructed in the face of physical disk failures. Parity spaces work best for big, relatively static files (like music, movies, home videos, and so forth) that are written sequentially and require little or no updating afterward.

Here’s how to work with Storage Spaces. You can use Windows PowerShell commands to create a storage pool and to set up Storage Spaces that reside in that pool, or you can use the Storage Spaces item in Control Panel to go through that process. This means selecting drives, then naming a storage space, assigning a drive letter, establishing the layout (default, mirrored, or parity), and assigning a maximum size. Afterward, you can always add more drives as needed, or likewise make changes to the configuration. Right now, the Win8 Developer Preview only supports storage spaces up to 2 TB in size, but when the Win8 Beta ships in February (next month, that is) you’ll be able to create storage spaces of arbitrary size.
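
To make the thin-provisioning behavior concrete, here’s a toy Python model of the idea as the post describes it (the class, numbers, and messages are all my own illustration, not Microsoft’s implementation): the logical size of a space can exceed the pool’s physical capacity, and the pool only asks for more disks when real allocations approach what’s physically present.

```python
# Toy model of a thinly provisioned storage pool: logical size can exceed
# physical capacity; adding disks later satisfies pending demand.

class StoragePool:
    def __init__(self, disks):
        self.physical = sum(disks)   # GB physically present
        self.used = 0                # GB actually allocated so far

    def add_disk(self, size):
        """Plug in another disk and fold it into the pool."""
        self.physical += size

    def allocate(self, size):
        """Consume real capacity; signal when the pool needs more disks."""
        if self.used + size > self.physical:
            return "add more capacity"   # Win8 surfaces this as a pop-up
        self.used += size
        return "ok"

pool = StoragePool([1000, 1000])      # 2 TB physical
logical_size = 10_000                 # 10 TB thinly provisioned space

print(pool.allocate(1500))            # ok
print(pool.allocate(800))             # add more capacity
pool.add_disk(2000)                   # grow the pool to match demand
print(pool.allocate(800))             # ok
```

The appeal is exactly what Drive Extender offered: capacity planning becomes "add a disk when asked" rather than re-partitioning and copying data around.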

What a great feature to bring to the Windows desktop. This may indeed be a true “killer feature” that prompts users to upgrade to the new OS to exploit its capabilities, particularly those with big media collections!
