Windows Enterprise Desktop


January 20, 2012  4:14 PM

Thurrott & Rivera on Windows 8 Logo Requirements

Ed Tittel

In his most recent monthly column for WindowsITPro entitled “The Rigor of Windows 8 Hardware Certification Requirements, …” Windowsmeister Paul Thurrott provides a snapshot of the current requirements that Microsoft intends to levy on those seeking to put the Windows 8 logo decal on their PCs, notebooks, tablets, and smartphones. It’s an interesting and detailed collection of info, based in turn on Rafael Rivera’s equally interesting WithinWindows blog entitled “Windows 8 Secrets: PC and Device Requirements.” Both are worth reading in their entirety; here I’ll summarize some of the most interesting high points.

A mock logo for Windows 8

  • 5-point digitizers: Windows 8 touch PCs must support at least 5 touch points (five fingers’ worth, guaranteeing support for complex multi-touch swipes and gestures; a quick way to check what your current hardware reports appears in the sketch after this list).
  • NFC “touch marks”: Near field communication (NFC) lets devices interact with each other when they’re so close as to be nearly touching, and is key to emerging technologies for payment transfers, especially from smartphones to cash registers (or computers acting as same).
  • 5 required hardware buttons: devices must provide buttons for power, rotation lock, the Windows key, and volume up/down controls. Also, domain-joined devices that lack keyboards must support Windows Key + Power as an alternative to the Ctrl+Alt+Del key sequence (the infamous “three-fingered salute,” for those who remember DOS and early Windows versions).
  • Minimum component requirements include 10 GB of free storage space after Win8 is installed, support for UEFI (see my 9/23/2011 blog for more info), WLAN, Bluetooth with Low Energy (LE) support, Direct3D 10 graphics with a WDDM 1.2 driver, 1366×768 screen resolution or better, a 720p camera, an ambient light sensor, a magnetometer, an accelerometer, a gyroscope, at least one USB 2.0 controller and exposed port (or better), and audio speakers. Sounds like a decent smartphone or tablet, eh?
  • Graphics drivers must be upgradeable without requiring a reboot (easier to do thanks to dropping XDDM drivers and keeping only WDDM drivers for Windows 8).
  • 2-second resume (does not apply to ARM-based devices yet) from Standby (S3) to “resume complete” status. Rivera speculates that “Microsoft simply doesn’t have enough data in this space” to impose the same restriction on ARM as on x86; Thurrott speculates that “it will be added in a future release, such as Windows 9.”
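
Curious whether a given PC already meets the touch-point requirement? A quick and entirely unofficial way to check is to ask the GetSystemMetrics API for SM_MAXIMUMTOUCHES (metric index 95). Here’s a rough PowerShell sketch using P/Invoke; it only reports what the installed digitizer driver claims, so treat the number as informational:

    # Ask Windows how many simultaneous touch points the digitizer supports (SM_MAXIMUMTOUCHES = 95)
    Add-Type -Namespace Win32 -Name Touch -MemberDefinition '[DllImport("user32.dll")] public static extern int GetSystemMetrics(int nIndex);'
    [Win32.Touch]::GetSystemMetrics(95)   # 0 means no touch digitizer is present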

Veeeeeeeeeeeeeeery interesting. Be sure to check out the original posts, too!

January 18, 2012  4:18 PM

Windows 8’s “next generation file system” gets a preview

Ed Tittel

Building Windows 8 strikes again, this time with a 1/16/2012 blog from Surendra Verma, development manager for the Windows 8 Storage and File system team. It’s entitled “Building the next generation file system for Windows: ReFS.” The ReFS acronym is expanded in the FAQ that follows the blog post to mean “Resilient File System;” that same FAQ also documents some awe-inspiring maximum file system attributes.

What’s New About ReFS?

Verma takes several cuts at answering this question without necessarily addressing it completely or explicitly, so I’m doing my best to read between the lines here. It appears that ReFS comes with a new engine for file access and management, and also includes enhanced verification and auto-correction facilities. In Verma’s words, ReFS is optimized for “extreme scale” to “use scalable structures for everything.” Interestingly, a ReFS volume will never be taken offline: even when corruption occurs, unaffected volume elements will remain available and accessible. And a new resiliency architecture (when used in concert with Storage Spaces) will help to protect and preserve volume contents. Verma’s list of key features also highlights plenty of new capability (a quick sketch of what working with ReFS might look like follows the list):

  • Metadata integrity with checksums
  • Integrity streams providing optional user data integrity
  • Allocate on write transactional model for robust disk updates (also known as copy on write)
  • Large volume, file and directory sizes
  • Storage pooling and virtualization makes file system creation and management easy
  • Data striping for performance (bandwidth can be managed) and redundancy for fault tolerance
  • Disk scrubbing for protection against latent disk errors
  • Resiliency to corruptions with “salvage” for maximum volume availability in all cases
  • Shared storage pools across machines for additional failure tolerance and load balancing
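
None of this is exposed in the Developer Preview bits I’m running, but assuming the Format-Volume and Get-Volume cmdlets from the Storage module that ships in the server-side Windows 8 builds, here’s a minimal sketch of how little the day-to-day surface changes when you put ReFS on a data volume (the R: drive letter is just a placeholder):

    # Format an empty data disk (here, R:) with ReFS instead of NTFS
    Format-Volume -DriveLetter R -FileSystem ReFS -NewFileSystemLabel "ReFS-Test"

    # Confirm the file system the volume now reports
    Get-Volume -DriveLetter R | Select-Object DriveLetter, FileSystem, Size, SizeRemaining

To applications, the result is just another drive letter, which is exactly the point Verma keeps making.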

Some changes underneath, but not much change on top

What Remains in Common between ReFS and NTFS?

Not surprisingly, Microsoft has also elected to “maintain a high degree of compatibility with a subset of NTFS features that are widely adopted” although they do intend to “deprecate other … [features] … that provide limited value at the cost of system complexity and footprint.” This means that BitLocker encryption, access-control lists (ACLs), the USN journal, change notifications, symbolic links, junction, mount and reparse points, volume snapshots, file IDs, and oplocks will remain the same as those used in NTFS.

However, there will be no conversion utility to transform NTFS-formatted volumes into ReFS. Instead, Microsoft advocates copying data from NTFS onto ReFS to make the change, which also allows the resulting ReFS volumes, file collections, and data sets to enjoy the benefits of the new ReFS structures, maximum sizes, and resiliency. For the time being, ReFS will not be supported on boot volumes in Windows 8 (though I’ve seen some discussions of making ReFS volumes bootable for the upcoming version of Windows Server due out late 2012 or early 2013).
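
There’s no magic to the copy-based migration, either; something as mundane as robocopy will do the job, as in this rough sketch (D: is assumed to be an existing NTFS data volume and R: a freshly formatted ReFS volume, both placeholders; expect warnings or omissions for anything that depends on the dropped NTFS features discussed below):

    # Mirror an NTFS data tree onto a ReFS volume; /MIR mirrors, /COPY:DAT copies data, attributes, and timestamps
    robocopy D:\Data R:\Data /MIR /COPY:DAT /R:1 /W:1 /LOG:C:\Temp\ntfs-to-refs.log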

What’s Gone from ReFS?

Verma does address this as “What semantics or features of NTFS are no longer supported on ReFS?”

The NTFS features we have chosen to not support in ReFS are: named streams, object IDs, short names, compression, file level encryption (EFS), user data transactions, sparse, hard-links, extended attributes, and quotas.

This is an interesting grab-bag of features, among which there are some obvious items: EFS was never that popular, and had enough problems to make lots of enterprise users steer clear; short names have been passé for 10 years or more; and named streams (aka “alternate data streams”) have posed some clear and present security dangers. Extended attribute elements (along with object IDs) support the distributed link tracking service, which can be convenient but also had some problems, as did sparse files and hard links. It’s a little surprising to see file compression and quotas going away, and user data transaction handling was never much used anyway.

Once again, it’s “out with the old, and in with the new.” This should give Mark Russinovich and his colleagues something interesting to update in the next edition of Windows Internals, which should give me an opportunity to understand the impact of these changes in more detail.


January 16, 2012  11:47 PM

Compatibility Rears Its Ugly Head for Bluetooth

Ed Tittel

Ah, life can be strange when trying to get devices to play nice together. In this particular case, I decided to add a Bluetooth dongle to my son’s Acer 5552 notebook PC. With its AMD Athlon II P340 Dual-Core CPU, 4 GB RAM, and a Western Digital 5,400 RPM 250 GB hard disk, it’s a real little workhorse that matches my old reliable Dell D620 in overall performance and capability, but the killer deal I got from TigerDirect didn’t include Bluetooth in its $317 price tag.

The Acer 5552 is a modest but capable budget notebook PC

If you read my IT Career JumpStart blog, you may recall that the one dated 12/26/2011 recounted some interesting adventures in getting my son Gregory’s Lego MindStorms robot kit up and going. In the weeks that have followed, he and I have added sensors to his robot, and have spent a number of happy and busy hours programming the machine to do all kinds of interesting stuff. When I discovered that the NXT 2.0 controller in his kit supports Bluetooth, I naively assumed that it would be easy to get the notebook and the robot communicating with each other, especially when I was able to get the built-in Bluetooth on my HP dv6t traveling notebook and the NXT 2.0 controller working more or less instantly (after puzzling through the often cryptic menu navigation on the controller’s 2×3″ LCD screen, that is).

But wow, was I ever surprised by how much time, effort, and running around (with a certain amount of blue language for my Bluetooth issues along the way) was needed to get things working with a plug-in USB dongle on the Acer notebook. I started out with the TrendNet TBW-106UB USB Class 1 Bluetooth v2.0 and v2.1 with EDR dongle I’d purchased for an HP course I revised in early 2010 that required me to write about Bluetooth printer connections (which I can now confess I plugged into a Dell 968 All-in-One Printer rather than an HP unit). But alas, though the TrendNet dongle does Plug-n-Play faultlessly on the printer, I couldn’t get it to work with the Acer 5552. I messed with driver after driver, and even went down another rathole related to an update to the BlueSoleil software that TrendNet supplied with the dongle.

The TrendNet TBW-106UB

Later, in reviewing the NXT documentation, I learned that its Bluetooth works only with the generic Microsoft and the Widcomm Bluetooth stacks (the BlueSoleil stack is specifically identified as incompatible, alas). So off I went to the Fry’s website to see what kinds of USB Bluetooth dongles they had in stock at my local store, and ended up grabbing a nice Asus USB-BT211 from them for $14.99 (with a $5.00 mail-in rebate that brought the price down to a paltry $10, or about $17 less than the BlueSoleil software I purchased to update the TrendNet unit, only to learn I couldn’t use it with the NXT robot anyway…sigh). I was able to update the drivers all the way up to the most current March 2011 Atheros AR3011 version available from the DriverAgent website, too, and keep both the notebook and the NXT 2.0 controller happy with the dongle’s Bluetooth capability.

The cheaper Asus USB-BT211 did the trick

But that took some doing. I quickly learned that I had to uninstall old versions of drivers before installing new ones, and just as quickly learned that rolling back to the restore point created when an outdated driver gets installed was a better way of returning the 5552 to a pristine state than relying on the uninstall code that the driver writers furnished with their software.
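
Incidentally, if you’d rather not depend on a driver installer to create that restore point for you, you can set one manually before experimenting. On a Windows 7 box, this one-liner from an elevated PowerShell prompt does the trick (the description text is just a placeholder):

    # Create a manual restore point before fiddling with Bluetooth drivers
    Checkpoint-Computer -Description "Before Bluetooth driver install" -RestorePointType DEVICE_DRIVER_INSTALL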

And now, finally, we can use the 5552 as a wireless remote to drive the robot around (fun for the boy) and as a handy-dandy way to download programs from the notebook to the robot controller (convenient for us, and it doesn’t require a relatively short 1 m USB Type A-to-Type B cable tethering the two devices, either). This was my first big Bluetooth fracas, though, and I must confess I learned an awful lot about the various available drivers, protocol stacks, and the Widcomm and BlueSoleil Bluetooth utility packages.

On the whole, I would’ve been just as happy, or perhaps even happier, to have just had the whole thing work plug-n-play as it did immediately on my HP notebook. But that’s not always the way things go in Windows World, is it?


January 13, 2012  5:19 PM

Waaaay Cool: Virtualizing Storage in Windows 8

Ed Tittel

A week ago, the latest post to the Building Windows 8 blog was entitled “Virtualizing storage for scale, resiliency, and efficiency.” And while the whole thing is definitely worth perusing, I want to zoom in on something that catches my particular fancy in this post (it also caught Word and Windows guru Woody Leonhard’s attention in the latest Windows Secrets newsletter, which is what initially brought it to my attention; his piece is called “Storage Spaces might be Win8’s best feature yet”).

But first, I have to explain why I think the Storage Spaces feature is so awesome, by hearkening back to another hobby horse of mine, namely Windows Home Server, aka WHS (I’ve blogged five times on this platform right here: once in 2008, and twice each in 2009 [1,2] and 2011 [1,2]). Turns out that Windows Home Server used to support a feature called Drive Extender that essentially allowed you to grow or shrink a multiple-disk logical volume simply by adding or removing hard disks on the system. No need for migration, recopying, partition re-sizing, Disk Management, and so forth: just a few commands to tell the file system what’s going on, and smooth sailing throughout.

I was incredibly bummed when it became clear that WHS 2011 (the latest and greatest version, out last year) did not continue earlier versions’ support for this feature, and thought perhaps we might never see its like again in the Windows World. I was wrong! In Windows 8, Storage Spaces lets users set up logical volumes that are bigger than the combined disk space provided for them, so you can grow the physical space over time to match the logical definition (Microsoft calls this “thin provisioning”). Here’s how Rajeev Naga, a group manager for Microsoft’s Windows 8 Storage and File System team, explains what Storage Spaces can do:

Organization of physical disks into storage pools, which can be easily expanded by simply adding disks. These disks can be connected either through USB, SATA (Serial ATA), or SAS (Serial Attached SCSI). A storage pool can be composed of heterogeneous physical disks – different sized physical disks accessible via different storage interconnects.

Usage of virtual disks (also known as spaces), which behave just like physical disks for all purposes. However, spaces also have powerful new capabilities associated with them such as thin provisioning (more about that later), as well as resiliency to failures of underlying physical media.

And if and when you exceed the original maximum allocation for this storage pool, Storage Spaces simply notifies you in a pop-up window that you need to add more capacity, which you can do by plugging in more disks, then allocating them to that pool. Even cooler, pools can be designated as mirrored so that there will always be at least two (possibly three) copies of data on different physical disks within that pool. And should any disk fail, Storage Spaces automatically regenerates data copies for all affected holdings as long as enough alternate physical disks remain available within that pool. It’s not RAID, but it is fault-tolerant and resilient, and a great technology for all kinds of SOHO and end-user applications. A parity designation likewise permits data to be reconstructed in the face of physical disk failures; parity spaces work best for big, relatively static files (music, movies, home videos, and so forth) that are written sequentially and require little or no updating of existing files.

Here’s how to work with Storage Spaces. You can use Windows PowerShell commands to create a storage pool and to set up Storage Spaces that reside in that pool, or you can use the Storage Spaces item in Control Panel to go through that process. This means selecting drives, then naming a storage space, assigning a drive letter, establishing the layout (default, mirrored, or parity), and assigning a maximum size. Afterward, you can always add more drives as needed, or likewise make changes to configuration data. Right now, the Win8 Developer Preview only supports storage spaces up to 2 TB in size, but when the Win8 Beta ships in February (next month, that is) you’ll be able to create storage spaces of arbitrary size.
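
For the PowerShell route, here’s a minimal sketch of how pool and space creation looks, assuming the Storage module cmdlets that ship with the finished Windows 8 bits rather than anything in the Developer Preview; the pool and space names are just placeholders, and parameters may shift before release:

    # Round up every disk that is eligible to join a pool
    $canpool = Get-PhysicalDisk -CanPool $true

    # Create a pool from those disks on the stock "Storage Spaces" subsystem
    New-StoragePool -FriendlyName "HomePool" -StorageSubSystemFriendlyName "Storage Spaces*" -PhysicalDisks $canpool

    # Carve out a thinly provisioned, mirrored space bigger than today's raw capacity
    New-VirtualDisk -StoragePoolFriendlyName "HomePool" -FriendlyName "MediaSpace" `
        -ResiliencySettingName Mirror -ProvisioningType Thin -Size 10TB

    # Initialize, partition, and format the new space like any other disk
    Get-VirtualDisk -FriendlyName "MediaSpace" | Get-Disk |
        Initialize-Disk -PassThru |
        New-Partition -AssignDriveLetter -UseMaximumSize |
        Format-Volume

The thin-provisioning piece is the -ProvisioningType Thin plus the oversized -Size argument: the space advertises more capacity than the pool physically holds, and you add disks as it fills up.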

What a great feature to bring to the Windows desktop. This may indeed be a true “killer feature” that prompts users to upgrade to the new OS to exploit its capabilities, particularly those with big media collections!


January 11, 2012  11:15 PM

SmartDeploy Does “Drivers and Deltas” RIGHT for OS Images

Ed Tittel

On November 28, I wrote a blog entitled “SmartDeploy Helps Ease Heterogeneous OS Situations,” in which I promised a follow-up early this year to provide more technical details about how the company makes a science of the slippery art of matching Windows drivers to Windows OS installations. I got my info fix on this topic, and more, in a conversation with SmartDeploy’s Director of Sales, Spencer Dunford (an uncommonly technical sales guy, if ever I ran across one), and his colleague, Senior Systems Engineer Erik Nymark, this past Monday (January 9, 2012).

The patented technology that SmartDeploy uses for its driver magic makes perfect sense, and probably eliminates a lot of painstaking work and serious reverse engineering, by building on the great work that Microsoft has done with the Windows Pre-Installation Environment (aka WinPE) and its boon companion, the Windows Imaging (WIM) format and container mechanism that Windows Vista, Windows 7, and Server 2008 use to manage OS installation and packaging of image information for same.

Prowess SmartDeploy uses WIM as a storage format and data container for all of its packaging and image presentation methods. It has worked its way deeply enough into WIM internals, in fact, that it does the following to deliver WIM data elements for use in actual deployments (where individual components get combined into a valid WIM container that’s used as the focus for image installations):

  • Hardware/device drivers are broken out into a separate component file that can be built and managed independently of OS files and components. These are known as “platform packs” and confer some interesting advantages: they are based on device enumeration from a reference image, and therefore need only contain those drivers that a particular Windows installation actually needs, rather than the sizable galaxy of possible device drivers that any foreseeable Windows installation might need. For example, the DriverStore folder in my Windows 7 installation is 1.07 GB in size (which represents possible drivers) while the Drivers folder is under 58 MB (which represents the drivers it’s actually using); a quick way to run this comparison on your own machine appears in the sketch after this list. Essentially, the SmartDeploy approach allows the OS build to zero in only on the drivers it needs, and omit everything else, to achieve considerable image size reductions.
  • The OS and other components travel along for deployment as well, and remain logically and physically separate from the driver stuff. This makes it a lot easier to mix and match various platform packs with the standard OS portion to support environments where as many as 60 or 70 different reference images must be supported.
  • Perhaps most interesting, SmartDeploy can also support “delta WIMs” so that an existing image can be merged with a set of changes (new updates, service packs, and so forth) without having to transport the entire image from a deployment server to its deployment targets. Thus, instead of moving typical images that can be anywhere from 5 to 25 GB in size to replace an old image with a new one, the existing image can be updated in situ with the SmartDeploy tools and laid down to replace its previous incarnation entirely in place. This often means that delta WIMs of 500 MB to no more than a couple of GB in size will do the job quite nicely.
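
Here’s the folder-size comparison from the first bullet as a quick PowerShell sketch you can run on any Windows 7 box; it simply totals the bytes under each folder, so expect the exact numbers to vary from machine to machine:

    # Sum the size of every file under a folder, in MB
    function Get-FolderSizeMB($path) {
        $bytes = (Get-ChildItem -Path $path -Recurse -Force -ErrorAction SilentlyContinue |
                  Where-Object { -not $_.PSIsContainer } |
                  Measure-Object -Property Length -Sum).Sum
        [math]::Round($bytes / 1MB, 1)
    }

    # DriverStore holds every driver package staged on the system...
    Get-FolderSizeMB "$env:SystemRoot\System32\DriverStore"
    # ...while the drivers folder holds the driver binaries actually installed for use
    Get-FolderSizeMB "$env:SystemRoot\System32\drivers"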

Misters Dunford and Nymark also gave me some very cool demos of these technologies at work, and performed a live update of a Windows 7 Professional image with a delta WIM that took less than 15 minutes to complete, and even less time to set up and stage for deployment. SmartDeploy makes excellent use of wizards to structure initial images, and to build deltas upon those images, and also includes some slick automation for creating answer files to perform post-install tasks for adding applications and other runtime components to OS images as they get deployed to target machines.

The biggest benefit of their tools and technology is that, according to Dunford, they “…have seen a change in how frequently and how often images are updated, and patches or changes are deployed in image format.” Anything that makes this kind of job easier and faster to accomplish is definitely worth digging into. Check out the SmartDeploy website for more information.


January 9, 2012  4:04 PM

Interesting Detour into Legacy Land: FAT-16 and SpinRite

Ed Tittel

Among the many elements in my diagnostic and repair toolkit, I have a licensed copy of Steve Gibson’s SpinRite v6.0, purchased in 2006 (I still have a copy of the receipt in my email archives, and I’m glad I did, for reasons I’ll elaborate upon later). I’ve been messing around with a very nice little 2.5″ drive cage that parks itself in a standard 5.25″ desktop PC drive bay (see this blog at edtittel.com, “Great Product for Recycling 2.5″ Notebook Drives,” for more info on that recent adventure). This led to some fiddling about with those recycled drives to transform them from OEM and/or home-built system drives with repair and hidden partitions into standard, single-partition data drives. In turn, this led to the need for some low-level repair on one of the drives involved (luckily for me it was in the disk area devoted to a repair/recovery partition that I had never used, or it surely would’ve made a bad situation worse). And finally, to close this circle, that’s what led me back to SpinRite.

But, as usual with Windows, there were a couple of interesting flies in the ointment to contend with. One is apparently a problem with my production desktop (a prospect that never fails to darken my day), where it simply refused to recognize the FAT-16 bootable UFD upon which SpinRite is installed. When I stuck it into my production system to make sure the UFD was still OK, I ended up wandering into an interesting hall of mirrors. This system doesn’t see the UFD (that’s MS-jargon for USB Flash Drive, in case the acronym is unfamiliar to some readers), except as an anonymous entry in the Disk Management utility:

You can see the drive in the partition map pane below, but it's absent from the Volume list above

And although USB gives me the cheerful French horn blast when I stick the drive into that machine, I don’t get the usual drive-recognized pop-up window that asks me if I want to do something with it, either (as you’d expect, since the file system apparently doesn’t recognize the drive as a legitimate disk volume).

But where it gets interesting is that another of my 32-bit Windows 7 systems and a 64-bit system DO recognize the drive, see it as a legit volume, and happily assign it a drive letter as well:

No problem seeing the UFD as a legit named volume on other systems

Once I was able to examine the contents of the SpinRite UFD and ascertain they were still complete and correct (easy enough to do by downloading the files again and extracting them to disk to compare file names and sizes; this is where my packrat tendency to keep all receipts allowed me to grab a new copy from the Gibson Research website and do due diligence), I was ready to try repairs on my slightly wonky WD 500 GB notebook drive. (Alas, I’m going to use this apparent problem with the file system on my production desktop to spur me into the upgrade to 64-bit that I’ve been putting off for too long now. I’m not sure I know how to solve this kind of file system issue, and it’s probably easier just to wipe the system drive, upgrade from 4 to 12 GB of RAM, and move up into a more capacious working environment.)

That’s when I hit the next bump in the road: SpinRite would load FreeDOS and get going, but it would never get past the “looking for storage devices” phase of its start-up on my test machine. A little research quickly revealed that not all BIOSes are created equal, and apparently the disk BIOS that gets loaded when a PC boots in AHCI (or RAID) mode is not compatible with the BIOS services that SpinRite needs to do its thing. With a little trepidation, I switched the test machine’s controller back to IDE mode, then booted back up into the UFD. This time, SpinRite ran fine, and after about 6 hours of repair, fixed (or routed around) what ailed my errant disk drive. It seems that the AHCI and RAID BIOS developers cut enough corners in building their versions of the Basic Input-Output System that Mr. Gibson can’t use them to dig deeply and deviously into a drive’s contents on a disk controller that employs such a BIOS. After the problems I’ve had with SATA-IDE vs. SATA-AHCI booting issues, I heaved a sigh of relief after switching back to AHCI on the next reboot to my system drive and seeing everything work properly.

But in the end it all turned out to be easy enough to work around, thanks to the presence of multiple machines I could use to figure things out along the way. Now, I just have to find time to upgrade my production desktop to 64-bit Windows and life will be good. Yeah, right!


January 6, 2012  3:51 PM

Interesting Thoughts on “Starting Over” with Your Chosen OS

Ed Tittel

Earlier this week, I found one of my Windows 7 test machines in what can only be described as an “interesting state.” For whatever reason (I’d just finished updating a bunch of hardware drivers, and I suspect that one of them bit this OS installation in the hindquarters) the OS would no longer interact with Windows Update, and the file system interface through Windows Explorer was decidedly flaky. To make a long and tedious story short enough to be tolerable, I ended up rolling back to a system image dated just before the December 13 Patch Tuesday updates before that balky test PC returned to a stable and workable state. I found myself wishing for some mechanism to put my system in good order that would be better than a trial-and-error march back in time through restore points and system images until something finally clicked.

Desmond Lee, the program manager for the Windows 8 Fundamentals team, must have often entertained those same thoughts as he went about writing his January 4 post to the Building Windows 8 blog, entitled “Refresh and reset your PC.” Of course, this capability is focused on Windows 8 rather than Windows 7, but in the wake of a trying repair and recovery scenario, it was reassuring to read that my recent situation isn’t unique, and that people do wish for a quick and easy way to make this possible. In fact, Lee’s list of key objectives for the “refresh and reset” initiative is likely to strike a chord with you readers, as well as resonating forcefully with me (I quote them verbatim):

  • Provide a consistent experience to get the software on any Windows 8 PC back to a good and predictable state.
  • Streamline the process so that getting a PC back to a good state with all the things customers care about can be done quickly instead of taking up the whole day.
  • Make sure that customers don’t lose their data in the process.
  • Provide a fully customizable approach for technical enthusiasts to do things their own way.

In fact, the approach to providing this functionality in Windows 8 was to provide what Lee calls a “push-button” technique for repair and restore operations. This is by no means a new concept but no matter how urgently platform and product providers have hyped such solutions in the past, my own personal experience with such tools has been that they’re invariably simpler in concept than in practice. Sigh.

To that end, Windows 8 will apparently ship with a reset option that returns a system entirely to factory-fresh set-ups and settings, and removes all personal preferences, settings, data, and so forth, so that it can safely be turned over to a third party without incurring the risk of unwanted data disclosures. The refresh option is more interesting to me at the moment, in the wake of my recent “Where and when did things go wrong?” adventure. The idea here is to “…get the benefit of a reset – starting over with a fresh Windows install – while still keeping your stuff intact…” In this case the Windows Recovery Environment (Windows RE) copies “…your data, settings, and apps, and puts them aside…” before installing a fresh copy of Windows, then carefully restores all of that personal and personalized stuff onto a newly-installed and up-to-date copy of Windows. That said, file type associations, display settings, and Windows Firewall settings are NOT preserved, as a matter of conscious design, to avoid problems arising from mis-configurations in the previous Windows incarnation.

But alas, there’s a catch: where apps are concerned, Windows 8 will preserve only Metro style apps, “… and require desktop apps that do not come with the PC to be reinstalled manually.” In my own recent personal experience, where installing Windows and getting it ready to use may take two to four hours, re-installing all the applications I use (which varies from a low count of around 50 on test and traveling PCs to a high of 110 on my production PCs) can take the better part of a day to chunk all the way through to completion.

So, while this sounds like a partial answer to my recent problems, it’s still not a 100% solution, either. And, of course, it will be very interesting to learn how these goals translate into practice as the Windows 8 builds make their way into GA status. Stay tuned!


January 4, 2012  2:30 PM

Another take on the Win7-XP crossover point: Coming Soon!

Ed Tittel

On October 29, 2011, in a blog entitled “Win7, XP Reach Crossover Point,” I reported that StatCounter’s tracking indicated that on or about October 14, the number of PCs on the Internet running Windows 7 exceeded the number of PCs running Windows XP for the first time. Of course, not everybody agreed with those numbers (as evidenced by Ed Bott’s discussion in my November 2 blog “Windows 7 vis-a-vis Windows XP”); many actually prefer the tracking that NetMarketShare does instead, because of its global purview and better detail.

Recent reports indicate that the crossover point for the two operating systems (that is, the now-venerable, 11-year-old Windows XP that simply refuses to die, and the latest production Windows 7 OS that is still finding its legs even as the shadow of the next-and-future Windows 8 begins to make itself known) is imminent, according to NetMarketShare. Take a look at these graphs and data points:

NetMarketShare Desktop OS trends

If I read the graphs correctly, the NetMarketShare trend lines for Win 7 and XP indicate that the crossover point will occur in late February or early March, as the two numbers converge in the neighborhood of 40% share for each one. And with Win7 on the way up as Windows XP is on a slow but steady downward course, this finally puts Windows 7 in the driver’s seat.

What’s interesting to me is that different forms of measurement produce the same results (Windows 7 overtaking Windows XP) but at dates as much as half a year apart. It’s all in how you measure, what you measure, and where those numbers come from!


January 3, 2012  3:40 PM

MS Pushes Out-of-Band Security Update Over Holiday Weekend

Ed Tittel

Imagine my surprise when I sat down at my PC late morning on January 1 to see that Microsoft had pushed a security update to address “Vulnerabilities in the .NET Framework…” (MS11-100, rated Critical). This involved as many as three security patches on some of my PCs, depending on how many versions of the .NET Framework were installed on those machines (for Windows 7, patches were released for versions 3.5.1 and 4; for Windows XP, patches also appeared for versions 1.1 and 2.0, as well as 3.5 instead of 3.5.1).
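
If you’re not sure how many .NET Framework versions a given machine carries, a quick look at the well-known NDP registry key will tell you; here’s a rough PowerShell sketch (subkey names and version strings vary a bit from release to release, so treat the output as a guide rather than gospel):

    # Enumerate installed .NET Framework versions from the NDP registry key
    $ndp = 'HKLM:\SOFTWARE\Microsoft\NET Framework Setup\NDP'
    Get-ChildItem $ndp -Recurse |
        Get-ItemProperty -ErrorAction SilentlyContinue |
        Where-Object { $_.Install -eq 1 -and $_.Version } |
        Select-Object PSChildName, Version |
        Sort-Object Version -Unique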

Headline for out-of-band MS security patch

Because this update addresses zero-day vulnerabilities in ASP.NET that can lead to denial of service and elevation of privilege for rogue requests, this is a patch that admins will want to fast-track through their internal testing and deployment procedures. Ryan Naraine (of the Zero Day ZDNet blog) explains why this one is worth special treatment in his post this morning entitled “Microsoft ships emergency .NET fix to thwart hash table collision attacks.” See KB article 2638420 for a list of “known issues” related to deploying this particular security patch (basically, it involves updating all servers that use ASP.NET authentication tickets concurrently, because the pre- and post-patch mechanisms are incompatible).


December 26, 2011  1:26 AM

An AHCI Dilemma Resolved at Long Last

Ed Tittel

This past summer, I had to rebuild one of my test Windows machines when the original motherboard went south. When doing the rebuild, the machine crashed during the reboot part-way through the install process, and forced me to switch from AHCI to IDE mode for the disk drives I was using, even though they’d worked fine in AHCI on the previous motherboard. I wrote it off to some issue with AHCI and my combination of parts, and simply stayed with the switch to IDE and figured that would be the end of it.

But when I was finally able to get the proper IDE drivers loaded for that machine this weekend, I decided to revisit the AHCI issue on this Gigabyte P43-ES3G. I like to tinker with my systems over the holidays, and have just finished a major patch-fix-upgrade-and-repair pass over all my PCs; the disk controller stuff all started working when I switched the BIOS from “emulate IDE” to “straight SATA” just to see what would happen. IDE kept working, but DriverAgent was suddenly able to help me find the right, current drivers for the Intel ICH10R chipset and the JMicron JMB36X SATA/IDE controller on the P43-ES3G. “What the heck,” I figured, “Let’s try AHCI now, and see what happens.” It still kept hanging during drive detect while booting.

With some assiduous poking around, I discovered that others have had issues with the BIOS hanging during drive detect just as I have, on a variety of Asus as well as Gigabyte motherboards. As it happens, the reason I had to switch from AHCI to IDE drive mode was that drive detection would hang on the second of the two drives in that system (a Samsung 1 TB HD103UI 7200 RPM hard disk) after correctly detecting the WD 300 GB Raptor that serves as the boot drive, but before detecting the presence of the SATA DVD burner on the system.

Just for grins, I tried a different drive instead of that Samsung yesterday while fiddling with the machine, after which AHCI booted like a charm. Upon further investigation, I came across a posting on social.technet.microsoft.com from a gentleman named Ivan Filippov, who just happens to work for German-based disk formatting and partitioning company Paragon Software (whose Hard Disk Manager Suite has long been a favorite of mine), that explains a possible cause for this situation. He observes that when the disk geometry data in the first partition on a drive gets munged, it can cause disk recognition at the BIOS level to fail. The two other drives I tried in place of the original Samsung apparently didn’t have this problem: they were recognized and the AHCI BIOS loaded successfully. And so was the Samsung itself, after I popped it into a SATA drive caddy on another machine (after backing it up, of course) and repartitioned and reformatted that drive.
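
If you ever want to eyeball what Windows itself reports for each drive’s geometry and partition count (a handy sanity check when you suspect a munged partition table, as Filippov describes), a quick read-only WMI query works on any Windows 7 box:

    # Dump basic geometry and partition info for every physical drive
    Get-WmiObject Win32_DiskDrive |
        Select-Object Model, InterfaceType, Partitions,
            @{Name='SizeGB'; Expression={[math]::Round($_.Size / 1GB, 1)}},
            TotalCylinders, TotalHeads, SectorsPerTrack, BytesPerSector |
        Format-Table -AutoSize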

Problem solved, and I learned something both interesting and valuable. I should’ve just tried another drive (I’ve always got at least a couple of spares around, sometimes more than that) when I first hit this problem and it pretty much would have solved itself. And so it goes! Another obscure but interesting Windows lesson learned, and another pesky annoyance rubbed out at last…

[Note to readers: I'm taking the rest of the year off, so you won't see me post again until January 2, 2012. Let me take this opportunity to wish everyone a happy holiday season, and a festive and prosperous New Year!]

