Here’s a cheery bit of news from our friends over at MaximumPC. The story’s headline “IDC: Happy Windows 7 PC Users Won’t Switch to Windows 8” pretty much says it all. Quoting from ZDNet’s Mary-Jo Foley, the operative phrase is that “Windows 8 will be ‘largely irrelevant’ to traditional PC users.” The interior quotes are from an actual IDC report, which costs $3,500, which explains why none of us underpaid Windows hounds have the necessary scratch to obtain and quote directly from the real source. It’s entitled “Worldwide System Infrastructure Software 2012 Top 10 Predictions,” from which IDC VP Al Gillen tweeted the net-net items on December 2:
01. Customers Face Confusing Choices as Virt, Cloud System SW and Information Automation SW Converge.
02. Private Clouds Will Grow Like Gangbusters, One Use Case at a Time.
03. 2012 Will Be VMware’s Last Year as King of the Hill. Competition Starts Squeezing Hard in 2013. No More Green Field for VMware.
04. Operational Complexity Will Drive Demand for Predictive Analytics and APM.
05. Consumerization of IT Will Create New Management Challenges and Solutions.
06. Platform as a Service Will Ramp Up Slowly Due to Lock-in Fears.
07. Battle Royale Will Be Waged to Establish Linux Kernel of Cloud Computing.
08. Enterprises Will Reconsider Benefits of Infrastructure Heterogeneity.
09. There Will Be Layers for the Masses, Stacks for the Few.
10. Expect big success for WS8, Win Client 8 a skip-over for desktop users. Uphill battle for WC8 mobile space.
Item 10, of course, is what caught MJF’s eye and produced this magnificent quote from the actual report: “Windows 8 will be largely irrelevant to the users of traditional PCs, and we expect effectively no upgrade activity from Windows 7 to Windows 8 in that form factor.” IDC believes that Windows 8 will become generally available no later than August 2012, at which point PCs with Win8 pre-installed may be sold. I take this to mean that while new PCs may come bearing Win8, users won’t be lining up at midnight to buy the upgrade media at retail outlets, or clogging the Internet to download the new OS when the floodgates open.
At the enterprise level, of course, things will lag further behind. With many organizations in the middle of migrating to Windows 7, or having just completed same in the past 12 months, enterprises aren’t exactly eager to gird their loins and “do it again.” It’s typical for early enterprise adopters to jump on new OSes about a year after release, and for the main pack to slink onto the platform a year or two after that.
And with most regular, non-enterprise users pretty happy with Windows 7 on current PCs, there aren’t a lot of obvious and compelling reasons visible just yet to propel them to upgrade. This should be an interesting launch to watch–especially for Microsoft!
OK, I cheerfully confess that I glossed over the November 21 Building Windows 8 blog posting by Christa St. Pierre of the Windows 8 Setup and Deployment team, thinking that because I’ve gone through a couple of Windows 8 pre-release installs, I didn’t really need to read it from stem to stern. I was wrong: that experience gave me some appreciation for how the process works and how it compares to Windows 7, but the decision to “just skip it” deprived me of some valuable background information and insight into how the Windows 8 install process has been redesigned. Perhaps I should have paid closer attention to that posting’s title: “Improving the setup experience.”
But thanks to Jon Brodkin over at Ars Technica (one of my favorite sources for inside information, insight, and breaking Windows news and rumors) I started to understand more about what was going on with this part of the Windows 8 experience. The title of his article, in fact, captures the most salient points: “Windows 8 gets faster installation, 11-click upgrade for casual users.” So, plowing through this piece, I discovered how the Windows 8 install process has been substantially streamlined: what once required multiple downloads, four separate programs, and a trek through 60 screens of information is now condensed into a single program that may require working through as few as 11 screens. By itself, that’s pretty remarkable (and this is coming from somebody who’s had to write up detailed install descriptions and instructions for every version of Windows since 3.1, way back in the mid-1990s).
There’s also a fascinating discussion of the upgrade process (Windows 7 to Windows 8, in this case) that explains how the number of files and applications installed can affect overall install completion time. Because Windows 7 stages files for upgrading, it ends up copying files twice during that process, even though the bulk of such moves start and end in the very same folder! Simply put, Windows 8 moves what it must more expeditiously at the folder level when moves are required, skips unnecessary moves, and uses hard links to move things logically in the file system without actually moving stuff around on disk. This cuts upgrade time for large, complex installations from 188 minutes to 46 minutes on systems with 430K files and 90 apps (about a 75% improvement), and from 513 minutes to 52 minutes on systems with 1.44M files and 120 apps (almost 90% better). Wow!
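The hard-link trick is easy to demonstrate outside of Windows setup. Here’s a minimal Python sketch (my own illustration, not Microsoft’s actual upgrade code) showing that a hard link is just a second directory entry pointing at the same on-disk data, so a file can be “relocated” without copying a single byte:

```python
import os
import tempfile

# A hard link gives a file a second name that references the same data,
# so an upgrade engine can "move" a file instantly instead of copying it.
workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "original.dll")
with open(src, "wb") as f:
    f.write(b"\x00" * 1024)  # stand-in for real file content

dst = os.path.join(workdir, "staged.dll")
os.link(src, dst)  # near-instant: no data is copied

src_stat, dst_stat = os.stat(src), os.stat(dst)
print(src_stat.st_ino == dst_stat.st_ino)  # True: both names share one inode
print(src_stat.st_nlink)                   # 2: the data now has two names
```

Either name can then be deleted without touching the underlying data, which is exactly what makes staging “moves” nearly free compared to Windows 7’s double copying.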
The blog goes on to explain how Web delivery has been optimized, through better compression, elimination of duplicate OS files, and smarter download behaviors. It closes with a nice overview of the Windows 8 ADK (Assessment and Deployment Kit, which replaces the old Windows Automated Installation Kit or WAIK), along with some best practices info on building Answer Files to automate bulk and remote installs.
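For readers who haven’t worked with answer files before, here’s a heavily simplified, illustrative XML fragment of the general shape they take. This is an assumption-laden sketch, not a complete or valid schema: real answer files carry full component attributes (processor architecture, public key token, and so on) and are typically generated with a tool such as Windows System Image Manager rather than written by hand.

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- Illustrative fragment only; not schema-complete. -->
<unattend xmlns="urn:schemas-microsoft-com:unattend">
  <settings pass="windowsPE">
    <component name="Microsoft-Windows-Setup">
      <UserData>
        <!-- Pre-answer the prompts an interactive install would show -->
        <AcceptEula>true</AcceptEula>
        <FullName>Test Lab</FullName>
      </UserData>
    </component>
  </settings>
</unattend>
```

The idea is that each “configuration pass” of setup reads its own settings section, so a fully populated answer file can carry an install from boot media to a ready desktop with no clicks at all.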
This post is definitely worth reading, and pondering carefully, as you start thinking about test lab, pilot, and ultimately, production roll-outs of Windows 8. Be sure to check it out!
The word on the street is apparently out, and it’s claiming that we’ll see a large-scale beta release of Windows 8 sometime in January or February 2012. This is not too ridiculous to believe, by any means, considering that the developer preview made its debut around September 14. Among many publications proffering such guesstimates, PC World’s story “Windows 8 Beta Expected in Early 2012” gets my vote.
I’m glad to have an approximate time-frame for this upcoming release, because I want to build a new PC with a UEFI BIOS and a touchscreen in time to install the Windows 8 beta shortly after it’s let go. Looks like I’ve still got some time left to put together a new configuration. I’m guessing a lot of other readers who like to keep up with Microsoft’s release schedule may be thinking the same thing!
With the impending release of the CompTIA Storage+ credential on January 18, 2012, the Storage Networking Industry Association (SNIA) has announced that it will retire its SNIA Certified Storage Professional (SCSP) credential the day before. In fact, the SCSP home page currently starts out this way:
In checking out the latest hubbub on the Building Windows 8 blog this morning, I encountered a new post entitled “Enabling large disks and large sectors in Windows 8.” It’s the work of Bryan Matthew, a program manager on the Windows 8 Storage & File System team, and it’s definitely worth a read-through, not just for its explanation of how the file system in Windows 8 will support disks larger than 2.2 TB directly (without requiring an add-in driver like Seagate’s beyond-2TB software) but also for its explanation of where the technical limitations come from on BIOS-based systems that set the ceiling at 2.2 TB.
A big part of the answer to the support question involves another invocation of UEFI (Unified Extensible Firmware Interface) which is also the key to Microsoft’s boot-time security protection for Windows 8 as well. Here’s what Matthew has to say about UEFI and Windows 8: “Beginning with Windows 8, multiple new capabilities within Windows will necessitate UEFI. The combination of UEFI firmware + GPT partitioning + LBA allows Windows to fully address very large capacity disks with ease.” Although this is nice, it is also another indication that BIOS-based PCs–which represent over 99 percent of the current installed base, including every system I myself own–are heading for second-class status in the brave new Windows 8 world.
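The arithmetic behind that 2.2 TB ceiling is worth spelling out. Legacy MBR partition tables store sector addresses as 32-bit values, and classic disks use 512-byte sectors, so the math works out like this (a quick back-of-the-envelope check, not anything from the blog post itself):

```python
# Where the 2.2 TB ceiling comes from: 32-bit LBAs x 512-byte sectors.
max_sectors = 2 ** 32            # largest sector count a 32-bit LBA can express
sector_size = 512                # bytes per legacy sector
max_bytes = max_sectors * sector_size

print(max_bytes)                 # 2199023255552
print(max_bytes / 10 ** 12)      # ~2.199, i.e. the familiar "2.2 TB" limit

# GPT partitioning (the scheme UEFI systems boot from) uses 64-bit LBAs,
# which pushes the ceiling far beyond any disk you can buy today:
gpt_max_bytes = 2 ** 64 * sector_size
print(gpt_max_bytes / 10 ** 21)  # roughly 9.4 zettabytes
```

So the limit isn’t arbitrary; it falls straight out of the width of the address field, which is why moving past it requires GPT partitioning rather than a firmware tweak.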
I’ve recently started speccing out a UEFI-based system with a touchscreen to serve as a Windows 8 test machine, and I’ve learned that the number of UEFI-based motherboards is still very small (though it’s growing by leaps and bounds every month: when I first checked in September, I could find only 8 UEFI mobos on Newegg; this morning, the site lists 28 AMD-based and 43 Intel-based models). Though I can understand and appreciate the technical advantages of UEFI, I wonder if the world is ready to make a mass migration to a new firmware architecture simply to take best advantage of Windows 8.
I’ve got two newish notebooks and an equal number of newish desktop PCs, and I’m not planning to junk them for at least another 2-3 years. Will the need for UEFI to obtain the best Windows 8 experience force me to change my mind? I don’t know yet. But I do know this is gonna be interesting!
I’m the reasonably proud owner of a Dell M11X (Alienware) mini-laptop with an i7 26xx processor, 8 GB RAM, and an OCZ Agility 3 SSD. It runs fast enough for me, and has enough horsepower to do everything I ask of it with aplomb, including some occasionally heavy-duty protocol trace capture and analysis (Wireshark rules!). But yesterday, when I turned it on, I couldn’t help but notice that its built-in Intel WiMax 6250 module had gone bye-bye.
When the laptop booted up, it reported that it couldn’t find the WiMax module. So, I ran the Intel WiMax connection utility to try to kick it into presence. No dice. Next, I read the help file to learn that I should probably duck into Programs and Features to run a “repair” operation on the WiMax connection software. And while that went without a hitch, there was still no WiMax module to be found after my next reboot.
Then, I went into Device Manager and, sure enough, no WiMax device appeared under the Network adapters heading, in contradiction to normal behavior. So I dug into the Dell Technical Support site to see if I could find any joy from their help and advice. Zilch.
After that, I remembered seeing the WiMax module as I removed the unit’s conventional HD a couple of weeks ago to replace it with the SSD it’s now using. “Could I have jiggled an antenna wire, or loosened the module somehow?” I wondered out loud. I popped open the hatch and saw nothing amiss with the WiMax 6250 module upon visual inspection, but pushed on the antenna connections and made sure the module was tightly socketed anyway.
I’m not sure if this fixed the problem, or if the WiMax module had simply taken a short vacation, but when I rebooted the machine–Presto! The WiMax module appeared, and established its usual rockin’ connection to Clear Communications. Problem solved, with not too much muss or fuss, or time wasted.
What it reminded me of is the old adage about troubleshooting: always look at what else has changed recently on a system when components go kerflooey. Again, I can’t be 100% sure that opening the case and checking the WiMax module fixed my problem. But performing those actions did result in a return to normal function, so I’m not inclined to look this gift horse too deeply in the mouth.
I was kinda glad it turned out that way, because I’d started to investigate the Dell Technical Support channels I would have had to traverse had more serious fixes been required. The only phone number I could find was for premium, added-charge support. I’m not sure my same-day hardware replacement warranty gets me that high up the food chain, so I guess I’ll save the effort involved in climbing that learning curve for another day!
In my previous blog “The Changing Face of OS Migration: Windows 7 and Beyond” I ruminated on the state of Windows desktop deployments in modern enterprises, based on a November 16 discussion with Aaron Suzuki, CEO of Prowess, the makers of the SmartDeploy toolset for Windows operating systems and applications. In that first blog in this two-part series, I laid out the current situation that the vast majority of enterprises face on their desktops (and servers) today: a heterogeneous situation where OSes are no longer migrated in lock-step, and where the age of the hardware often dictates the OS it uses (particularly on desktops, where enterprise buyers often elect to have vendors provide them with pre-installed reference images on new machines, usually running Microsoft’s latest and greatest desktop OS).
In this blog, I’ll recount some of the specific pain points that Mr. Suzuki reported hearing from his customers, and how SmartDeploy Enterprise (and the Prowess sister product SmartMigrate) can help to address them.
Managing Device Drivers Is Key
According to Suzuki, managing device drivers often ends up being a bigger problem than managing the OS itself, and it often demands a great deal more time and effort from a maintenance perspective as well. SmartDeploy solves this problem by separating drivers both logically and physically from the OS/software image, and packaging these elements separately and distinctly. Each driver package, called a “platform pack” in SmartDeploy parlance, enables enterprises to manage drivers by hardware model, device configuration, and so forth. Some organizations may manage as many as 60 or 70 such driver packs, while others find it necessary only to define and manage a dozen or two. Once a driver pack is defined, the regular SmartDeploy management console handles its application and use simply and directly. Platform packs ensure that the right drivers are available at deployment time, and help ease the most common (and vexatious) issues that can occur during actual roll-outs.
Single Point of Management and Control for All Windows Machines
Easy definition and management of drivers, plus OS and application images, enables SmartDeploy users to manage multiple Windows environments from the same toolset, all the way back to Windows 2000, along with XP, Vista, and Windows 7. And Prowess is working assiduously to get ready for the upcoming release of new server and desktop OSes from Microsoft around the end of 2012 as well. This provides a single point of management and control for all Windows desktops and servers in the enterprise, and the product is gaining considerable traction and adoption from a broad range of enterprise customers around the world. Its biggest deployment to date covers over 84,000 desktops, which leaves Suzuki confident of the platform’s ability to scale well, and to handle heterogeneous hardware and operating systems with panache.
SmartDeploy also permits consolidating images from multiple operating systems into a single image file, something that I plan to learn more about after New Year’s from other members of the Prowess technical staff. At that point, I’ll report back with further details. In the meantime, check out the SmartDeploy home page for more information on the company’s products and platforms. Interesting and capable stuff!
Last week I had the pleasure of talking to Aaron Suzuki, the co-founder and CEO of Prowess, a Seattle-and Salt Lake City-based software company best known for its SmartDeploy product suite. In an upcoming blog, I’ll write about what’s up with SmartDeploy and why it could be interesting to readers responsible for managing packaging, deployment, and maintenance of Windows desktops. In this blog, however, I want to ruminate about the changing face of desktop deployment and use in the enterprise.
One of the elements of my conversation with Mr. Suzuki that struck me most as we were talking also has come back to me repeatedly as I’ve tried to understand what’s happening in many, many enterprises around our globe. The facts that drive these ruminations are:
- only 20-25 percent of global enterprises have made significant progress in deploying Windows 7 in their production environments.
- wholesale migration is becoming less of a concern in many organizations, where migration can hinge on retirement or replacement of older desktops (which generally have XP installed, while replacements or new purchases generally come with Windows 7 pre-installed).
- it’s increasingly clear that XP won’t have entirely disappeared by the time Windows 8 ships in Q4 2012, which raises the same prospect (older XP and/or Windows 7 PCs on their way out, replaced by newer machines with Windows 8 pre-installed on their way in).
The notion of some kind of massive switch-over, or wholesale “the world now runs Windows X” approach to migration and deployment appears to be fading from the scene. Tight IT budgets, lengthening desktop hardware lifecycles, and the realities of continual purchase of new desktops with new OSes appear to be blurring the lines somewhat. It’s also very much to the good that virtual machines can be made to work on newer OSes with ever-increasing ease and automation, so that runtime access to legacy systems and software remains feasible even as the underlying host architecture continues to change beneath the VM layer.
I’m starting to think that a “we were an XP shop, now we’re a Windows 7 shop” mentality is fading from the scene, too. It’s starting to look like enterprises will use all the tools at their disposal, including a mix of old and new OSes and hardware, along with increasingly sophisticated software tools to help them manage heterogeneous desktop environments, while doing their best to deliver a consistent and positive user experience across all supported environments. In my next blog, I’ll discuss how the Prowess SmartDeploy toolset makes that possible and affordable, as a case in point.
When PCs reach their end-of-life point within many organizations, the question then arises about what to do with them to achieve their responsible disposal or redisposition. In developing a lifecycle course for HP a few years back, based on HP’s in-house lifecycle expert Bruce Michelson’s excellent book, Closed Loop: Lifecycle Planning (A Complete Guide to Managing Your PC Fleet) (Addison-Wesley, 2007, ISBN-13: 9780321477149, List Price: $44.99), I had cause to ponder the many and various ways of dealing with obsolete or end-of-lifecycle computing gear.
Of course, vendors such as HP, Dell, IBM, and others do offer “take-back” programs to permit companies and organizations to dispose of used computing gear safely. But after their useful business life is over, older PCs can still enjoy a useful second life in other settings. I’m not necessarily in favor of outright disposal of such units, no matter how environmentally correct such disposal might be.
To this day, one of my favorite recommended methods for safe disposal of computing gear that adds jobs and income to communities where such equipment is received is through Goodwill Industries International. With a strong presence in North America, and an increasing presence in South America and Asia, Goodwill offers environmentally sound e-waste disposal around the world these days. But Goodwill first and foremost scavenges donated equipment for salvage and refurbishing, wringing extra life and value out of these materials and creating work for its employees in handling them. Over the past 10 years, I’ve probably donated over $20,000 in in-kind gifts of computers, monitors, keyboards, networking cards and components, plus miscellaneous computer parts and components galore to Goodwill.
Recently, I heard about a group based in Austin, TX, called The Helios Project. It’s run by Ken Starks, who works out of donated space in Taylor, Texas (about a 40-minute drive away from Austin). This group collects older computers and related parts for refurbishing or what you might call “reconstruction” so they can give working PCs to disadvantaged youths who might otherwise not have access to personal computing power. It’s a great cause and one well worth supporting, and they’re going to be getting my used gear going forward, primarily because they not only see to its safe and productive re-use, but also because the group voluntarily wipes gifted hard disks to DoD erasure standards to protect donors from potential illicit data mining and reuse. Ken and his directors (see “The Helios Project Directors” for some capsule summaries of the movers and shakers there) are also passionately committed to Linux and other Open Source software, and equip all of their outbound PCs with legally licensed operating systems and software that the new owners can maintain without having to incur re-licensing or annual maintenance fees.
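That last point deserves a quick illustration. Multi-pass overwriting (the commonly cited DoD 5220.22-M approach uses three passes) is conceptually simple; here’s a toy Python sketch of the idea. To be clear, this is my own illustration and not the Helios Project’s tooling, and it operates on a single file for demonstration purposes; real sanitization tools work against whole block devices, not files on a live file system.

```python
import os
import secrets

def wipe_file(path, passes=3, chunk=1 << 20):
    """Overwrite a file in place with random data several times, then
    delete it. Illustrative only: real drive sanitization operates on
    whole block devices and verifies each pass."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            remaining = size
            while remaining:
                n = min(chunk, remaining)
                f.write(secrets.token_bytes(n))  # random pattern for this pass
                remaining -= n
            f.flush()
            os.fsync(f.fileno())  # force the pass to physical storage
    os.remove(path)
```

The point of multiple randomized passes is to ensure no recoverable trace of the original bits remains, which is exactly the protection against illicit data mining that donors care about.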
I’d encourage you to look for similar kinds of organizations in your communities as potential recipients for used or end-of-lifecycle computing gear. These outfits will work with you to protect your data and information assets, but can also give that gear added life outside the business world. It’s nice to make a difference, and do something good for the community, in addition to practicing safe and secure disposition of no-longer-needed computers and equipment. Please, look around your neighborhood, and see if you can find an outreach organization to support. If not, you can still always turn to Goodwill, and generate some jobs and income in your community through your gifts in kind.
With the holidays approaching, The Helios Project is humping like mad to get a raft of new computers ready to show up under Christmas trees all around Central Texas. I’m driving out to Taylor myself next week to give them a barely used “One Laptop Per Child” machine I purchased from Negroponte’s organization a few years back, plus an Asus netbook PC, along with a collection of surplus memory, disk drives, and other spare parts they can use to bring computers back alive for re-use. Those interested in making financial donations to this organization should send checks c/o Ken Starks, 308A High Estates Drive, Round Rock, TX, 78664. But those checks should be made out to “Software in the Public Interest” (SIPI) with a memo notation that reads “Helios Project” (SIPI is a legitimate 501(c) organization that handles donations for The Helios Project, saving it the expense of registering with the state to process donations and write IRS-accepted receipts for gifts to a nonprofit charity).
Remember the recent hoopla this summer about fraudulent master-level (intermediate authority) digital certificates showing up in the wild? Well, Microsoft quietly released another out-of-band security update last Friday (11/11/2011) under the heading of KB2641690, with an accompanying Security Advisory. Apparently, Microsoft has also revoked its trust in the DigiCert Malaysia Certificate Authority (doing business as DigiCert Sdn. Bhd.) for violation of the Microsoft Root Program requirements (see this Softpedia report for more information: “Microsoft Revokes Trust in DigiCert Malaysia Certificate Authority“).
The Softpedia story nicely explains why Microsoft took this action, and issued an emergency security update to match:
“Microsoft was notified by Entrust, Inc, a certificate authority in the Microsoft Root program, that a Malaysian subordinate CA, DigiCert Sdn. Bhd issued 22 certificates with weak 512 bit keys,” revealed Jerry Bryant, Group manager, Response Communications, Trustworthy Computing.
“Additionally, this subordinate CA has issued certificates without the appropriate usage extensions or revocation information.”
Microsoft stressed that unlike the DigiNotar scenario from a few months back, this time around attackers did not get the chance to exploit the weak and deficient secure sockets layer certificates issued by Digicert Malaysia.
This is best understood as a pre-emptive measure designed to forestall possible security compromises or potential attack vectors BEFORE they occur. Nevertheless, it also dictates that this security update be fast-tracked into production for the selfsame reasons. Another one for your hurry-up schedule!