I’ve blogged here repeatedly about the benefits — and some gotchas — of the built-in Windows 8 recimg (record image) command. Here’s a list of those items for anyone interested in learning more about this fabulous Windows 8 (only) utility, which lets admins capture and store current Windows 8 system images in .wim (Windows Image) file format, then restore them to refresh their systems as and when needed:
- Create Your Own Refresh Image for Windows 8 (12/7/2012)
- Make DISM Your Go-To Image Management Tool in Win8 (12/10/2012)
- What Gets Lost When Using Win8 Refresh (1/21/2013)
- More Benefits of Win8 Refresh (1/23/2013)
I’ve become a big believer in using the built-in recimg command to capture — and, when necessary, restore — Windows 8 image files as a way of fixing subtle problems in Windows that might otherwise take weeks to troubleshoot. I learned this lesson the hard way on one of my Windows 8 machines when it wouldn’t let me run the recimg command at the command line (which meant RecImgManager couldn’t work either, of course). After running a factory refresh on that machine, I was able to start using recimg both at the command line and through the RecImgManager program itself.
As depicted in the screen cap at the head of this blog post, you can add image snapshots already captured using recimg at the command line. Use the Browse button (bottom right) in RecImgManager to find and integrate such captures as “Imported Snapshot” items (the screen cap shows an image I grabbed in late January while working on the “What Gets Lost…” post linked in item 3 above). As long as you know where to find your images (easy enough, using File Explorer to search on “*.wim”), you can add them to the items under RecImgManager’s control.
Now that I’ve been able to work with the underlying recimg command and the RecImgManager utility from SlimWare, I’ve really learned to appreciate the latter’s convenience. It doesn’t do anything the command-line utility can’t do, but it provides a very nice visual organization of those capabilities, and makes it much easier to capture new images and, especially, to select images for a restore operation. It’s always nice when you find a good, free, and capable software tool that makes it easier to manage desktops. This would be one of those.
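For those who’d rather drive recimg directly, the basic workflow looks something like the following sketch (run from an elevated Command Prompt; the directory paths here are just examples, not required locations):

```
:: Capture a custom refresh image into the named directory
recimg /createimage C:\RefreshImages\2013-03-15

:: Show which image Refresh will currently use
recimg /showcurrent

:: Point Refresh at a previously captured image instead
recimg /setcurrent C:\RefreshImages\2013-01-28

:: Hunt down stray .wim captures anywhere on the system drive
dir C:\*.wim /s /b
```

There’s also a /deregister switch if you want Refresh to fall back to its default behavior rather than using a custom image.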
I’ve been reading a whole spate of reviews of various Windows 8 tablets lately, including the Dell Latitude 10, the Acer Iconia W510, the Lenovo ThinkPad Tablet 2, and so on (for a nice synopsis of why Windows tablets might actually make sense in the enterprise, see Adrian Kingsley-Hughes’s 3/14/13 post entitled “New Windows-powered tablets threaten iPad’s enterprise dominance, claims analyst”). All of this has gotten me thinking about why I’m not willing to buy the current Surface Pro, or any of these other models, at this point in time. In so doing, I hope I’ve formulated a nice list of design goals for OEMs and system designers to ponder as they design the next generation of Haswell-based tablet PCs for business users like me:
- More horsepower: most of the successful tablet designs right now rest on the latest dual-core Atom processors, and simply don’t have enough oomph for me. Those that do have oomph — like the current Surface Pro — don’t have enough battery life (see next item).
- Longer battery life: the tablets with oomph can’t generally manage to squeak more than 5 hours out of a fully-charged battery. I want at least 8 hours, preferably 10 hours or more. Here again, those tablets that can do this (the Dell Latitude 10 is an excellent example) currently lack the processing power I want.
- More pixels, please: Far too many tablets still sit at 1366×768 resolution, which isn’t enough pixels anymore, even on a smaller screen where that form factor looks acceptable. I want at least full HD (1920×1080) or better, please, with a pixel density of at least 200 ppi.
- User accessibility: Though this may mean slightly thicker enclosures, I’d like to see next-gen high-end tablets with underside access to memory, mSATA SSD, and WLAN ports, plus a user-replaceable battery receptacle. Storage and RAM are growing too fast to force buyers to accept soldered-in components for the 3-5 years that’s typical for the life of a modern notebook PC (or a valid tablet replacement); WLAN modules must often be replaced for overseas travel; and a user-swappable battery makes it possible to keep computing on long flights (or very long days).
- Multi-factor authentication, plus: Go ahead, put a fingerprint scanner into these units, or add both a front-facing camera and facial recognition software, so that enterprise users can add biometric authentication to the more usual account/password or image-touch-sequence login methods. Let installers add a “nuke the drive” option after a large number of failed log-in attempts (10 or more is good), and make sure the device works with remote wipe facilities included in most Mobile Device Management (MDM) platforms nowadays.
- Smart virtualization: Make sure the units support virtualization (as both client and hypervisor) to permit clients to remote into their data centers quickly and easily on the one hand (acting as a client), and to run various VMs locally (acting as a hypervisor) as well.
- Good accessories: Provide strong, durable cases with keyboard/mouse modules that don’t add too much to the overall size/weight equation of the tablet itself. Docking stations for work at the office are also very nice: make sure you put lots of ports, video options, and GbE into these babies, and make it easy to dock/undock the tablet for high-speed entrances and exits.
- Fast peripheral and storage ports: At least two USB 3.0 ports, and a reliable microSD port, please. The former lets me use all kinds of high-speed peripherals and storage, the latter lets me extend my storage space by 50% at a modest cost (64GB microSD cards go for about $65-90 these days). A mini-DisplayPort would be nice, too, but not really necessary if you add another USB 3.0 port for video access.
I know, I know: it’s a LOT to ask, and I’m hoping that Intel will fix its USB 3.0 chipset issues in Haswell quickly, so OEMs can jump on that bandwagon sooner rather than later. Given the lower power consumption of the Haswell CPUs, we may even see something like what I’ve described late in 2013 or early in 2014. I would happily pay a premium over the current Surface Pro cost of about $1,200 to get everything I’m after — say, $1,600 for a Surface Pro-like tablet with a 256 GB mSATA SSD, 16 GB RAM, and an LTE WWAN module? Here’s hoping this might actually come to pass!
Mistakenly posted to the wrong blog (my apologies): find the real article over at my IT Career JumpStart blog instead.
I know I’m busy, busy, busy when Patch Tuesday takes me by surprise, and that’s what happened to me yesterday. Between phone calls galore, and catch-up from a long family weekend, I wasn’t necessarily ready to go haring after Windows Updates. But, ready or not, there it was, and I’ve been digging in ever since. My Windows 8 machines show 14 updates for Windows itself, and another 10 for Microsoft Office 2013; my Windows 7 machines show 7 for Windows and components (including Internet Explorer 10, which has now been pushed into the Windows Update channel) and another 3 for Microsoft Office 2010.
A quick gander at the latest Microsoft Security Bulletin for March 2013 reveals bulletins numbered MS13-021 through -027, for a total of 7 bulletins overall. Four of them are labeled critical (MS13-021 through -024), with the first three qualified as “Remote Code Execution” and MS13-024 as “Elevation of Privilege.” The coverage is all over the place: -021 is a cumulative security update for IE, -022 addresses Silverlight vulnerabilities, -023 tackles the Visio Viewer 2010, and -024 addresses four SharePoint vulnerabilities.
The remaining three bulletins are rated Important, where -025 and -026 are qualified as “Information Disclosure,” and -027 as “Elevation of Privilege.” The -025 update is for OneNote, -026 is for Outlook for Mac, and -027 touches on kernel-mode drivers. MS13-021 and -027 require a restart, -023, -024, and -025 may require one, and the remaining items (-022, -026) do not require a restart. Severity ratings notwithstanding, my impression is that admins will want to consider accelerating deployment of -021 and -027 first and foremost, as these are most likely to address potential vulnerabilities on the vast majority of end-user machines, unless Silverlight is also in broad use (in which case it should be prioritized for testing and possible deployment as well).
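When triaging a batch like this, it can help to confirm which hotfixes a given machine has already picked up. Here’s a quick sketch using the built-in wmic tool (the KB number shown is a hypothetical placeholder, not the actual KB for any of this month’s bulletins — look those up in each bulletin’s affected-software table):

```
:: List all installed hotfixes, briefly
wmic qfe list brief

:: Check for one specific update (KB number is hypothetical here)
wmic qfe list brief | findstr /i "KB1234567"
```

If findstr prints a matching line, the update is already on the box; no output means it’s still pending.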
BTW, I really like the Acknowledgements section that has been added to the MS Security Bulletins, which gives those who report vulnerabilities credit for their work, and also ties updates to specific entries in the Common Vulnerabilities and Exposures (CVE) database. It’s also interesting to see many of the same names (and test labs) showing up in those credits as well. Here’s a snippet, by way of illustration:
I just read a mind-boggling story over at Business Insider entitled “The PC Business Isn’t Growing Anywhere, And It’s Really Scary for Microsoft.” The story is laden with interesting facts and figures about future growth for the PC industry, but its real point might be best summarized as follows:
- The developing world (outside of North America, Europe, and the Pacific Rim) is where the biggest growth potential for computing lives and works.
- The best the PC industry can hope for in the future is zero growth, as fewer buyers spend money on PCs (though a larger number of buyers overall will be participating in the market for computing devices of all kinds).
- The global market for PCs is shrinking, not growing, because users everywhere turn first to smartphones (and second to tablets) to get online nowadays.
The inference worth drawing from this analysis is that PC vendors (and PC OS makers, like Microsoft) cannot expect the rising tide of technology purchases outside the developed world to keep their boats afloat, or to help them achieve incremental success outside their historical core markets. As is so often the case in the developing world, buyers simply short-circuit the traditional path of technology adoption, and skip to the inevitable conclusion — in this case, nearly universal adoption of smartphones as their primary computing platforms of choice.
The story goes on to observe that MS has done itself no favors with the feature set and UI behavior of Windows 8, given the widespread lack of adoption of that operating system in the business world at every level of commerce (from Mom’n’Pop shops up to the largest enterprises). Pricing for more portable notebook PCs — the so-called “ultrabooks” — also imposes a barrier to buy-in for the very products that might otherwise drive broader adoption and use of PC platforms. Though the hit won’t be huge or damaging for Microsoft (and other major PC players) at first, if these trends continue, all will have profound reasons for worry as the decade wears on.
FWIW, I agree with this analysis, and believe that unless a compelling and affordable tablet-based Windows computer (which the Surface Pro suggests, but fails to deliver on grounds of price, weight, and battery life) appears on the market soon, the PC era may well move to a close much sooner than many might believe possible. That’s not to say I think the need or market for workstation class machines will ever disappear; rather, I believe the bulk of personal and workplace computing will migrate to sufficiently powerful mobile device platforms that can run VMs resident in a data center somewhere, and only those who need to consume lots of computing resources in-place (3D modeling, CAD, heavy data analysis, and so forth) will need to stick to traditional desktop and notebook machines as we know them today.
One big component of the cost of PCs nowadays — especially the touchscreen notebooks, ultrabooks, and tablets most likely to play host to Windows 8 — is the fee from Microsoft for the OS license. That number may be declining soon, according to a report from Taiwanese publication DigiTimes entitled “Touchscreen notebooks to enjoy at least a 10% price cut after Windows 8 discount” (note: I had to visit the Google cache to read the report, because the URL on the preceding link returned a 404 error). Their research indicates that touchscreen PC prices could drop by 10%, while “some entry-level and mainstream models may even drop more than 20%.”
Will this be enough to prop up Windows 8’s lagging uptake (see Steven J. Vaughan-Nichols’s excellent analysis in “Five reasons why Windows 8 has failed” on ZDNet)? Perhaps lower prices will encourage some improvement in Windows 8’s fortunes, but there’s a lot of ground to be made up (Vaughan-Nichols’s analysis shows that usage rates for Windows 8 are only just over half of Windows Vista’s at best, and Vista was itself no paragon of marketing acceptance or product excellence).
To me, what’s interesting about this revelation is that Microsoft is actually willing to reduce OEM licensing costs to add some momentum to Windows 8 purchases. Traditionally, its posture on pricing has been more inflexible, because OEM licensing has been, and remains, the primary engine for desktop operating system revenues. This testifies to Microsoft’s understanding that Windows 8 is in trouble (or at least “failing to perform to its expectations”). Hopefully, the pricing changes will be enough to breathe more vigor into Windows 8 sales. That should make watching the sales numbers — and the corresponding usage rates (or, as Net Applications calls them, “market share”) — over the next half year more than ordinarily interesting.
As I explained in my February 25 post “MS Office 2013 Licensing Follies,” MS updated its terms for an Office 2013 license to prohibit transfer from one PC to another, “so if you decommission an older PC running Office 2013, and seek to install the program on a newer replacement PC, you’d theoretically be required to purchase a new license for that replacement machine.” Apparently, the company decided that most buyers would pick the subscription model for Office 2013, where a no-transfer rule actually has some reasonable basis, and overlooked the fact that those who purchase retail — or, for the readers of this blog, volume — licenses would scream bloody murder at losing what they’ve always considered an inalienable right for their precious Office licenses.
The screaming must have echoed loudly in the hallways of Redmond, because Ed Bott reported this morning (see “Microsoft restores transfer rights for retail Office 2013 copies”) that Microsoft is restoring for Office 2013 the same “perpetual license” terms that applied to the retail versions of Office 2010. This means that those who purchase a retail license (rather than a subscription), or whose licenses depend on access to ISO downloads available through TechNet, MSDN, or volume license agreements and Software Assurance with Microsoft, will get their much-coveted transfer rights back.
If you enjoy reading Microsoft eat crow, here’s the verbatim text from an Office blog post where you can watch that bird make its way down their gullet:
Based on customer feedback we have changed the Office 2013 retail license agreement to allow customers to move the software from one computer to another. This means customers can transfer Office 2013 to a different computer if their device fails or they get a new one. Previously, customers could only transfer their Office 2013 software to a new device if their PC failed under warranty.
While the licensing agreement text accompanying Office 2013 software will be updated in future releases, this change is effective immediately and applies to Office Home and Student 2013, Office Home and Business 2013, Office Professional 2013 and the standalone Office 2013 applications. With this change, customers can move the software to another computer once every 90 days. These terms are identical to those found in the Office 2010 software.
For some time now, Microsoft’s radical redesign of the Windows 8 user interface has raised the question of whether the new operating system will fare more like Windows Vista (which is to say, poorly) or like Windows XP or 7 (which is to say, like gangbusters). Microsoft’s relentless rah-rahs and never-ending hype to the contrary, the numbers make a case that Windows 8 is moving into the market more like Vista than like those more popular sibling versions of Windows. No sources of numbers for such discussions are more popular, or more frequently cited, than those from NetMarketShare, whose Desktop Operating System Market Share page shows the breakdown of Internet users by percentage as of this morning.
Paul Thurrott states this case very well in his recent WindowsITPro article “Windows 8 Sales: Hot or Not?” (where he also points out, quite correctly, that even though Net Applications labels these numbers market share, they actually represent usage share, because they are drawn from an analysis of Internet users; all the copies of Windows 8 that have been sold but not yet installed, or that aren’t being used, aren’t counted by such measurement techniques). Usage rates for Windows 8 are climbing month to month, but very slowly, which leads Thurrott to observe that Windows 8’s “… usage share after one quarter trails that of Windows Vista at a similar time after its 2007 launch.”
On the other hand, Microsoft continues to tout that Windows 8 has “sold 60 million copies” (though I’ve been seeing that number since late January, so one wonders why it’s not already 70 or 80 million). The company claims this is on par with Windows 7, which sold at a steady 20 million copies per month for nearly three years. And as Thurrott points out, “…there are … questions about Microsoft’s numbers, which always represent sales to PC makers and into the channel and not sales to end users.”
Finally, Thurrott isn’t ready to call Windows 8 a “debacle” just yet, but he does confess to being worried, given that Windows 8 should have a much bigger sales base than Windows 7, because it covers mobile devices, especially tablets (and possibly even smartphones, if you buy the idea of Windows Phone 8 as just another version of Windows 8). And so far, that bigger potential mostly remains unrealized. Do I think this is cause for concern? Yes, I do, but I don’t think it’s a death knell for Microsoft, either.
In its own words (from the organization’s home page), the Open Data Center Alliance, aka ODCA, “… is working actively to shape the future of cloud computing — a future based on open, interoperable standards.” Its membership includes 300-plus companies, including Nokia, Rackspace, BMW, China Unicom, Deutsche Bank, Lockheed Martin, Motorola Mobility, and AT&T — in other words, a mix of consumers and producers of cloud services and technologies. Does this sound like the kind of broad, all-embracing, standards-oriented effort to which Microsoft would voluntarily become a party? Well, hold on: believe it or not, Microsoft and the ODCA announced the company’s membership in the organization on February 27, 2013 (here’s a link to a Microsoft-News.com article entitled “Microsoft Joins Open Data Alliance to Shape The Future of Cloud”).
Microsoft’s angle on this move comes out of Windows Azure, the company’s cloud computing platform for building, deploying, and managing applications and services through Microsoft’s global network of data centers. In writing about the move, Slate‘s Nick Kolakowski says “Microsoft has spent the past several weeks pushing Office 365 Home Premium … even highlighting … how the cloud-based version is a better deal. That’s not a move by a company dabbling in the cloud, that’s one betting one of its biggest cash cows on it. …one can see why the company likes to say it’s ‘all in’ with regard to the cloud.” For MS to embrace an open, standards-based approach to cloud computing is also a strong, if tacit, acknowledgement that the cloud is bigger than any single company’s platforms, frameworks, tools, and technologies. Microsoft already knows it has to rub shoulders with the other players in the cloud space, but it’s refreshing and inspiring to see it do something to recognize the realities of the cloud (“it’s bigger than all of us put together” might make a supplementary mantra for the ODCA, should it be looking for further taglines or rhetorical justifications).
Any way you look at it, though, this is big news, and represents a major new approach to a major market for the colossus of Redmond. Frankly, I’m very interested to see this going down, and will be watching closely as the real import (and standards support and compliance) involved in ODCA membership manifests itself across the many cloud-related Microsoft platforms, frameworks, and products — and its various certification programs as well.
I just read a fascinating article in the Healthcare section of InformationWeek by Ken Terry entitled “Dell Targets Healthcare with Windows 8 Tablet.” The basic premise behind the story is that because the Latitude 10 offers some key security features and a lower TCO than an iPad, it stands some chance of competing head-to-head with that favorite of Apple devices.
According to Terry, the Latitude 10 offers some interesting advantages for potential healthcare adoptions:
- It can require use of either a fingerprint or smart card reader to employ dual-factor user authentication for login and access
- It already includes built-in encryption, and will add file-level encryption in an update scheduled for Q2 2013
- The device has a “20-hour battery life, which should be attractive to clinicians who work hospital shifts that can be 12 hours or longer” (while the iPad tops out at under 14 hours of modest use). Also, the Latitude 10 includes a user-replaceable battery, so users can swap out a dead battery for a charged one if needs must (the iPad battery is not user accessible)
- The device is simpler and more manageable than an iPad, and fits into existing Windows-based healthcare IT environments (including remote management, deployment, and maintenance tools, which Dell also offers to enterprise customers) both neatly and nicely
I like all of this information, and I even buy Dell’s contention that “…few EHR [Electronic Health Records] vendors have developed iPad-native versions of their applications…” My concerns about this platform are as follows:
- The Latitude 10 includes only a dual-core Atom Z2760 1.8 GHz CPU with 2 GB RAM (not expandable) — it does run either 32-bit Windows 8 or Windows 8 Pro, however. I’m not sure that this will give healthcare professionals enough horsepower to do everything with their tablets that they might want.
- There’s no comparable cover/integrated keyboard device for the Latitude 10 like the Microsoft Type or Touch covers that both protect and extend the capabilities of these devices. [Note added 2/28/2013: In subsequent reading on bundles for corporate adoption, I’ve learned that Dell DOES offer an integrated cover and separate wireless keyboard for the Latitude 10. I think this successfully addresses my concerns on this issue (pricing is a little better than similar iPad offerings, in fact), and scotches this potential objection/gotcha.]
- There’s not a lot of EHR infrastructure for Windows machines yet, nor is there much emerging for the Windows 8 platform at the moment, either. Dell’s purported advantages require application (and app) support to help further bolster the company’s case.
- I don’t know anything about the planned aftermarket for the Latitude 10, so I also don’t know if there will be cases and covers available for these devices that will protect them from inevitable drops, falls, and minor abuse as they accompany medical professionals on their rounds. For the iPad, there are already plenty of options that offer everything from nicely integrated keyboards to near-invulnerability from physical abuse. Something along those lines will be essential for the Latitude 10 to compete on equal footing with the iPad. [Note added 2/28/2013: see previous note for the second item in this list, which refers to accessories available for this unit from Dell; online research indicates that a growing number of aftermarket offerings are popping up for this unit, too.]
All of this said, I find Terry’s case for the Latitude 10 as a possible healthcare tablet at least interesting, if not intriguing. I wonder if the IT and purchasing folks at any of the major healthcare providers are seriously pursuing this platform as an option…a detail that Terry’s story unfortunately fails to address. I presume the sales cycle for such a deal is underway, possibly at several providers, but methinks if any deals were done or in the offing they would have been mentioned here (or in some Dell press release somewhere).
[Note added 3/1/2013: Thanks to a shout-out to one of my Dell contacts, I’ve been connected to the Lat10 product manager, and am jumping through some hoops to see if I can get a review unit. I’d like to try one of these machines out to see how it behaves, and hope to get one with 4G LTE WWAN capability to take out and about with me as I drive around my local area and make the occasional business trip. More on this as I make progress on this front…or not!]