My home office set-up includes a pair of 27″ Dell monitors (2707WFPs) on my primary production desktop. I mention this by way of explaining why one of my favorite working techniques is to stay logged into that machine, and to use Remote Desktop Connection (and the RDP protocol) to reach out from that desktop to other machines I want to work on here and there around the house. In particular, I’ve got two Windows 8 test machines that I work on quite regularly — a Lenovo X220 Tablet with an i7-2640M and 12 GB of RAM, and a home-built desktop with an Asus P8Z68-V PRO/GEN3 mobo, an i7-2600K CPU, and 32 GB of RAM — and my preference is to remote into those machines to work on them.
But alas, that doesn’t always work, because not all applications are 100% (or even a little bit) compatible with RDP. And figuring out what’s compatible and what’s not can be interesting, too. Until some time in 2011, Microsoft offered a free tool called the RDS Application Compatibility Checker. But in 2011 they handed this functionality off to Quest Software (now Dell-owned), whose ChangeBASE product includes a variety of tools, among them automated application compatibility testing that incorporates a number of remote access compatibility checks. I’ve launched inquiries to find out more about this capability, because I have to imagine that many IT professionals (and network/data center/virtualization admins) will want to know what’s safe to use with RDP and what’s not, particularly when it comes to custom applications of the mission critical variety, as well as any number of common desktop tools and utilities.
What spurred this blog post was my discovery, upon installing and learning to use Stardock Software’s ModernMix utility (which permits Modern UI apps to run in windows on the Windows 8 desktop, instead of taking over the whole screen — or part of it, if you’re inclined to run multiple Modern UI apps in tandem), that it worked wonderfully when I was sitting at the real physical keyboard for those PCs, but not at the remote keyboard. The issue is that ModernMix uses the F10 function key to instruct a program to switch from filling the entire display to displaying inside a window, and function key presses are notoriously tricksy to transport across a remote link. However, after setting up one app in a desktop window, all other apps would then appear in windows via remote access as well — but I had to make that initial set-up happen at the real physical keyboard to enable the remote connection to work properly. Here’s a screencap of the SkyDrive app running in a window, after the initial set-up was handled on the actual machine:
I have discovered other apps that are even less well-behaved when using remote access to my Windows 8 desktop. On the Lenovo machine, for example, Lenovo System Update v5 works perfectly when run from the local keyboard; if you launch the program from a remote connection, nothing ever appears on the display to indicate that the program is running (nor does an application entry appear in Task Manager). The only way to get the program to work remotely is to fire it off before starting a remote session, after which it stays up and running in the remote window that Remote Desktop Connection opens to that machine. I assume the same conditions might hold for programs that operate on hardware at a low level: that’s why I’d be leery of trying a disk partitioning program through a remote connection, for example, or nervous about other, similar low-level hardware configuration or set-up tools.
All of this, of course, confirms the notion that testing of applications in a corporate environment must include checking them through a remote access window as well as on the local desktop. It will be even harder for help desk or tech support folks to get their jobs done if the tools they’d like to use don’t work that way: far better, in fact, to find tools that are amenable to remote control and operation, to ensure that when those hard-working IT pros must reach out to a user’s desktop, their chosen tools will work as needed and expected.
Windows 8 includes a Control Panel widget called “Create a recovery drive” that you can use to create a USB flash drive to boot up and repair your system should anything go wrong with the boot or system partitions. And if your PC includes a custom-built recovery partition (something you’ll have at your disposal when the machine comes from an OEM, or when the system builder has taken the trouble to build a recovery partition as part of the initial system install), you can even move it from its present location on the system/boot drive to the flash drive to free up space. This can be especially helpful on tablet, notebook, or other PCs with smaller (less than 256 GB) system/boot drives, where every GB of storage space really counts. A typical recovery partition might be as big as 10-15 GB: on a 64 or 128 GB SSD, that’s a significant amount of storage space.
Building such a recovery drive is very easy. Type “Create a recovery drive” in the Windows Start screen (Modern UI method) or into the search box in a Start menu replacement such as Start8 or Classic Shell, then follow the prompts as they appear. Depending on whether or not you have a recovery partition to transfer, the process takes as little as a minute (no recovery partition) or as long as 10 minutes (15 GB recovery partition) to complete. You’ll know what you’re up against by checking whether the checkbox labeled “Copy the recovery partition from the PC to the recovery drive” appears available in dark text, or unavailable in greyed-out text, on the initial Recovery Drive screen as shown here:
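For those who prefer to skip the search box entirely, the same wizard can be launched directly from a Run box or command prompt. This is a hedged sketch: RecoveryDrive.exe is, as best I can tell, the wizard’s executable on Windows 8, and it requires an elevated session.

```shell
:: Launch the "Create a recovery drive" wizard directly
:: (run from an elevated command prompt; the wizard UI takes it from there)
%windir%\System32\RecoveryDrive.exe
```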
I built one of my Windows 8 test machines from scratch, and installed Windows 8 over Windows 7 on the Lenovo X220 Tablet, so neither of those machines had a recovery partition for me to copy. However, after setting up a recovery drive for my desktop Windows 8 machine, I then turned to RecImgManager to create a refresh image for that machine on the same 32 GB flash drive where the initial recovery drive materials were deposited. The base-level recovery drive files consume only 223 MB of disk space (this proved to be the same for both desktop and notebook PCs, so I must believe it holds true for all 64-bit Windows 8 PCs). The refresh image for my X220 Tablet is 8.5 GB, while the one for my i7-2600K desktop is 7.5 GB, so you could easily use a 16 GB flash drive instead of the 32 GB unit I employed for this maneuver.
The combination of the recovery drive functionality and a refresh image means you can start up Windows 8 from the USB flash drive, but some additional work is required to re-create a usable environment on a target PC. You must basically convert the .wim into an install image, so that you can then install that image to rebuild your machine. The good news is this custom install will include your drivers and applications; the bad news is, you must jump through a few hoops to make this happen. Fortunately, it is all nicely explained in a forum thread over on the Windows Eight Forums entitled “recover Windows 8 from a .wim file.” I’ll be fooling around with this in my spare time over the next week or two, and will report further as I learn more.
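As I understand it, the forum thread’s approach boils down to applying the captured .wim to a freshly formatted Windows partition with DISM, then rebuilding the boot files. Here’s a rough sketch of the idea — the image path and drive letters are hypothetical, and you’d run this from a WinPE or recovery-media command prompt after formatting the target partition:

```shell
:: Inspect the captured image first (recimg writes a single-index CustomRefresh.wim)
dism /Get-ImageInfo /ImageFile:D:\CustomRefresh.wim

:: Apply index 1 of the image to the (freshly formatted) Windows partition
dism /Apply-Image /ImageFile:D:\CustomRefresh.wim /Index:1 /ApplyDir:C:\

:: Recreate the boot configuration so the restored install will actually boot
bcdboot C:\Windows
```

Treat this as an outline of the technique rather than a recipe; the forum thread covers the hoops (partitioning, drive letter mapping in WinPE, and so forth) in full detail.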
[Note: Although the recimg utility itself didn’t help me troubleshoot this problem, I was able to Google my way into understanding that you cannot capture a refresh image onto an SD card or a USB flash drive. The utility insists on writing to a full-fledged disk of some kind (it works with both SATA and other direct-attached SSDs or conventional drives, and with USB-attached SSDs or conventional drives). I don’t have any USB3 high-speed/high-capacity UFDs around right now, but I plan to try some out as soon as I can lay hands on one that’s big enough — 32 GB or better — and affordable. This means you can still build the kind of Recovery UFD I’m talking about in this blog post, but you can’t use that UFD as the target when recording the .wim image you’ll convert to another form using the utilities mentioned in the Windows Eight Forums thread above. Again: more on this as I keep digging deeper.]
There’s a terrific piece over on Paul Thurrott’s Supersite for Windows that posted yesterday, entitled “In Praise of the Windows 8 Desktop.” In that story, he calls out all the new features on the old-fashioned but still extremely usable desktop in Windows 8 that deliver new or vastly improved functionality. Those items are worth perusing and pondering, as they do provide some real and tangible reasons why businesses might consider permitting Windows 8 to find a spot on their users’ desktops. And FWIW, I mostly concur with his observations and analyses, and even add a few special favorite items of my own.
He calls specific attention to the following elements or aspects of Windows 8 in the story:
- Aero and its resource-hungry “glass effects” have given way to a more spare, square, and opaque window display on the desktop. Aero was banished in part because of its negative impact on battery life, so the new look is also more battery friendly.
- Windows Explorer — actually called “File Explorer” in Windows 8 to distinguish it better from IE — gets a ribbon-based UI (that power users can banish from the File Explorer window, if they so choose). Other improvements include the ability to mount ISO files and virtual hard disks (vhd and vhdx files) right in File Explorer, in the form of volumes with drive letters and everything. I also like the speed improvements to file copy and move operations, and the added details on progress boxes that tell you what’s going on. If only MS would add the ability to resume interrupted file move/copy operations as well…
- Task Manager gets a big increase in capability, including a vastly improved look (which Thurrott correctly attributes to Sysinternals’ excellent Process Explorer utility, the brainchild of Windows guru Mark Russinovich, who’s been a Microsoft Fellow for almost a decade now), the ability to manage startup items (no more msconfig.exe, yippee), services, app history, and more.
- Improved security thanks to a beefed-up Windows Defender and Smartscreen technology. Given that recent reviews have found these free MS built-ins less secure than other free (and commercial) alternatives, I’m not sure I buy into this 100%. But I do confess to using Windows Defender on test systems and VMs because it installs by default and is at least adequate at keeping things secure.
- For power users, Thurrott points to Storage Spaces’ fast and simple support for JBOD and data redundancy without — as he puts it — “…a master’s degree in RAID required.” He also mentions BitLocker, BitLocker To Go, and improved support for multiple displays as boons to those who want to give Windows a real work-out.
- Finally, he calls out fast boot (and shutdown) times and the ability to reset a Windows install to factory conditions (he calls this “nuke from space”) as great improvements over earlier Windows versions, with reports of 6-second boot times and the ability to run a reset in 6-7 minutes. My boot times aren’t that fast — more on the order of 30-50 seconds — but they’re better than Windows 7’s on the same systems across the board.
Of course, I’ve got some items I’d like to add to this list, too:
- The “Refresh your PC” capability is also great, but even better is the built-in recimg (record image) command that lets you capture a complete Windows install image after you’ve updated all the drivers and installed all of your favorite applications (especially desktop applications and installer-based device drivers) and use it as the source for the refresh operation. Slimware Utilities’ free RecImgManager tool makes this facility especially easy and convenient to use, too.
- Because Windows 8 supports the same hypervisor that Windows Server 2012 uses (Hyper-V 3.0), using VMs in Windows 8 beats the pants off Virtual PC or Virtual Server under Windows 7: you get support for bigger virtual disks (the VHDX format), faster VM load/unload times, more VMs, and lots more.
- I like Windows 8’s ability to use the Microsoft Live ID login for system access, and the ability to synchronize multiple Windows 8 user environments that share a common Live ID. I use this on multiple Windows 8 installations very gladly.
- Windows 8 adds improved support for the Unified Extensible Firmware Interface (UEFI) and provides the ability to secure the boot-up process before the OS is loaded. Though I’ve had occasional issues with this facility, it does provide better low-level diagnostics and troubleshooting for PCs, and will protect boot-up from malware and break-in attempts.
I do see Thurrott’s point — that there are things about the Windows 8 desktop that users and admins will like — but I also get the backlash against the removal of the Start Menu from the base OS, and how little business users like being forced to boot into the Modern UI tile-based Start screen, rather than the old, familiar Windows desktop. Nevertheless, there are real, positive reasons to look further and deeper into Windows 8. Whether or not this leads to wider adoption still remains to be seen.
I’ve blogged here repeatedly about the benefits — and some gotchas — for the built-in Windows 8 recimg (record image) command. Here’s a list of those items for anyone who might be interested in learning more about this fabulous Windows 8 (only) utility that permits admins to capture and store current Windows 8 system images in .wim (Windows Image) file format, then restore them to refresh their systems as and when they might be needed:
- Create Your Own Refresh Image for Windows 8 (12/7/2012)
- Make DISM Your Go-To Image Management Tool in Win8 (12/10/2012)
- What Gets Lost When Using Win8 Refresh (1/21/2013)
- More Benefits of Win8 Refresh (1/23/2013)
I’ve become a big believer in using the built-in recimg command to capture — and when necessary, restore — Windows 8 image files as a way of fixing subtle problems in Windows that might otherwise take weeks to troubleshoot. I learned this lesson the hard way on one of my Windows 8 machines when it wouldn’t let me run the recimg command at the command line (which means RecImgManager couldn’t work either, of course). After running a factory refresh on that machine, I was able to start using recimg at the command line and through the RecImgManager program itself.
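For anyone who wants to try this at an elevated command prompt, the basic recimg operations look something like the following. The directory names here are just examples, and — per the gotcha noted in an earlier post — the capture target must be a full-fledged fixed disk, not an SD card or USB flash drive.

```shell
:: Capture a new refresh image into the named directory
:: (recimg creates a file named CustomRefresh.wim there)
recimg /createimage C:\RefreshImages\2013-03

:: Show which image Windows will use for the next Refresh operation
recimg /showcurrent

:: Point the Refresh facility at a previously captured image
recimg /setcurrent C:\RefreshImages\2013-01

:: Locate any stray .wim captures on the system drive
dir C:\*.wim /s /b
```

Once an image is registered as current, running “Refresh your PC” from PC Settings restores from that image rather than from the factory baseline.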
As depicted in the screen cap at the head of this blog post, you can add image snapshots already captured using recimg at the command line. This works by using the browse button (bottom right) in RecImgManager to find and integrate such captures as “Imported Snapshot” items (you see an image I grabbed in late January as I was working on the “What Gets Lost…” post linked to in item 3 above). As long as you know where to find your images (easy enough to do, using File Explorer to search on “*.wim”), you can add them to the items under RecImgManager’s control.
Now that I’ve been able to work with the underlying recimg command and the RecImgManager utility from SlimWare, I’ve really learned to appreciate the latter’s convenience. It doesn’t do anything the command line utility can’t do, but it provides a very nice visual organization to those capabilities, and makes it much easier to capture new images, and especially, to select images to use for a restore operation. It’s always nice when you find a good, free, and capable software tool that makes it easier to manage desktops. This would be one of those.
I’ve been reading a whole spate of reviews on various Windows 8 tablets lately, including the Dell Latitude 10, the Acer Iconia W510, the Lenovo ThinkPad Tablet 2, and so on and so forth (for a nice synopsis of why Windows tablets might actually make sense in the enterprise, see Adrian Kingsley-Hughes’ 3/14/13 post entitled “New Windows-powered tablets threaten iPad’s enterprise dominance, claims analyst”). All of this has got me to thinking about why I’m not willing to buy the current Surface Pro, or any of these other models, at this point in time. In so doing, I hope I’ve formulated a nice list of design goals for the OEMs and system designers to ponder as they design a next generation of Haswell-based tablet PCs for business users like me:
- More horsepower: most of the successful tablet designs right now rest on the latest dual-core Atom processors, and simply don’t have enough oomph for me. Those that do have oomph — like the current Surface Pro — don’t have enough battery life (see next item).
- Longer battery life: the tablets with oomph can’t generally manage to squeak more than 5 hours out of a fully-charged battery. I want at least 8 hours, preferably 10 hours or more. Here again, those tablets that can do this (the Dell Latitude 10 is an excellent example) currently lack the processing power I want.
- More pixels, please: Far too many tablets still sit at 1366×768 resolution, which isn’t enough pixels anymore, even on a smaller screen where that form factor looks acceptable. I want at least full HD (1920×1080) or better, please, with a pixel density of at least 200 ppi.
- User accessibility: Though this may mean slightly thicker enclosures, I’d like to see next-gen high-end tablets with underside ports to access memory, mSATA SSD slots, and WWAN modules, along with a user-replaceable battery receptacle. Storage and RAM capacities are growing too fast to force buyers to accept soldered-in components for the 3-5 years that’s typical for the life of a modern notebook PC (or a valid tablet replacement); WWAN modules must often be replaced for overseas travel; and a user-swappable battery makes it possible to keep computing on long flights (or very long days).
- Multi-factor authentication, plus: Go ahead, put a fingerprint scanner into these units, or add both a front-facing camera and facial recognition software, so that enterprise users can add biometric authentication to the more usual account/password or image-touch-sequence login methods. Let installers add a “nuke the drive” option after a large number of failed log-in attempts (10 or more is good), and make sure the device works with remote wipe facilities included in most Mobile Device Management (MDM) platforms nowadays.
- Smart virtualization: Make sure the units support virtualization (as both client and hypervisor) to permit clients to remote into their data centers quickly and easily on the one hand (acting as a client), and to run various VMs locally (acting as a hypervisor) as well.
- Good accessories: Provide strong, durable cases with keyboard/mouse modules that don’t add too much to the overall size/weight equation of the tablet itself. Docking stations for work at the office are also very nice: make sure you put lots of ports, video options, and Gigabit Ethernet into these babies, and make it way easy to dock/undock the tablet for high-speed entrances and exits.
- Fast peripheral and storage ports: At least two USB 3.0 ports, and a reliable microSD port, please. The former lets me use all kinds of high-speed peripherals and storage, the latter lets me extend my storage space by 50% at a modest cost (64GB microSD cards go for about $65-90 these days). A mini-DisplayPort would be nice, too, but not really necessary if you add another USB 3.0 port for video access.
I know, I know: it’s a LOT to ask, and I’m hoping that Intel will fix its USB3 issues with the Haswell chipset quickly, so tablet makers can jump on that bandwagon sooner rather than later. Given the lower power consumption of the Haswell CPUs, we may even see something like what I’ve described late in 2013 or early in 2014. I would happily pay a premium over the current Surface Pro cost of about $1,200 to get everything I’m after — say $1,600 for a Surface Pro-like tablet with a 256 GB mSATA SSD, 16 GB RAM, and an LTE WWAN module? Here’s hoping this might actually come to pass!
Mistakenly posted to the wrong blog (my apologies): find the real article over at my IT Career JumpStart blog instead.
I know I’m busy, busy, busy when Patch Tuesday takes me by surprise, and that’s what happened to me yesterday. Between phone calls galore, and catch-up from a long family weekend, I wasn’t necessarily ready to go haring after Windows Updates. But, ready or not, there it was, and I’ve been digging in ever since. My Windows 8 machines show 14 updates for Windows itself, and another 10 for Microsoft Office 2013; my Windows 7 machines show 7 for Windows and components (including Internet Explorer 10, which has now been pushed into the Windows Update channel) and another 3 for Microsoft Office 2010.
A quick gander at the latest Microsoft Security Bulletin for March 2013 reveals bulletins numbered MS13-021 through -027, for a total of 7 bulletins overall. Four of them are labeled critical (MS13-021 through -024), with the first three qualified as “Remote Code Execution” and MS13-024 as “Elevation of Privilege.” The coverage is all over the place: -021 is a cumulative security update for IE, -022 addresses Silverlight vulnerabilities, -023 tackles the Visio Viewer 2010, and -024 addresses four SharePoint vulnerabilities.
The remaining three bulletins are rated Important, where -025 and -026 are qualified as “Information Disclosure,” and -027 as “Elevation of Privilege.” The -025 update is for OneNote, -026 is for Outlook for Mac, and -027 touches on kernel-mode drivers. MS13-021 and -027 require a restart, -023, -024, and -025 may require one, and the remaining items (-022, -026) do not require a restart. Severity ratings notwithstanding, my impression is that admins will want to consider accelerating deployment of -021 and -027 first and foremost, as these are most likely to address potential vulnerabilities on the vast majority of end-user machines, unless Silverlight is also in broad use (in which case it should be prioritized for testing and possible deployment as well).
BTW, I really like the Acknowledgements section that has been added to the MS Security Bulletins, which gives those who report vulnerabilities credit for their work, and also ties updates to specific entries in the Common Vulnerabilities and Exposures (CVE) database. It’s also interesting to see many of the same names (and test labs) showing up in those credits as well. Here’s a snippet, by way of illustration:
I just read a mind-boggling story over at Business Insider entitled “The PC Business Isn’t Growing Anywhere, And It’s Really Scary for Microsoft.” The story is laden with interesting facts and figures about future growth for the PC industry, but its real point might be best summarized as follows:
- The developing world (outside of North America, Europe, and the Pacific Rim) is where the biggest growth potential for computing lives and works.
- The best the PC industry can hope for in the future is zero growth, as fewer buyers spend money on PCs (though a larger number of buyers overall will be participating in the market for computing devices of all kinds).
- The global market for PCs is shrinking, not growing, because users everywhere turn first and foremost to smartphones (and second to tablets) to get online nowadays.
The inference worth drawing from this analysis is that PC vendors (and PC OS makers, like Microsoft) cannot expect the rising tide of technology purchases outside the developed world to keep their boats afloat, or to help them achieve incremental success outside their historical core markets. As is so often the case in the developing world, buyers simply short-circuit the traditional path of technology adoption, and skip to the inevitable conclusion — in this case, nearly universal adoption of smartphones as their primary computing platforms of choice.
The story goes on to observe that MS has done itself no favors with the feature set and UI behavior of Windows 8, which has seen a widespread lack of adoption in the business world at every level of commerce (from Mom’n’Pop shops up to the largest enterprises). Pricing for more portable notebook PCs — the so-called “ultrabooks” — also imposes a barrier to buy-in for the products that might otherwise drive broader adoption and use of PC platforms. Though the hit won’t be huge or damaging for Microsoft (and other major PC players) at first, if these trends continue, all will have profound reasons for worry as the decade wears on.
FWIW, I agree with this analysis, and believe that unless a compelling and affordable tablet-based Windows computer (which the Surface Pro suggests, but fails to deliver on grounds of price, weight, and battery life) appears on the market soon, the PC era may well move to a close much sooner than many might believe possible. That’s not to say I think the need or market for workstation class machines will ever disappear; rather, I believe the bulk of personal and workplace computing will migrate to sufficiently powerful mobile device platforms that can run VMs resident in a data center somewhere, and only those who need to consume lots of computing resources in-place (3D modeling, CAD, heavy data analysis, and so forth) will need to stick to traditional desktop and notebook machines as we know them today.
One big component of the cost of PCs nowadays — especially those touchscreen notebooks, ultrabooks, and tablets most likely to play host to Windows 8 — is the fee from Microsoft for the OS license. That number may be declining soon, according to a report from Taiwanese publication DigiTimes entitled “Touchscreen notebooks to enjoy at least a 10% price cut after Windows 8 discount,” (note: I had to visit the Google cache to read the report, because the URL on the preceding link returned a 404 error). Their research indicates that touchscreen PC prices could drop by 10% whereas “some entry-level and mainstream models may even drop more than 20%.”
Will this be enough to prop up Windows 8’s lagging uptake (see Steven J. Vaughan-Nichols’ excellent analysis in “Five reasons why Windows 8 has failed” on ZDNet)? Perhaps lower prices will encourage some improvements to Windows 8’s fortunes, but there’s a lot of ground to be made up (Vaughan-Nichols’ analysis shows that usage rates for Windows 8 are only just over half of Windows Vista’s at best, itself no paragon of marketing acceptance or product excellence).
To me what’s interesting about this revelation is that Microsoft is actually willing to reduce OEM licensing costs to add some momentum to Windows 8 purchases. Traditionally, their posture on pricing has been more inflexible, because OEM licensing has been and remains the primary engine for desktop operating system revenues. This testifies to Microsoft’s understanding that Windows 8 is in trouble (or at least, “failing to perform to its expectations”). Hopefully, the pricing changes will be enough to breathe more vigor into Windows 8 sales. That should make watching the sales numbers — and the corresponding usage rates (or as Net Applications calls them “market share”) over the next half year more than ordinarily interesting.
As I explained in my February 25 post “MS Office 2013 Licensing Follies,” MS updated its terms for an Office 2013 license to prohibit transfer from one PC to another, “so if you decommission an older PC running Office 2013, and seek to install the program on a newer replacement PC, you’d theoretically be required to purchase a new license for that replacement machine.” Apparently, the company decided that most buyers would pick the subscription model for Office 2013, where a no-transfer restriction actually has some reasonable basis, and overlooked the fact that those who purchase retail — or, for the readers of this blog, volume — licenses would scream bloody murder at losing what they’ve always considered an inalienable right for their precious Office licenses.
The screaming must have echoed loudly in the hallways of Redmond, because Ed Bott reported this morning (see “Microsoft restores transfer rights for retail Office 2013 copies”) that Microsoft is restoring for Office 2013 the same “perpetual license” terms that applied to the retail versions of Office 2010. This means that those who purchase a retail license (rather than a subscription), or whose licenses depend on access to ISO downloads available through TechNet, MSDN, or volume license agreements and Software Assurance with Microsoft, will get their apparently much-coveted transfer rights back.
If you enjoy seeing Microsoft eat crow, here’s the verbatim text from an Office blog post where you can watch that bird make its way down their gullet:
Based on customer feedback we have changed the Office 2013 retail license agreement to allow customers to move the software from one computer to another. This means customers can transfer Office 2013 to a different computer if their device fails or they get a new one. Previously, customers could only transfer their Office 2013 software to a new device if their PC failed under warranty.
While the licensing agreement text accompanying Office 2013 software will be updated in future releases, this change is effective immediately and applies to Office Home and Student 2013, Office Home and Business 2013, Office Professional 2013 and the standalone Office 2013 applications. With this change, customers can move the software to another computer once every 90 days. These terms are identical to those found in the Office 2010 software.