Although Windows 8 versions do a much better job of accommodating and adjusting to solid state drives (SSDs) used for system/boot and other purposes than earlier Windows versions did, there are still ways to improve upon their default behaviors. As I recently worked through Les Tokar’s excellent article at TheSSDReview.com entitled “The SSD Optimization Guide Ultimate Windows 8 (and Win7) Edition” (5 pp, 4/23/2013), I realized that there are numerous things sysadmins can and probably should do to extract the best possible performance from their SSDs. The results can be substantial: I squeezed an extra 15% in performance from an OCZ Vertex 4 system/boot drive simply by working my way through the list of 21 tips (18 or 19 of them offer substantive “do-this” instructions) around which this guide is built. I suspect that other Windows-heads charged with the care and feeding of systems with SSDs installed can do as well or better by doing likewise.
I skipped three of the steps in the guide as I worked my way through them in numerical order. Two of those I omitted because I didn’t want to implement them on my primary production system: Tip 12 asks users to turn off Windows Search (which I find useful on my old-fashioned data drives, all of them conventional hard disks); Tip 19 (against which Tokar himself inveighs) asks users to tweak BIOS settings to turn off CPU power-management states, a change that produces higher performance at the cost of reliability or system stability. The third, covered in Tip 10, explains how to tweak multi-boot Windows systems for performance gains (my production system boots only Windows 8.1 Update 1). At the end of my efforts, CrystalDiskMark 3.0.3 (64-bit version) produced the following not-at-all-shabby results:
Values shown here don’t match those for the latest SATA3 or PCI-e SSDs, but they aren’t bad, either.
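For a quick before-and-after sanity check, you needn’t even fire up a full benchmark suite. Here’s a crude Python sketch of the idea (function names and parameters are my own invention, and this is no substitute for CrystalDiskMark’s rigor): it times a sequential write to estimate raw throughput.

```python
import os
import tempfile
import time

def rough_write_speed(path, size_mb=64, block_kb=1024):
    """Crudely estimate sequential write throughput in MB/s."""
    block = b"\0" * (block_kb * 1024)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size_mb * 1024 // block_kb):
            f.write(block)
        f.flush()
        os.fsync(f.fileno())  # push data to the device, not just the OS cache
    elapsed = time.perf_counter() - start
    return size_mb / elapsed

target = os.path.join(tempfile.gettempdir(), "ssd_bench.tmp")
try:
    print(f"~{rough_write_speed(target):.0f} MB/s sequential write")
finally:
    os.remove(target)
```

The `os.fsync` call matters: without it you’d mostly be timing the OS write cache rather than the drive itself.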
I have to believe that working through the list of tips on Windows 8 systems with SSDs installed will be beneficial in many if not most such cases. That’s what makes Tokar’s Guide worth consulting. Check it out!
Though this post wanders a little bit off topic — it actually comes from an app built for iOS and the iPhone — I just had to write about it because it underscores so much of the real value of taking more and better advantage of the computing power present in mobile devices. As I was taking my walk this morning, I was also listening to the news on NPR’s “Morning Edition.” That’s when I heard the story entitled “Father Devises A ‘Bionic Pancreas’ To Help Son With Diabetes,” which recounts NPR reporter Rob Stein’s interactions with Ed Damiano and his 15-year-old son, David, who was diagnosed with Type 1 diabetes at the tender age of 11 months, just over 14 years ago.
To make a long but interesting story short enough for this blog post, it suffices to observe that Damiano changed fields from mathematics to biomedical engineering to do everything he possibly could to help his son cope with the disease, and to try to avoid some of its most damaging long-term side effects, which can include blindness, amputation of the extremities, nerve damage, heart problems, and more. Damiano now works as an associate professor of biomedical engineering at Boston University, and has aimed his professional life and considerable energies toward what Stein describes in his story quite accurately as “developing a better way to care for people with Type 1 diabetes.”
To that end, Damiano has developed what he describes as a “bionic pancreas” to help diabetics manage their blood sugar as effectively as possible, and is working day and night to obtain FDA approval for the device before his son heads off to college in three more years. At present, Damiano monitors his son’s blood sugar with a continuous glucose monitor that triggers an alarm whenever blood sugar levels wander outside an acceptable middle range, so he can infuse insulin when those levels climb too high, or another hormone, glucagon, to increase blood sugar when levels drop too low.
This tiny transmitter monitors blood glucose levels constantly and can transfer the data to nearby remote devices or monitors.
David already wears a transmitter on his abdomen that sends data on his glucose levels to a monitoring unit, which in turn triggers the aforementioned alarm so that Ed can take appropriate action to adjust those levels as needed. But Damiano’s work goes beyond that approach in a series of tests and studies currently funded by the National Institutes of Health (NIH) and the Juvenile Diabetes Research Foundation. He’s built an iOS app that communicates with transmitters like the one David wears; dozens of adult and juvenile volunteers will use this system for 11 days entirely on their own. The app manages blood sugar automatically, dispensing insulin or glucagon to adjust levels down or up as needed, using a pair of compact pumps with reservoirs of each substance likewise plumbed into volunteer patients.
Early trials show that those volunteers found themselves less worried about managing their blood sugar than ever before. Damiano is absolutely obsessed with getting things ready, working, and approved before his son goes off to school, so it seems pretty likely that his research will lead to a method to help diabetics manage blood sugar 24/7 without having to resort to regular sticks and manual blood sugar level checks.
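The control logic behind such a system is simple to state, even if the real engineering is anything but: dose insulin when blood sugar runs high, glucagon when it runs low, and do nothing in between. Here’s a toy Python sketch of that two-hormone decision rule (the thresholds are invented for illustration; the actual system’s algorithms are far more sophisticated):

```python
def dose_decision(glucose_mg_dl, low=80, high=180):
    """Toy two-hormone decision rule: insulin brings high blood
    sugar down, glucagon brings low blood sugar up, and readings
    in the middle range need no action."""
    if glucose_mg_dl > high:
        return "insulin"
    if glucose_mg_dl < low:
        return "glucagon"
    return "none"

# A few sample readings (mg/dL) and the resulting actions:
for reading in (250, 60, 120):
    print(reading, "->", dose_decision(reading))
```

Run continuously against a glucose transmitter’s feed, a rule like this is what replaces the manual alarm-and-infuse routine described above.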
As amazing as this work is, and as significant as it could prove for diabetics around the world, the real import to me is that we’ve just begun to tap the possibilities inherent in the computing power available to an increasing portion of the global population through smart mobile devices. Trust me when I quote Mr. Joe Walsh on this subject: “You ain’t seen nothing yet!”
I’m a great fan of various web sites that track and announce new software and device drivers. Two of my particular favorites are MajorGeeks and Station Drivers. While visiting the former this morning, I was alerted that a new version of Intel’s Rapid Storage Technology had been released: version 220.127.116.111, to be specific. A quick check of the About screen on my production desktop showed me that I was running version 18.104.22.1686, so I decided to download and install that update.
Ultimately, I did produce this screen from the latest Intel RST Help-About display, but not without some fear and loathing along the way.
Everything seemed to go swimmingly with the install, right up to the warning at the end of the process that the new version wouldn’t activate until after my next reboot. I elected to delay that reboot for a while, so I could work on other things. But as the following snippet of events from Event Viewer, captured after ultimately firing off the reboot, clearly shows, something “interesting” happened on the way to a successful restart.
Please note the eight-minute gap from 9:35 to 9:43 in the event log.
The machine started to shut down as usual, and ended at a blue screen with the message “Restarting windows” and the cycling balls. Unfortunately, there it stayed for the next eight minutes, at which point I took my heart in my mouth (and some comfort from a nightly image backup schedule) and hit the reset button on my production PC. My burning question while taking this action was, of course: “Will the machine boot correctly, or is this installation now hosed?”
Luckily for me, the answer was “No, it’s not hosed”: Windows promptly restarted without even complaining about an abnormal shutdown (the event log includes no such notification, either). What’s highlighted in the event stream above apparently flags my hitting the reset button after eight minutes had elapsed since the restart process began. It looks like Windows didn’t hang until the shutdown process was nearly complete, since the OS didn’t complain about an abnormal shutdown. Fortunately, that also means the processes involved in switching from the old version of RST to the new one (or at least flushing out the remains of the old one, to be replaced by the elements of the new one upon the next successful boot-up) had already completed. I have to imagine that’s why my machine rebooted properly and is running correctly at this very moment.
But dang! I really HATE it when this kind of thing happens, because when storage drivers get interrupted in mid-update, there’s always the potential that the entire storage subsystem will be corrupted, damaged, or completely unusable. Although today may be Friday the 13th, I find myself thanking my lucky stars that whatever provoked my system hang didn’t strike until after the key changes to my system had already been completed. And while it would have been entirely in character to spend the morning reinstalling the old OS image from last night’s backup, I’m grateful to have been spared that unscheduled administrative chore. Whew! That was a close call…
Patch Tuesday has come and gone, and along with it my Windows 8.1 machines have picked up 20-26 new updates, depending on what they’ve got installed (more MS applications generally mean more patches and fixes). One item of particular interest in the latest batch is summarized here:
Attentive users with OneDrive experience will notice lots of changes. There’s a new pop-up dialog that shows sync status and provides buttons to force an immediate sync or to pause a sync that might be underway (another link opens OneDrive in File Explorer, which was the former default behavior).
Options to control sync are immediately available, and one more click takes you to the File Explorer view.
Right-clicking the notification icon also now produces more options, including a jump to the OneDrive website, immediate access to the sync troubleshooter (“View sync problems”), OneDrive storage controls and related settings, and Help information. Storage options even let you force local copies for OneDrive contents so all files can be accessed offline (and will be synced automatically the next time you go online). All in all a nice set of changes, and a simpler, more understandable set of controls. Looks like MS got something right, and really made some improvements.
In the past few weeks, I’ve been involved in some BIOS updates and also with upgrading the firmware on a number of SSDs from makers including Samsung, Intel, OCZ, and Plextor. On a couple of occasions, the installation routine has called for converting an ISO to a bootable image so the computer can work its magic outside of the Windows OS environment, usually in the embrace of an alternate Linux-derived OS that runs the installer and firmware update process independently. This is often handled by burning a bootable CD or DVD to perform the necessary tasks, but that comes with some time disadvantages — namely, it takes a while to burn optical media, and optical drives generally sit at the bottom of the secondary storage performance hierarchy (slower than everything else: hard disks, SSDs, and USB flash drives).
Here’s Rufus with the Windows8.1-iso file as its install target, but any iso will do.
That’s why I turned to the latest version of Pete Batard’s excellent tool Rufus (The Reliable USB Formatting Utility), currently out in version 1.4.9 (Build 506) as I write this blog post. It works with any .iso file to build a bootable UFD that delivers the contents of that image to the target system at boot-time, simply by targeting the host UFD as the focus for the next boot-up. I used it to build bootable UFDs to update the BIOS on one of my Lenovo laptops (the X220 Tablet, model 4294-CTO). I also used it to update the firmware on the Samsung 840 SSD in my wife’s primary desktop machine. In both cases, it took less than 5 minutes to prepare the bootable UFD, and a similar amount of time to boot the machine from the UFD and let the corresponding utilities do their thing.
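Under the hood, the heart of what any imaging tool does is a raw, block-for-block copy of the image onto the target device (Rufus does much more than this, of course: boot records, partition tables, and so on). Here’s a minimal Python sketch of just that core step, with ordinary files standing in for the .iso and the UFD device, and all names my own:

```python
import os

def raw_copy(src, dst, block_size=4 * 1024 * 1024):
    """Copy src to dst in fixed-size blocks, flushing to disk at the
    end, the way an imaging tool pushes an .iso onto a UFD."""
    copied = 0
    with open(src, "rb") as s, open(dst, "wb") as d:
        while chunk := s.read(block_size):
            d.write(chunk)
            copied += len(chunk)
        d.flush()
        os.fsync(d.fileno())  # make sure the bytes really hit the media
    return copied
```

On a real UFD, `dst` would be the raw device node, and making the result bootable is a separate job entirely — which is exactly the part Rufus handles for you.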
Rufus excels at building OS install UFDs, so I was pretty familiar with the program already. This added capability makes it incredibly handy in those occasional situations where vendors don’t provide firmware or BIOS update utilities that run inside the normal host OS, but instead require a boot-up into an environment that they control completely (the Paragon disk migration and re-org tools do this as well, but they’ve created their own standalone environment that handles the process for you transparently, from start to finish). Good stuff!
Next Tuesday, June 10, is not only Patch Tuesday for the month, but also marks the point by which those running Windows 8.1 who get their updates from Microsoft Update or Windows Update MUST have applied the April 2014 Windows 8.1 Update patches to keep receiving patches and fixes for their Windows installations (those who use WSUS or other staging services to get updates, usually internally managed, have until the August 2014 Patch Tuesday to do likewise). With a major and important fix for a long-standing Internet Explorer vulnerability (Larry Seltzer says it’s likely to appear as security bulletin MS14-030) in the mix, and another remote code execution vulnerability labeled “critical” that affects all supported versions of Windows (including Server Core), plus various other server add-ons and even MS Office (2007 and 2010), it’s probably a good idea to get right with those April updates before next Tuesday rolls around.
A preview of the seven updates scheduled for next Tuesday from the MS Advance Notification.
Of course, the real issue in achieving a successful application of the Windows 8.1 Update is KB 2919355, about which I have blogged repeatedly in the past couple of months (and about which Woody Leonhard has done the best job of digging into the dirty details over at InfoWorld; see my post “Woody Gets Down and Dirty on KB 2919355 Details” for pointers and links). For most Windows 8.1 installs, KB 2919355 (which includes a new installer that is apparently involved in such problems as can occur) mounts without a hitch. But for a small but tangible percentage of Windows 8.1 installs, applying KB 2919355 fails, provoking a pretty spectacular array of error codes (7 or 8 at last report, with more possibly still to come). For those unlucky few, a reinstallation of Windows 8.1 seems to be the surest fix (a thankless and time-consuming task). But that’s how things work, or sometimes don’t, in Windows-land. For those in need of next Tuesday’s updates who find themselves in this fix, it’s time to get cracking to prepare for that deadline…
Every now and then, it’s a good idea to run a reality check against Microsoft’s relentless drive to bring all parties into the Windows 8 fold ASAP. Netmarketshare’s latest “Desktop Share by Version” does a pretty good job of showing where the market is today, versus where Microsoft would like us to be. Take a look at this pie chart, so I can use it to put some slices of that pie into context, especially where business use is concerned:
On one side, we have Windows 7 with about half the pie; on the other, everything else, with Windows XP alone claiming about a quarter.
Source: www.netmarketshare.com (Operating Systems/Desktop Share by Version 6/4/2014)
What this tells me is that Windows 7 has finally reached maturity and now represents the primary desktop OS for business (and most other) uses. Given that we’re almost two months past the “end of life” for XP, it’s interesting that just over one quarter of the installed base is still running that venerable old operating system. One wonders what percentage of such users are business-based (I’m guessing it’s at least half, perhaps more) and how quickly that slice of the pie will deflate. Given that XP’s share has dropped by 50% in about 18 months, it seems reasonable to speculate that it will drop by 50% again by the end of 2014.
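For what it’s worth, that guess implies an accelerating decline. Under a steady half-life of 18 months (a simplifying assumption on my part), the arithmetic looks like this:

```python
def projected_share(current_pct, half_life_months, months_ahead):
    """Project a market share that halves every half_life_months."""
    return current_pct * 0.5 ** (months_ahead / half_life_months)

# XP at roughly 25% in June 2014, halving every 18 months (assumed):
print(round(projected_share(25.0, 18, 6), 1))   # ~6 months out: 19.8
print(round(projected_share(25.0, 18, 18), 1))  # 18 months out: 12.5
```

By this steady-decay model, XP would still hold nearly 20% at year’s end; halving again by then requires the drop-off to speed up considerably.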
Will Windows 8 versions pick up that slack, or will Windows 7 slide inexorably into the other half of the pie as XP shuffles off the stage? That’s the really interesting question, indeed, and one I’m sure that keeps some people at Microsoft up at night. Let’s keep watching, and see what happens.
I’m reading some reports from CompuTex 2014 right now with a degree of bemusement. Among the slew of new product announcements typical of this event, there’s news from Asus about a new convertible tablet named the Transformer Book T300 Chi that’s billed as “the world’s thinnest 12.5-inch detachable tablet” (quote source: Engadget). At 7.3 mm thick (0.29″) it beats the previous record holder in this interesting category (the Microsoft Surface Pro 3, 9.1 mm) by a significant 1.8 mm (~0.071″).
Last week we saw the SP3 at 9.1 mm; this week it’s the Asus Transformer Book T300 Chi at 7.3 mm; what’s next?
Details on the new offering are still pretty sketchy. According to FirstPost (my best source of such skinny as is available at the moment), we’ve got a device with some kind of Haswell (or possibly Broadwell) “next-gen Core microarchitecture” (undoubtedly of the ULV variety, given the form factor), “up to 128 GB of flash” SSD, 4 GB of DDR3 RAM, and this surprising element: “up to 1 TB of storage” (which may mean a hybrid HD, or perhaps an add-in second mSATA SSD). I’m pretty sure it will be available in at least i3 and i5 models, but would be surprised if i7 also joins the mix, because the slim package doesn’t leave much room for battery capacity. The keyboard dock is itself quite slender: total thickness for the device with keyboard and tablet folded together is reported at a mere 14.3 mm, which indicates the dock is only about 7 mm (~0.28″) thick on its own. I have to suspect that doesn’t leave much room for a second internal battery, nor for too many additional ports (from the various photos currently available, there may be one or two USB ports of some kind, possibly a mini DisplayPort, and perhaps a Kensington Lock port, but it’s really hard to tell).
One thing’s for sure: aspiring to build the thinnest tablet is an interesting proposition nowadays, and apparently gives contenders little or no time to rest on their accomplishments before being dethroned. But at 7.3 mm, Asus may be in a position where other vendors will be challenged to match them, let alone surpass them. We’ll just have to wait and see about that!
The Llama Mountain motherboard faces one way, the sisterboard the other way, to help heat dissipation. The entire package is just a little bigger than an iPhone. Wow!
[Note added 6/3/2014 10:22 AM: Part of the mystery behind the new Asus tablet is addressed by coverage from Computex that emerged this morning, regarding the so-called “Llama Mountain” prototype laptop from Intel, which provides a reference implementation built around the upcoming Broadwell incarnation of the company’s ultra-low voltage Haswell processors, now called the “Core M” (M is for Mobile) line. This is a “die shrink” from 22 to 14 nm of the already-popular Haswell family, and results in a set of mobile processors for which all members enjoy a TDP of under 10 watts. The best Llama Mountain story I’ve found so far — and there are already oodles of them out there — is from TheNextWeb and is entitled “Intel introduces incredibly thin Llama Mountain reference design running Windows 8.1.” Among other things, this story reveals that the Asus Transformer Book T300 Chi is indeed based on the Core M processor line, that it’s an entirely fanless design, and that Intel’s own reference tablet is a mere 7.2mm thick itself. TheNextWeb includes photos of the motherboard (which contains RAM and CPU plus the bulk of the PC’s circuitry) and a so-called “sisterboard” that houses SSD storage and the Wi-Fi radio components. The whole package is just slightly bigger than an iPhone (looks like a 4c model from what I can tell from the side-by-side photo). Here’s my favorite quote from the story: “Because of the small motherboard, the 12.5 inch tablet can house more batteries. The board is flanked by two batteries that give the device 32 hours of battery life. The tablet without the keyboard is 670 grams and 7.2mm thick. It’s thinner than an iPad air but running Windows 8.1 Pro.”
My! My! My! Hasn’t the tablet scene just suddenly gotten a lot more interesting? I have two thoughts on this: Poor Microsoft, with its triumphant SP3 announcement from last week now completely eclipsed. And is it possible that Apple will finally start feeling some heat from the long-downtrodden PC-derived tablet sector?]
A few months back, I published a whimsical post right here entitled “Windows 8.1 Bada Bing!?” (March 3, 2014). It not only reported on rumors of a low-cost/no-cost version of Windows 8.1 (which turned out to be correct), but also speculated that it “…could be offered as an ‘upgrade’ to Windows 7 users who might otherwise be disinclined to migrate to Windows’ upcoming Update 1 release scheduled for April 8 or thereabouts” (which turned out to be wrong, wrong, wrong).
Windows spokesguy Brandon LeBlanc explains what Windows with Bing is all about: cheap bits for cheaper low-end PCs.
You can find Mr. LeBlanc’s post on the subject and see for yourself, but these two blue-lined “call-out” points plucked from its content (plus the title of the blog itself) pretty much tell the whole story:
- scale Windows to an even greater number of customers with more partners and new devices at a broader range of price points
- Windows will be available for 0 dollars to our hardware partners
Combine all this happy rhetoric with the retuning of Windows 8.1 Update’s hardware requirements to 16 GB of storage and 2 GB of RAM, and you’ve got a more competitive recipe for low-end devices: one that doesn’t impose a hefty hardware burden on device builders, and likewise limits the cost of climbing aboard the Windows wagon to grab some OS bits along the way. Throw in LeBlanc’s remark that “some of these devices, in particular tablets, will also come with Office or a one-year subscription to Office 365” and the whole thing starts to make sense.
My favorite part of the release: the line where Mr. LeBlanc reminds us that “… more people … will have access to an even broader selection of new devices with all the awesomeness that Windows 8.1 provides, and get Office too…” Who knew? Look out Android: here comes Windows!
When I saw the headline at pocket-lint.com this morning, my first thought was “marketing/advertising spam!” But when I dug a bit more deeply into “SSD breakthrough makes drives 300 per cent faster and 60 percent more power efficient” it actually started making some kind of sense. Here’s the scoop, in the simplest possible terms: SSDs use a complicated method to write data to their storage chips, which basically depends on writing entire “disk blocks” (memory blocks/pages, really) any time new information needs to be recorded. Researchers at Chuo University in Japan have figured out a new method that uses a “logical block address scrambler” to keep data from being written to a fresh page, targeting it instead at a block that’s scheduled to be erased in the next garbage collection sweep anyway. This means fewer new pages to write, less copying of data from page to page, and longer drive lifetimes (less writing overall means more time until the so-called “write ceiling” for individual memory locations is reached).
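To make the idea concrete, here’s a drastically simplified Python model (entirely my own invention, not the Chuo University algorithm) of why redirecting rewrites into blocks already awaiting erasure consumes fewer fresh pages:

```python
def pages_consumed(writes, gc_slated, scramble):
    """Count fresh flash pages used for a sequence of logical writes.
    gc_slated: logical addresses living in blocks already awaiting
    erasure in the next garbage-collection sweep."""
    fresh = 0
    for addr in writes:
        if scramble and addr in gc_slated:
            continue  # redirected into a block due to be erased anyway
        fresh += 1
    return fresh

writes = [1, 2, 3, 2, 1, 4, 2]
slated = {1, 2}  # blocks holding addresses 1 and 2 await GC
print(pages_consumed(writes, slated, scramble=False))  # 7
print(pages_consumed(writes, slated, scramble=True))   # 2
```

With scrambling on, only the writes outside the GC-bound blocks consume fresh pages; everything else piggybacks on space that was about to be reclaimed anyway — hence fewer writes, less copying, and less wear.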
This time, it’s not faster hardware that delivers performance and power gains: it’s smarter software!
The real beauty of this discovery is that it involves only firmware changes and doesn’t require new hardware. Assuming SSD controller makers would be willing to implement these new algorithms (and why would they risk incurring customer ire by refusing to do so?) your next SSD firmware update could be something really special indeed. I’m going to be very curious to see if the big players in this space — Intel, Samsung, Marvell, LSI, and so forth — will ante up and invest in existing products, or if this will become a strictly next-generation-only phenomenon.
This should be an interesting set of developments to watch for and track, that’s for sure! How often does one get a shot at a threefold speedup, while saving energy at the same time? That’s why my initial take was “Too good to be true!” Only time will tell if that turns out to be the case, not because the speed-up and savings aren’t possible, but because the SSD makers may decide not to offer retroactive updates to owners of existing — especially older — SSDs.
[Note added 5/27: Robin Harris, a knowledgeable and tech-savvy writer for ZDNet, posted a story this morning about this research entitled “An SSD speed jump by four-times? Not so fast.” He provides more context, including the revelation that the research (unsurprisingly) is based on a simulation of how NAND Flash behaves and works in real devices. He provides some useful analysis, too, and concludes with this “bottom line” remark: “Don’t expect to see this impressive speed up in the next three years. Flash vendors have other techniques for speeding flash performance.” So indeed, perhaps it was truly “too good to be true!”]