Last week, I reported that OneDrive raised the ceiling on free storage, and that Google Drive was one of two other free providers (Copy.com is the other) to match its 15 GB allotment. I also reported that Office 365 subscribers obtain a 1 TB OneDrive storage allotment in addition to access to the online components of Office. Today, I'm reporting that Google has fired another salvo in what appears to be an emerging online services and storage conflict: access to Google Apps for Business plus unlimited storage (or 1 TB per user if 5 or fewer users from the same organization sign up) for $10 per user per month. Microsoft charges $12.50 to $15.00 per user per month for 25-300 users ($5.00 per user per month for a minimum of 5 users, but with no access to full installed versions of Office, nor tablet access on iPad or Windows PCs), so Google's offer is very much in the same ballpark, give or take $30-60 per user in annual fees.
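A quick bit of arithmetic, using only the prices quoted above, shows where that $30-60 annual spread comes from:

```python
# Annualized gap between Google's $10/user/month plan and Microsoft's
# $12.50-$15.00/user/month plans, as quoted in this post.
google_monthly = 10.00
ms_monthly_low, ms_monthly_high = 12.50, 15.00

gap_low = (ms_monthly_low - google_monthly) * 12    # low end of the spread
gap_high = (ms_monthly_high - google_monthly) * 12  # high end of the spread
print(f"Annual gap per user: ${gap_low:.0f} to ${gap_high:.0f}")
```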
Google Drive storage on a pay-as-you-go basis now includes Google Apps for Business, plus unlimited storage for organizations with 5 or more user accounts.
I can’t help but see the “unlimited storage” option as Google’s way of inducing prospective customers to buy into its apps package as a natural consequence of acquiring the cloud-based storage they need. On the face of it, the storage is easy to access and simple to use. Because synchronization occurs in the background over time, users don’t notice how long it takes to transfer data from the cloud drive to a local drive, or vice-versa (a test copy of the 3.4 GB Windows 8.1 64-bit ISO file reported instantaneous completion, then produced upload rates between 800 Kbps and 6 Mbps as the copy made its way from my local drive to the Google repository online over a half-hour period or so). Could it be that Google is thinking users will ultimately say “I came for the storage, and stayed for the apps”? That’s my best guess at this point.
One thing’s for sure: the cost of online storage and a safe repository for online backups and other important digital assets, just got a whole, whole bunch cheaper. Look out Carbonite, Mozy, et al: Google is coming to eat your lunch!
Those who follow the Windows OS rumor mill have learned over the past few years that when Russian “leaker” WZOR (an identity shrouded carefully enough that nobody’s sure whether it’s a single individual or a group of like-minded Windows-heads) offers up information, it’s usually worth checking out, and often turns out to be correct.
According to an archetypal “usually reliable source,” Windows 8.1 Update 2 will make its debut here in mid-July.
As reported in Woody Leonhard’s blog post for InfoWorld yesterday (6/26/2014) entitled “Windows 8.1 Update 2 likely hit RTM, could debut in July,” WZOR tweeted two intriguing rumors about the upcoming update to the current Windows desktop OS:
1. The update will be announced at the upcoming Worldwide Partner Conference (WPC) to be held in Washington DC from July 13-17
2. The official sign-off for the Windows 8.1 Update 2 release to manufacturing (RTM) build has reached (or is reaching) final, frozen status some time this week, possibly even yesterday (6/27) or today (6/28)
It’s not unusual for information about impending MS releases to make its way out the door in advance of Microsoft’s own carefully orchestrated PR maneuvers and official announcements. What is unusual about Windows 8.1 Update 2 is that we still know very little about what’s inside the update package. Even Woody can only speculate on its contents and repeat shared beliefs about the upcoming release, which include the following:
1. The Start menu is highly unlikely to make its return in this release, triumphant or otherwise
2. There probably won’t be a mechanism to run Modern UI apps in desktop Windows (a la Stardock’s ModernMix)
3. The update will most likely be available through the Windows Store rather than via Windows Update
4. After a delay (2-3 months, if the Update 1 history is any guide), this Update will become mandatory for those wishing to receive further updates
5. Woody speculates further that “…most bets emphasize plumbing changes in Update 2 and very little if anything on the UI side,” to which I’ll add the observation that earlier Service Pack releases invariably included roll-ups of past patches and fixes: something similar is highly likely to be part of Update 2 as well
Other than that, we don’t know much. But at least, we won’t have to wait too much longer (about three weeks from today) before these rumors can be confirmed, corrected, or denied by the reality presented at WPC — or not, as the case may be. Stay tuned!
Seems like there’s a lot happening with OneDrive lately. Above and beyond the features added or changed in the wake of June’s Patch Tuesday (here’s my blog on that stuff), MS has just announced that users’ free disk allocations are increasing. Everyday users will get their allocations bumped from 7 to 15 GB effective immediately, while Office 365 subscribers now get 1 TB of storage space per subscriber. Group Program Manager Omar Shahine blogged the details about this increase to the OneDrive Blog in a Monday, 6/23/2014 post entitled “Massive increase to OneDrive storage plans: 15 GB free for everyone, 1 TB for Office 365 subscribers.”
By giving away some appreciable space, does MS hope that OneDrive users will buy more at above-average rates?
This morning I was also pleased to see a ZDNet story from security and mobility maven Larry Seltzer entitled “Cloud storage price check” that helps put this offer (and the further implications of buying heavily into OneDrive) in perspective. There, he reveals that Microsoft’s upped allotments are actually fairly generous (only one other player listed, Bitcasa, exceeds its free space offer, while Google Drive and Copy.com match it), but that Microsoft’s prices for additional storage fall at the more expensive end of the spectrum. To be more specific, MS charges between $0.0321 and $0.0388 per GB of online OneDrive storage depending on the size commitment (curiously, with larger commitments costing more per GB, not less!), while the lowest prices, from vendors MediaFire, Bitcasa, and Google Drive, fall between $0.0024 and $0.0098 per GB. Nobody’s saying that the free storage is an attempt to entice users to buy more at higher prices to make up for that lagniappe, but it’s hard not to think such thoughts when looking at the second table in Mr. Seltzer’s very nice article.
At OneDrive’s published pay-as-you-go rates, the 1 TB that Office 365 subscribers receive is worth $32.87 to $39.73 per month. That more than covers the cost of an Office 365 subscription by itself, and may actually provide a measurable inducement for customers who’ve been considering a subscription to go ahead and sign up. If that means you, I’d recommend acting soon to take advantage of this offer: it can’t last forever!
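Those monthly figures follow directly from the per-GB rates mentioned earlier (a quick sketch, taking 1 TB as 1,024 GB):

```python
# Value of the 1 TB Office 365 allotment at OneDrive's published
# pay-as-you-go per-GB rates, as quoted from Seltzer's table.
rate_low, rate_high = 0.0321, 0.0388  # dollars per GB per month
tb_in_gb = 1024

low = tb_in_gb * rate_low     # low end of the monthly value
high = tb_in_gb * rate_high   # high end of the monthly value
print(f"1 TB per month is worth ${low:.2f} to ${high:.2f}")
```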
On a happier front, Jason Moore of the OneDrive team announced this morning (June 25) that MS is pushing updates to OneDrive for iOS and Android to simplify storing photos and videos from their devices onto OneDrive. iOS users will also be able to open Office documents of all kinds directly inside Office Mobile on their iPhones, or Office for iPad on the eponymous device.
I was looking at some screen caps from Sergey Tkachenko over at WinAero.com (for a nice little utility called This PC Tweaker) when I suddenly realized that the Libraries icon and listing had gone missing. After scratching my head a bit, a quick Google search turned up a PC World article from Rick Broida that reminded me that MS had inexplicably turned off this display when it released the 8.1 update. Turns out I’d been selectively restoring this on some, but not all, of my PCs, and I just happened to be working on one that hadn’t yet gotten this interface tweak.
If you’re in the same boat, here’s how to turn the Libraries back on in your Windows 8.1 installation, too:
1. Inside File Explorer, click the View tab.
2. Click Navigation pane, then make sure the checkbox next to Libraries is selected (it’s turned off in the following screen cap, so you’d click it to turn it back on).
A single click turns libraries back on in File Explorer, once you get to the right control.
Though it’s a mystery as to why MS decided to turn Libraries off by default, at least it’s relatively easy to turn them back on. Go figure!
I’ve been shopping around for a new laptop lately. There are lots of interesting choices out there, but I’ve noticed something very interesting about pricing. Let me illustrate:
1. MS Surface Pro 3 (i7, 8 GB RAM): 256 GB SSD model $1,549; 512 GB SSD model $1,949
Difference: $400 (retail cost of a Samsung 500 GB SSD: $280 at Newegg)
2. Lenovo ThinkPad W540 Base model $1,569
Add $140 for i7-4800MQ CPU, $380 for i7-4900MQ, $930 for i7-4930MX
Add $70 for 1920×1080 screen with color sensor, add $200 for 2880×1621 IPS display, $70 more for same with color sensor
Add $250 to upgrade from NVidia Quadro K1100M 2G to K2100M 2G display card
Add $140 to upgrade from 8 GB to 16 GB RAM (4x4GB), $260 for same (2x8GB), $650 for 32 GB (4x8GB)
Add $170 to upgrade from 500 GB 7,200 RPM HD to 128 GB SSD, $270 for 256 GB SSD, $620 for 512 GB SSD
Add $40 to install 40 GB M.2 SSD (no other options available, though 42mm M.2 SSDs up to 256 GB are available for around $180)
3. Fujitsu Stylistic Q704 Hybrid Tablet $1,649 i5-4300U, 128GB SSD, 4GB RAM
Add $160 to upgrade to 8 GB RAM
Add $300 to upgrade to i7-4600U vPRO CPU
Add $200 to upgrade to 256 GB SSD
In cases 1 and 3, the ultra-slim tablet format for each of those PCs means that the builder must often solder devices onto the motherboard rather than use a socket mount of some kind, simply to achieve the narrowest possible height profile. But these makers mark up additions to memory and storage well beyond the retail price difference between higher and lower capacity configurations (and they’re not paying retail, either). To some extent, I can see this as taking advantage of a captive audience, which must either buy the higher capacity from the maker or forgo it completely. I don’t have to like it, though, and it bugs me to see this as standard industry practice.
Lenovo’s W540 is more puzzling. Lenovo is always good about making its maintenance manuals available to owners of its equipment, where they’ll find detailed instructions on upgrading most internal components for those devices that can be swapped out in the field. Here’s what I did to my current traveling laptops, for example:
- I upgraded an X220 Tablet from 4 to 16 GB RAM, from an HD to an SSD, and added an mSATA SSD in the PCIe socket available for either storage or a WWAN device
- I did likewise to a T520 ThinkPad, and swapped out its DVD drive bay for a 2.5″ drive caddy that now accommodates a 1 TB Seagate SSHD for backup and extended storage as well
For the W540, there’s nothing to stop a motivated buyer from purchasing the minimum default configuration, then installing more RAM, swapping an SSD for the 500 GB HD, adding one or two 42 mm M.2 SSDs into the available slots on that machine for up to 512 GB of additional SSD storage, and replacing the optical drive with an HD/drive bay combination. The cost differential between DIY and paying Lenovo to do it for you could be over $1,000, even accounting for extra parts (additional memory modules, the 500 GB default HD, and a drive bay to replace the optical drive) that a DIY-er must buy but Lenovo need not purchase.
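Here’s a rough sketch of that comparison for just two of the upgrades (Lenovo’s prices come from the list above; the retail part prices are my own assumptions, not quotes):

```python
# DIY-vs-vendor comparison for two W540 upgrades. Vendor prices are
# Lenovo's configurator figures quoted earlier in this post; DIY prices
# are assumed retail figures, for illustration only.
vendor_upgrades = {
    "8 GB -> 32 GB RAM (4x8GB)": 650,
    "500 GB HD -> 512 GB SSD": 620,
}
diy_parts = {
    "32 GB RAM kit (assumed retail)": 330,
    "512 GB SATA SSD (assumed retail)": 280,
    "2.5-inch optical-bay caddy (assumed retail)": 15,
}

vendor_total = sum(vendor_upgrades.values())
diy_total = sum(diy_parts.values())
savings = vendor_total - diy_total
print(f"Vendor: ${vendor_total}  DIY: ${diy_total}  Savings: ${savings}")
```

Add the M.2 storage (which Lenovo barely offers) and the rest of the parts list, and the gap widens toward the four-figure differential described above.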
Is there a moral to this story? Yes, actually, there are several. First, if you want to max out ultrabooks or tablets, you and your employer will have to resign yourselves to paying a premium for added processing power, storage, or RAM. Second, if you decide to acquire a field-upgradable notebook or laptop, you may want to perform a time-vs-savings tradeoff analysis for management to ponder. Even accounting for the fully-burdened overhead cost of labor to acquire, install, and test in-house upgrades, it may still be cheaper to handle those tasks yourselves than to pay the premium prices that PC vendors routinely charge to deliver PCs with more oomph than standard configurations provide.
Although Windows 8 versions do a much better job of accommodating and adjusting to solid state drives (SSDs) used for system/boot and other purposes than earlier Windows versions, there are still certain ways to improve upon their default behaviors. As I recently worked through Les Tokar’s excellent article at TheSSDReview.com entitled “The SSD Optimization Guide Ultimate Windows 8 (and Win7) Edition” (5 pp, 4/23/2013), I realized that there are numerous things sysadmins can and probably should do to extract the best possible performance from SSDs. The results can be beneficial: I squeezed an extra 15% of performance from an OCZ Vertex 4 system/boot drive simply by working my way through the list of 21 tips (18 or 19 of them offer substantive “do-this” instructions) around which this guide is built. I suspect that other Windows-heads charged with the care and feeding of systems with SSDs installed can do as well or better by doing likewise.
I skipped three of the steps in the guide as I worked my way through them in numerical order. Two of those I omitted because I didn’t want to implement them on my primary production system: Tip 12 asked me to turn off Windows Search (which I find useful on my old-fashioned data drives, all of them conventional hard disks); Tip 19 (against which Tokar himself inveighs) asks users to tweak BIOS settings to turn off CPU states that produce higher performance at the cost of reliability or system stability. The third, covered in Tip 10, explains how to tweak multi-boot Windows systems for performance gains (my production system only boots Windows 8.1 Update 1). At the end of my efforts CrystalDiskMark 3.0.3 (64-bit version) produced the following not-at-all-shabby results:
Values shown here don’t match those for the latest SATA3 or PCI-e SSDs, but they aren’t bad, either.
I have to believe that working through the list of tips on Windows 8 systems with SSDs installed will be beneficial in many if not most such cases. That’s what makes Tokar’s Guide worth consulting. Check it out!
Though this post wanders a little bit off topic (it actually comes from an app built for iOS and the iPhone), I just had to write about it because it underscores so much of the real value of taking more and better advantage of the computing power present in mobile devices. As I was taking my walk this morning, I was also listening to the news on NPR’s “Morning Edition.” That’s when I heard the story entitled “Father Devises A ‘Bionic Pancreas’ To Help Son With Diabetes,” which recounts NPR reporter Rob Stein’s interactions with Ed Damiano and his 15-year-old son, David, who was diagnosed with Type 1 diabetes at the tender age of 11 months, just over 14 years ago.
To make a long but interesting story short enough for this blog post, it suffices to observe that Damiano changed fields from mathematics to biomedical engineering to do everything he possibly could to help his son cope with the disease, and to try to avoid some of its most damaging long-term side effects, which can include blindness, amputation of the extremities, nerve damage, heart problems, and more. Damiano now works as an associate professor of biomedical engineering at Boston University, and has aimed his professional life and considerable energies toward what Stein describes in his story quite accurately as “developing a better way to care for people with Type 1 diabetes.”
To that end, Damiano has developed what he describes as a “bionic pancreas” to help diabetics manage their blood sugar as effectively as possible, and is working day and night to obtain FDA approval for the device before his son heads off to college in three more years. At present, Damiano monitors his son’s blood sugar with an active blood monitor that triggers an alarm whenever blood sugar levels wander outside an acceptable middle range, so he can infuse insulin when levels climb too high, or another hormone, glucagon, to raise blood sugar when levels drop too low.
This tiny transmitter monitors blood levels constantly and can transfer the data to nearby remote devices or monitors.
David already wears a transmitter on his abdomen that sends data on his glucose levels to a monitoring unit, which in turn triggers the aforementioned alarm so that Ed can take appropriate action to adjust those levels as needed. But Damiano’s work goes beyond that approach in a series of tests and studies currently funded by the National Institutes of Health (NIH) and the Juvenile Diabetes Research Foundation. He’s built an iOS app that communicates with transmitters like the one David wears; dozens of adult and juvenile volunteers will use this system for 11 days entirely on their own. The app manages blood sugar automatically, dispensing insulin or glucagon to adjust levels up or down as needed, using a pair of compact pumps with reservoirs of each substance likewise plumbed into volunteer patients.
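In outline, the closed-loop logic the story describes looks something like this (a purely illustrative sketch; the thresholds and dosing decisions here are hypothetical, not the actual device’s parameters):

```python
# Illustrative sketch of the closed loop described above: read glucose,
# dose insulin when it runs high, glucagon when it runs low. All numbers
# and names here are hypothetical, for illustration only.
def control_step(glucose_mg_dl, low=80, high=180):
    """Return which hormone (if any) to dispense for one reading."""
    if glucose_mg_dl > high:
        return "insulin"    # bring blood sugar down
    if glucose_mg_dl < low:
        return "glucagon"   # bring blood sugar up
    return None             # in the acceptable middle range: do nothing

readings = [95, 210, 150, 65]                  # sample sensor readings
actions = [control_step(r) for r in readings]
print(actions)  # [None, 'insulin', None, 'glucagon']
```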
Early trials show that those volunteers found themselves less worried about managing their blood sugar than ever before. Damiano is absolutely obsessed with getting things ready, working, and approved before his son goes off to school, so it seems pretty likely that his research will lead to a method that helps diabetics manage blood sugar 24/7 without having to resort to regular sticks and manual blood sugar checks.
As amazing as this work is, and as significant as it could prove for diabetics around the world, the real import to me is that we’ve just begun to tap the possibilities inherent in the computing power available to an increasing portion of the global population through smart mobile devices. Trust me when I quote Mr. Joe Walsh on this subject: “You ain’t seen nothing yet!”
I’m a great fan of various websites that track and announce new software and device drivers. Two of my particular favorites are MajorGeeks and Station Drivers. While visiting the former this morning, I was alerted that a new version of Intel’s Rapid Storage Technology had been released: version 184.108.40.2061, to be specific. A quick check of the About screen on my production desktop showed that I was running version 220.127.116.116, so I decided to download and install that update.
Ultimately, I did produce this screen from the latest Intel RST Help-About display, but not without some fear and loathing along the way.
Everything seemed to go swimmingly with the install, right up to the warning at the end of the process that the new version wouldn’t activate until after my next reboot. I elected to delay that reboot for a while, so I could work on other things. But as the following snippet of events from Event Viewer after ultimately firing off the reboot clearly show, something “interesting” happened on the way to a successful restart.
Please note the eight-minute gap from 9:35 to 9:43 in the event log.
The machine started to shut down as usual, then ended at a blue screen with the message “Restarting windows” and the cycling balls. Unfortunately, there it stayed for the next eight minutes, at which point I took my heart in my mouth (and some comfort from a nightly image backup schedule) and hit the reset button on my production PC. My burning question while taking this action was, of course: “Will the machine boot correctly, or is this installation now hosed?”
Luckily for me the answer was “No, it’s not hosed” as Windows then promptly restarted without even complaining about an abnormal shutdown (nor does the event log include such a notification either). What’s highlighted in the event stream above apparently flags my hitting the reset button after eight minutes had elapsed since beginning the restart process. It looks like Windows didn’t hang until the shutdown process was nearly complete, since the OS didn’t complain about an abnormal shutdown. Fortunately for me, that also means that the processes involved in switching from the old version of RST to the new one (or at least flushing out the remains of the old one to be replaced by the elements of the new one upon the next successful boot-up) had already completed. I have to imagine that’s why my machine rebooted properly and is running correctly at this very moment.
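For the curious, spotting a hang like this in an exported event log is easy to automate; here’s a minimal sketch (the timestamps below are illustrative, modeled on the 9:35-9:43 gap shown earlier):

```python
# Find suspicious silences in a sequence of event-log timestamps.
# Sample times are illustrative stand-ins for an exported event list.
from datetime import datetime, timedelta

events = ["09:34:58", "09:35:12", "09:43:20", "09:43:21"]
times = [datetime.strptime(t, "%H:%M:%S") for t in events]

threshold = timedelta(minutes=5)  # anything longer looks like a hang
gaps = [(a.strftime("%H:%M"), b.strftime("%H:%M"))
        for a, b in zip(times, times[1:]) if b - a > threshold]
print(gaps)  # [('09:35', '09:43')]
```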
But dang! I really HATE it when this kind of thing happens, because when storage drivers get interrupted mid-update, there’s always the potential that the entire storage subsystem will be corrupted, damaged, or rendered completely unusable. Although today may be Friday the 13th, I find myself thanking my lucky stars that whatever provoked my system hang didn’t strike until after the key changes to my system had already completed. And while it would have been completely in character for the date to find myself reinstalling the old OS image from last night’s backup, I’m grateful to have been spared that unscheduled administrative chore. Whew! That was a close call…
Patch Tuesday has come and gone and along with it, my Windows 8.1 machines have picked up 20-26 new updates depending on what they’ve got installed (more MS applications generally mean more patches and fixes). One item of particular interest in the latest batch is summarized here:
Attentive users with OneDrive experience will notice lots of changes. There’s a new pop-up dialog that shows sync status and provides buttons to force an immediate sync or to pause one that might be underway (another link opens OneDrive in File Explorer, which was the former default behavior).
Options to control sync are immediately available, and one more click takes you to the File Explorer view.
Right-clicking the notification icon also now produces more options, including a jump to the OneDrive website, immediate access to the sync troubleshooter (“View sync problems”), OneDrive storage controls and related settings, and Help information. Storage options even let you force local copies for OneDrive contents so all files can be accessed offline (and will be synced automatically the next time you go online). All in all a nice set of changes, and a simpler, more understandable set of controls. Looks like MS got something right, and really made some improvements.
In the past few weeks, I’ve been involved in some BIOS updates, and also with upgrading the firmware on a number of SSDs from makers including Samsung, Intel, OCZ, and Plextor. On a couple of occasions, the installation routine has called for converting an ISO to a bootable image so the computer can work its magic outside the Windows OS environment, usually in the embrace of an alternate Linux-derived OS that runs the installer and firmware update process independently. This is often handled by burning a bootable CD or DVD to perform the necessary tasks, but that comes with some time disadvantages: namely, it takes a while to burn optical media, and optical drives generally run at the bottom of the secondary storage performance hierarchy (slower than everything else: hard disks, SSDs, and USB flash drives).
Here’s Rufus with the Windows 8.1 ISO file as its install target, but any ISO will do.
That’s why I turned to the latest version of Pete Batard’s excellent tool Rufus (The Reliable USB Formatting Utility), currently out in version 1.4.9 (Build 506) as I write this blog post. It works with any .iso file to build a bootable UFD that delivers the contents of that image to the target system at boot-time, simply by targeting the host UFD as the focus for the next boot-up. I used it to build bootable UFDs to update the BIOS on one of my Lenovo laptops (the X220 Tablet, model 4294-CTO). I also used it to update the firmware on the Samsung 840 SSD in my wife’s primary desktop machine. In both cases, it took less than 5 minutes to prepare the bootable UFD, and a similar amount of time to boot the machine from the UFD and let the corresponding utilities do their thing.
Rufus excels at building OS install UFDs so I was pretty familiar with the program already. This added capability makes it incredibly handy in those occasional situations where vendors don’t provide firmware or BIOS update utilities that run inside the normal host OS, but instead require a boot-up into an environment that they control completely (the Paragon disk migration and re-org tools do this as well, but they’ve created their own standalone environment that handles the process for you transparently, from start to finish). Good stuff!