December 5, 2011 4:16 PM
Posted by: Ed Tittel
CompTIA Storage+ to replace SNIA SCSP
Storage+ poised to become THE entry-level storage IT credential
With the impending release of the CompTIA Storage+ credential on January 18, 2012, the Storage Networking Industry Association (SNIA) has announced that its SNIA Certified Storage Professional (SCSP) credential will be retired the day before. In fact, the SCSP home page currently starts out this way:
Out with the old and in with the new!
I’d wondered how the old entry-level SNIA credential (sometimes known as “Storage Foundations” or “Foundations,” after the name of the associated exam) would compare to the new and presumably improved Storage+, a joint effort from CompTIA and SNIA. Now I guess we ALL know. But I look at this as a good thing: it adds more credibility to CompTIA and its industry partnerships, and it reduces potential confusion when it comes to selecting a good, solid, entry-level vendor-neutral storage exam.
December 2, 2011 2:58 PM
Posted by: Ed Tittel
another reason for UEFI systems revealed for Windows 8
is the world ready to move from BIOS to UEFI-based systems
large disk support in Win8 calls for UEFI
In checking out the latest hubbub on the Building Windows 8 blog this morning, I encountered a new post entitled “Enabling large disks and large sectors in Windows 8.” It’s the work of Bryan Matthew, a program manager on the Windows 8 Storage & File System team, and it’s definitely worth a read-through, not just for its explanation of how the file system in Windows 8 will support disks larger than 2.2 TB directly (without requiring an add-in driver like Seagate’s beyond-2TB software), but also for its explanation of where the technical limitations come from on BIOS-based systems that set the ceiling at 2.2 TB.
A big part of the answer to the support question involves another invocation of UEFI (Unified Extensible Firmware Interface), which is also the key to Microsoft’s boot-time security protection for Windows 8. Here’s what Matthew has to say about UEFI and Windows 8: “Beginning with Windows 8, multiple new capabilities within Windows will necessitate UEFI. The combination of UEFI firmware + GPT partitioning + LBA allows Windows to fully address very large capacity disks with ease.” Although this is nice, it is also another indication that BIOS-based PCs–which represent over 99 percent of the current installed base, including every system I myself own–are heading for second-class status in the brave new Windows 8 world.
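The 2.2 TB ceiling is easy to reproduce with back-of-the-envelope arithmetic: an MBR partition entry records sizes and offsets as 32-bit sector counts, and legacy disks use 512-byte sectors. Here’s a quick sketch (my own illustration, not from the Microsoft post):

```python
# Why MBR-partitioned disks top out near 2.2 TB:
# MBR partition entries store sizes and offsets as 32-bit LBA sector
# counts, and legacy disks use 512-byte sectors.
MAX_SECTORS = 2**32  # largest value a 32-bit LBA field can hold
SECTOR_SIZE = 512    # bytes per sector on a traditional disk

max_bytes = MAX_SECTORS * SECTOR_SIZE
print(max_bytes)                     # 2199023255552 bytes
print(round(max_bytes / 10**12, 2))  # ~2.2 decimal TB, as drive makers count
```

GPT, by contrast, records LBAs in 64-bit fields, which is why the UEFI-plus-GPT combination that Matthew describes lifts the ceiling far beyond any disk you can buy today.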
I’ve recently started speccing out a UEFI-based system with a touchscreen to put a Windows 8 test system together, and I’ve learned that the number of UEFI-based motherboards is still very small (though it’s growing by leaps and bounds every month: when I first checked in September, I could find only 8 UEFI mobos on Newegg; this morning, the site lists 28 AMD-based and 43 Intel-based models). Though I can understand and appreciate the technical advantages of UEFI, I wonder if the world is ready to make a mass migration to a new firmware architecture simply to take best advantage of Windows 8.
I’ve got two newish notebooks and an equal number of newish desktop PCs, and I’m not planning to junk them for at least another 2-3 years. Will the need for UEFI to obtain the best Windows 8 experience force me to change my mind? I don’t know yet. But I do know this is gonna be interesting!
November 30, 2011 4:35 PM
Posted by: Ed Tittel
checking recent hardware changes restores WiMax module to working order
interesting adventures navigating Dell’s tech support offerings
mysterious WiMax outage on Dell M11X solved
I’m the reasonably proud owner of a Dell M11X (Alienware) mini-laptop with an i7 26xx processor, 8 GB RAM, and an OCZ Agility 3 SSD. It runs fast enough for me, and has enough horsepower to do everything I ask of it with aplomb, including some occasionally heavy-duty protocol trace capture and analysis (Wireshark rules!). But yesterday, when I turned it on, I couldn’t help but notice that its built-in Intel WiMax 6250 module had gone bye-bye.
When the laptop booted up, it reported that it couldn’t find the WiMax module. So, I ran the Intel WiMax connection utility to try to kick it into presence. No dice. Next, I read the help file to learn that I should probably duck into Programs and Features to run a “repair” operation on the WiMax connection software. And while that went without a hitch, there was still no WiMax module to be found after my next reboot.
Then, I went into Device Manager and, sure enough, no WiMax device appeared under the Network adapters heading, in contradiction to normal behavior. So I dug into the Dell Technical Support site to see if I could find any joy from their help and advice. Zilch.
After that, I remembered seeing the WiMax module when I removed the unit’s conventional HD a couple of weeks ago to replace it with the SSD it’s now using. “Could I have jiggled an antenna wire, or loosened the module somehow?” I wondered out loud. I popped open the hatch and saw nothing amiss with the WiMax 6250 module upon visual inspection, but pushed on the antenna connections and made sure the module was tightly socketed anyway.
I’m not sure if this fixed the problem, or if the WiMax module had simply taken a short vacation, but when I rebooted the machine–Presto! The WiMax module appeared, and established its usual rockin’ connection to Clear Communications. Problem solved, with not too much muss or fuss, or time wasted.
What it reminded me of is the old adage about troubleshooting: always look at what else has changed recently on a system when components go kerflooey. Again, I can’t be 100% sure that opening the case and checking the WiMax module fixed my problem. But performing those actions did result in a return to normal function, so I’m not inclined to look this gift horse too deeply in the mouth.
I was kinda glad it turned out that way, because I’d started to investigate the Dell Technical Support channels I would have had to traverse had more serious fixes been required. The only phone number I could find was for premium, added-charge support. I’m not sure my same-day hardware replacement warranty gets me that high up the food chain, so I guess I’ll save the effort involved in climbing that learning curve for another day!
November 28, 2011 9:46 PM
Posted by: Ed Tittel
SmartDeploy solves hardware driver issues
SmartDeploy works well with heterogeneous hardware, OSes, and applications
In my previous blog “The Changing Face of OS Migration: Windows 7 and Beyond,” I ruminated on the state of Windows desktop deployments in modern enterprises, based on a November 16 discussion with Aaron Suzuki, CEO of Prowess, the makers of the SmartDeploy toolset for Windows operating systems and applications. In that first blog in this two-part series, I laid out the current situation that the vast majority of enterprises face on their desktops (and servers) today: a heterogeneous situation where OSes are no longer migrated in lock-step, and where the age of the hardware often dictates the OS it uses (particularly on desktops, where enterprise buyers often elect to have vendors provide them with pre-installed reference images on new machines, usually running Microsoft’s latest and greatest desktop OS).
The SmartDeploy Home Page Gives Its Elevator Pitch
In this blog, I’ll recount some of the specific pain points that Mr. Suzuki reported hearing from his customers, and how SmartDeploy Enterprise (and the Prowess sister product SmartMigrate) can help to address them.
Managing Device Drivers Is Key
According to Suzuki, managing device drivers often ends up being a bigger problem than managing the OS itself, and often demands a great deal more time and effort from a maintenance perspective as well. SmartDeploy solves this problem by separating drivers both logically and physically from the OS/software image, packaging these elements separately and distinctly. Each driver package, called a “platform pack” in SmartDeploy parlance, enables enterprises to manage drivers by hardware model, device configuration, and so forth. Some organizations may manage as many as 60 or 70 such driver packs, while others find it necessary to define and manage only a dozen or two. Once a driver pack is defined, the regular SmartDeploy management console handles its application and use simply and directly. This ensures that the right drivers are available at deployment time, and helps ease the most common (and vexatious) issues that can occur during actual roll-outs.
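To make the idea concrete, here’s a minimal, purely hypothetical sketch of drivers keyed by hardware model and kept apart from the OS image. The model names and driver file names are invented for illustration, and this has nothing to do with SmartDeploy’s actual platform-pack format:

```python
# Hypothetical illustration of the "drivers separate from the image" idea:
# one OS image can serve many machine types if driver bundles are keyed
# by hardware model and resolved at deployment time.
driver_packs = {
    "Latitude E6420":  ["nic_intel.inf", "video_nvidia.inf"],
    "EliteBook 8460p": ["nic_intel.inf", "video_amd.inf"],
}

def drivers_for(model, packs):
    """Return the driver bundle defined for a hardware model, or an empty list."""
    return packs.get(model, [])

print(drivers_for("Latitude E6420", driver_packs))
```

The payoff of this separation is that adding support for a new machine type means defining one more entry, not rebuilding or re-testing the OS image itself.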
Single Point of Management and Control for All Windows Machines
Easy definition and management of drivers, plus OS and application images, enables SmartDeploy users to manage multiple Windows environments from the same toolset, all the way back to Windows 2000, along with XP, Vista, and Windows 7. And Prowess is working assiduously to get ready for the new server and desktop OSes Microsoft is expected to release around the end of 2012 as well. This provides a single point of management and control for all Windows desktops and servers in the enterprise, and the product is gaining considerable traction and adoption among a broad range of enterprise customers around the world. With its biggest deployment spanning over 84,000 desktops, Suzuki is also confident of the platform’s ability to scale well, and to handle heterogeneous hardware and operating systems with panache.
SmartDeploy also permits consolidating images from multiple operating systems into a single image file, something that I plan to learn more about after New Year’s from other members of the Prowess technical staff. At that point, I’ll report back with further details. In the meantime, check out the SmartDeploy home page for more information on the company’s products and platforms. Interesting and capable stuff!
November 21, 2011 3:27 PM
Posted by: Ed Tittel
look for increasing desktop heterogeneity in the enterprise
wholesale OS migration and redeployment is fading from the scene
Last week I had the pleasure of talking to Aaron Suzuki, the co-founder and CEO of Prowess, a Seattle-and Salt Lake City-based software company best known for its SmartDeploy product suite. In an upcoming blog, I’ll write about what’s up with SmartDeploy and why it could be interesting to readers responsible for managing packaging, deployment, and maintenance of Windows desktops. In this blog, however, I want to ruminate about the changing face of desktop deployment and use in the enterprise.
Why not let attrition or lifecycle constraints dictate OS selection?
One of the elements of my conversation with Mr. Suzuki that struck me most as we were talking has also come back to me repeatedly as I’ve tried to understand what’s happening in many, many enterprises around the globe. The facts that drive these ruminations are:
- only 20-25 percent of global enterprises have made significant progress in deploying Windows 7 in their production environments.
- wholesale migration is becoming less of a concern in many organizations, where migration can hinge on retirement or replacement of older desktops (which generally have XP installed, while replacements or new purchases generally come with Windows 7 pre-installed).
- it’s increasingly clear that XP won’t have entirely disappeared by the time Windows 8 ships in Q4 2012, which raises the same prospect (older XP and/or Windows 7 PCs on their way out, replaced by newer machines with Windows 8 pre-installed on their way in).
The notion of some kind of massive switch-over, or wholesale “the world now runs Windows X” approach to migration and deployment appears to be fading from the scene. Tight IT budgets, lengthening desktop hardware lifecycles, and the realities of continual purchase of new desktops with new OSes appear to be blurring the lines somewhat. It’s also very much to the good that virtual machines can be made to work on newer OSes with ever-increasing ease and automation, so that runtime access to legacy systems and software remains feasible even as the underlying host architecture continues to change beneath the VM layer.
I’m starting to think that a “we were an XP shop, now we’re a Windows 7 shop” mentality is fading from the scene, too. It’s starting to look like enterprises will use all the tools at their disposal, including a mix of old and new OSes and hardware, along with increasingly sophisticated software tools to help them manage heterogeneous desktop environments, while doing their best to deliver a consistent and positive user experience across all supported environments. In my next blog, I’ll discuss how the Prowess SmartDeploy toolset makes that possible and affordable, as a case in point.
November 18, 2011 2:56 PM
Posted by: Ed Tittel
donate used computers and components for a second life after the business lifecycle ends
The Helios Project repurposes used computers for disadvantaged youth
When PCs reach their end-of-life point within many organizations, the question then arises about what to do with them to achieve their responsible disposal or redisposition. In developing a lifecycle course for HP a few years back, based on HP’s in-house lifecycle expert Bruce Michelson’s excellent book, Closed Loop: Lifecycle Planning (A Complete Guide to Managing Your PC Fleet) (Addison-Wesley, 2007, ISBN-13: 9780321477149, List Price: $44.99), I had cause to ponder the many and various ways of dealing with obsolete or end-of-lifecycle computing gear.
Of course, vendors such as HP, Dell, IBM, and others do offer “take-back” programs to permit companies and organizations to dispose of used computing gear safely. But after its useful life in business is over, older PCs can still enjoy a useful second life in other worlds. I’m not necessarily in favor of outright disposal of such units, no matter how environmentally correct such disposal might be.
To this day, one of my favorite recommended methods for safe disposal of computing gear that adds jobs and income to communities where such equipment is received is through Goodwill Industries International. With a strong presence in North America, and an increasing presence in South America and Asia, Goodwill offers environmentally sound e-waste disposal around the world these days. But they scavenge donated equipment first and foremost for salvage and refurbishing, to gain extra life, money, and work for their employees out of handling these materials. Over the past 10 years, I’ve probably donated over $20,000 in in-kind gifts of computers, monitors, keyboards, networking cards and components, plus miscellaneous computer parts and components galore to Goodwill.
The Helios Project Logo and Slogan
Recently, I heard about a group based in Austin, TX, called The Helios Project. It’s run by Ken Starks, who works out of donated space in Taylor, Texas (about a 40-minute drive away from Austin). This group collects older computers and related parts for refurbishing or what you might call “reconstruction” so they can give working PCs to disadvantaged youths who might otherwise not have access to personal computing power. It’s a great cause and one well worth supporting, and they’re going to be getting my used gear going forward, primarily because they not only see to its safe and productive re-use, but also because the group voluntarily wipes gifted hard disks to DoD erasure standards to protect donors from potential illicit data mining and reuse. Ken and his directors (see “The Helios Project Directors” for some capsule summaries of the movers and shakers there) are also passionately committed to Linux and other Open Source software, and equip all of their outbound PCs with legally licensed operating systems and software that the new owners can maintain without having to incur re-licensing or annual maintenance fees.
I’d encourage you to look for similar kinds of organizations in your communities as potential recipients for used or end-of-lifecycle computing gear. These outfits will work with you to protect your data and information assets, but can also give that gear added life outside the business world. It’s nice to make a difference, and do something good for the community, in addition to practicing safe and secure disposition of no-longer-needed computers and equipment. Please, look around your neighborhood, and see if you can find an outreach organization to support. If not, you can still always turn to Goodwill, and generate some jobs and income in your community through your gifts in kind.
With the holidays approaching, The Helios Project is humping like mad to get a raft of new computers ready to show up under Christmas trees all around Central Texas. I’m driving out to Taylor myself next week to give them a barely used “One Laptop Per Child” machine I purchased from Negroponte’s organization a few years back, plus an Asus netbook PC, along with a collection of surplus memory, disk drives, and other spare parts they can use to bring computers back alive for re-use. Those interested in making financial donations to this organization should send checks c/o Ken Starks, 308A High Estates Drive, Round Rock, TX, 78664. But those checks should be made out to “Software in the Public Interest” (SIPI) with a memo notation that reads “Helios Project” (SIPI is a legitimate 501(c) organization that handles donations for The Helios Project, saving it the expense of registering with the state to process donations and write IRS-accepted receipts for donations to a nonprofit charity).
November 16, 2011 3:31 PM
Posted by: Ed Tittel
another hurry-up Windows Update released 11/11/11
KB2641690 issued for weak or fraudulent security certificates
Remember the hoopla this past summer about fraudulent master-level (intermediate authority) digital certificates showing up in the wild? Well, Microsoft quietly released another out-of-band security update last Friday (11/11/2011) under the heading of KB2641690, with an accompanying Security Advisory. Apparently, Microsoft has also revoked its trust in the Digicert Malaysia certificate authority (doing business as DigiCert Sdn. Bhd.) for violation of Microsoft Root Program requirements (see this Softpedia report for more information: “Microsoft Revokes Trust in Digicert Malaysia Certificate Authority”).
MS Security Advisory for Digital Certificates
The Softpedia story nicely explains why Microsoft took this action, and issued an emergency security update to match:
“Microsoft was notified by Entrust, Inc, a certificate authority in the Microsoft Root program, that a Malaysian subordinate CA, DigiCert Sdn. Bhd issued 22 certificates with weak 512 bit keys,” revealed Jerry Bryant, Group manager, Response Communications, Trustworthy Computing.
“Additionally, this subordinate CA has issued certificates without the appropriate usage extensions or revocation information.”
Microsoft stressed that unlike the DigiNotar scenario from a few months back, this time around attackers did not get the chance to exploit the weak and deficient secure sockets layer certificates issued by Digicert Malaysia.
This is best understood as a pre-emptive measure designed to forestall possible security compromises or potential attack vectors BEFORE they occur. Nevertheless, it also dictates that this security update be fast-tracked into production for the selfsame reasons. Another one for your hurry-up schedule!
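For readers wondering what makes a 512-bit key “weak”: RSA moduli that short were already factorable with modest resources by 2011, which is why 1024 bits was the generally accepted floor at the time. Here’s a minimal sketch of the kind of key-length check involved (my own illustration, not Microsoft’s actual revocation logic):

```python
# Flagging RSA keys whose modulus falls below the accepted minimum length.
# 512-bit keys were factorable with modest resources by 2011; 1024 bits
# was the common floor then.
WEAK_BITS = 1024

def is_weak(modulus):
    """True if the RSA modulus has fewer bits than the accepted floor."""
    return modulus.bit_length() < WEAK_BITS

print(is_weak(2**511))   # a 512-bit modulus like DigiCert Sdn. Bhd.'s -> True
print(is_weak(2**2047))  # a 2048-bit modulus -> False
```

The missing usage extensions and revocation information that Bryant mentions are a separate deficiency: even a long-enough key is dangerous if the certificate doesn’t constrain what it may be used for or provide a way to revoke it.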
November 14, 2011 3:07 PM
Posted by: Ed Tittel
in the Apple world Windows always comes in second
learning a new environment means accepting ignorance
In August, I agreed to help out a good friend–namely Chris Minnick of Minnick Web Services–by tech editing his forthcoming book WebKit For Dummies. Little did I know what profound effects this decision would have on my personal and professional life!
WebKit is a way-cool open source technology that calls itself a Web browser engine, but is really much, much more. It’s also the engine that runs under the hood in Mac OS X beneath Safari, Dashboard, Mail, and lots of other applications that enable access to Internet services (especially those based on the HTTP/HTTPS protocols).
In editing the book, I quickly realized I would have to acquire a Macintosh because that’s what Chris used to generate most of the screen shots for the software referenced in the book. This was followed by the need for an iPhone, representing the mobile platform that WebKit so frequently targets–namely iOS 5. In turn, this led to an iPad so I could see what was up with that “insanely great” tablet, and then an iPod Touch, so my 7-year-old son would give me my iPhone 4S back.
Along the way, I realized several things I’d known in theory but not in practice. First and foremost, it’s both interesting and frustrating to wander into a world where Windows machines are tolerated and encouraged, but where they play a distinctly second-class role. Second and perhaps more stimulating is all the learning I’ve been forced to do to understand how the Maciverse (or is that Appleverse) works, and how I can do what comes quickly and easily to me in the Windows environment on the other side of this street.
Some of this stuff comes naturally or simply, but some of it is a real struggle. There haven’t been too many cases of “you can’t get there from here” so far, but I have found plenty of reasons to take the long way around in figuring out how to move music, video, and other stuff from PC to Apple platforms. I’ve also learned of the nearly irresistible allure of iTunes for the younger generation, which is whacking off hunks of our monthly nut in little but insatiable $0.99 or $1.29 bites.
It’s really great to get into something that’s so well engineered and so attractive looking. I’ve yet to encounter any obvious bugs, either, and so far have had to recover from only one mysterious hardware glitch across all four of my Apple platforms (after shutting down my MacBook Air at a client site last Friday, I couldn’t get the screen to illuminate with proper brightness after a restart at home; a second restart fixed the problem instantly, however).
So here I go again, climbing a massive learning curve. I must say this one is proving to be quite a bit of fun. Now, if only I had more time to really learn EVERYTHING in depth, so that I could ascend to power user status in record time. But that’s not the way the game works, so I’ll just have to keep ploughing away until I can develop some perspective and useful experience. The funny thing is, my first computer — purchased way back in 1984 at the UT Austin Apple Store — was a 512K Fat Mac, and I was a total Mac bigot until 1994 when I switched over entirely to Windows. Now, I’m trying to straddle that gap again, and realizing how much I’ve forgotten but even more how very much things have changed. That’s life!
November 11, 2011 2:00 PM
Posted by: Ed Tittel
Building Windows 8 blog post on Windows 8 power smarts
try out the powercfg utility in Windows 7
Like most of the rest of the Windows-aware world, I’ve been following the various posts on the Building Windows 8 blog with great interest and regular attention. In fact, checking in on that blog has become part of my regular “keeping up with Windows” routine, and routinely serves as fodder for this one.
A particularly interesting post popped up this past Tuesday (11/8/2011) entitled “Building a power-smart general-purpose Windows” from Pat Stemen, a Principal Program Manager on Microsoft’s Windows 8 OS Kernel development team. It not only details the design objectives for power consumption (less power, more efficiency, longer battery life) that inform Windows 8’s ongoing design, it also describes the kinds of testing and measurement Microsoft performs to keep those objectives on-target.
In that vein, I found Stemen’s recitation of how each Windows 8 build’s power consumption profile is measured particularly interesting, as was his explanation of what happens when occasional software changes introduce spikes in that consumption (ultimately, such changes get tracked down and fixed to keep power consumption under control). It also sheds some interesting light on the powercfg.exe utility and its /energy parameter, already available in Windows 7 (as well as Windows 8).
The little-known powercfg.exe command is profiled in this blog
This utility produces the output file identified near the end of the command’s on-screen output as captured above, and it has lots of interesting things to say about how power is being managed (or not). The resulting HTML page includes a plethora of information about which drivers and power settings defeat or turn off power management functions (sleep, hibernation, suspension, and so forth). I also discovered some very interesting timer settings emerging from some surprising programs (TechSmith’s SnagIt and Google Talk, of all things), as well as a pretty detailed list of which runtime processes were responsible for “…significant processor utilization,” to quote from those report headings.
This is an interesting blog post, and it’s well worth reading. If you’ve never played with powercfg.exe before, it’s probably worth a try, too. Please launch cmd.exe using the right-click “Run as administrator” option, though: otherwise, the program will refuse to run.