Boy! Talk about a hidden gem showing up in the Windows trade news. On Halloween evening, Paul Thurrott was obviously not serving trick-or-treaters, because he posted an article at 8:55 PM that evening entitled “Microsoft Embraces HTML 5, Deemphasizes Silverlight.” This in turn cites long-time MS maven Mary Jo Foley, who broke the news on Friday in her story “Microsoft: Our strategy with Silverlight has shifted.” She noticed that Silverlight came in for very little mention at the recent Microsoft Professional Developers Conference (PDC: an annual developer shindig that Microsoft usually hosts in the Seattle area; PDC10 was held this year on October 28 and 29, and the preceding link provides access to video recordings for most if not all of the content from this event).
MJ caught up with Bob Muglia, the MS President who’s in charge of the MS server and tools arm, and asked him what was up with Silverlight and got the following answers (I paraphrase here, see the original for direct quotes):
- HTML5 will be the dynamic Web technology for everything except Windows Phone: it’s the key to building pages that “…feel and run like an app or a game” (this quote from Dean Hachamovitch, MS VP in charge of Internet Explorer, during the opening PDC keynote on 10/28)
- Silverlight remains the development platform for Windows Phone (a new version, Silverlight 5, is in the works, but no timing or release info is available yet)
- HTML(5) is a true cross-platform solution for everything else, including the Apple iOS platform
This is great news for Web developers everywhere, because it means that MS is going to support the full range of HTML5 capabilities in Internet Explorer. Because HTML5 is a real industry standard — albeit an emerging one, with the W3C not committed to a finished recommendation for many years to come — this should help Web site designers, developers, and even companies that build Web development tools and technologies hew the HTML5 line with more confidence and competence than they might otherwise be able to do.
My ISP is Time Warner Cable/Road Runner, to whom I pay about $50 a month for so-called “Turbo Internet” service (which advertises download speeds of 15.0 Mbps, a Turbo with PowerBoost mode that can take those speeds up to 30 Mbps, and upload speeds of up to 3.0 Mbps). Most of the time, I’m very happy with the service they provide. But as is inevitable with any service that provides egress to a public and shared resource, things do bog down from time to time. When such slowdowns are intermittent and infrequent, I’ve been around long enough to know it’s my job to grin and bear it. When such slowdowns become frequent or chronic, I always contact Time-Warner to try to find out what’s up.
Lately, they’ve been employing a couple of tactics that drive me absolutely crazy. Let me describe them, and then I’ll explain why I’m convinced that these tactics are unfair, misguided, and designed to exploit the ignorance of most Internet users about the way networks work and behave.
Tactic 1: The Speed Test
Any time a Time-Warner/Road Runner tech support person analyzes network behavior, they rely on a special network speed test tool. For my part of the world that tool resides at http://speedtest.texas.rr.com/, and it usually produces results that look like this:
What’s wrong with this picture? Not much, apparently, except that the results don’t necessarily reflect the speed of access that Time-Warner/Road Runner provides to the Internet. This speed test accesses the internal Road Runner network, to which my home is directly attached (in fact, the HTML Title value for this Web page reads “TWC Austin RDC Bandwidth Speed Test” in my Web browsers). Thus, it represents a “best-case” value for the maximum upload and download speeds that any node on my home network can achieve going through the gateway and onto the Time-Warner network, irrespective of final destination.
Now let’s compare these values to those returned by the SpeedTest link that’s built into my Network Meter desktop gadget, located online at http://ip-address.cc/speedtest/. I don’t necessarily endorse or recommend this tool, but it’s readily available to me because the link is integrated into the desktop gadget that I use to monitor my network connection (and performance and health values) on my Windows 7 PCs. Here are values recorded within seconds of completing the previous Road Runner sanctioned speed test (and I like them, because they do not reflect some of the pathology I’ve been trying to get Road Runner to explain to me recently):
Notice that the difference you see between the two reports is exactly what I’m trying to get Road Runner to address — namely, why is my connection to the Internet (that vast realm of IP addresses outside the Time-Warner/Road Runner network domains) running slowly? In this case the measurements are 25.48 Mbps inside and 8.46 Mbps outside, a gap of 17.02 Mbps (about 201% of the smaller value and 67% of the greater value). When I call Time-Warner, it’s usually between 12 and 25 Mbps inside and under 3 Mbps outside — slow enough to make my e-mail quit working reliably and to make many Web pages refuse to load.
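For the record, the gap math works out like this — a trivial sketch using the two test results quoted above:

```python
# Inside vs. outside throughput, per the two speed tests described above.
inside_mbps = 25.48   # Road Runner's own speed test (never leaves their network)
outside_mbps = 8.46   # third-party test that crosses onto the real Internet

gap = inside_mbps - outside_mbps
print(f"gap: {gap:.2f} Mbps")                    # gap: 17.02 Mbps
print(f"vs. outside: {gap / outside_mbps:.0%}")  # vs. outside: 201%
print(f"vs. inside:  {gap / inside_mbps:.0%}")   # vs. inside:  67%
```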
Frankly, I don’t necessarily care how fast my connection to Time-Warner’s servers might be, since I don’t use them very often, except perhaps as a sanity check to make sure my premises equipment is working properly (if the connection to the Time Warner network were as slow as my connection to the Internet that would strongly suggest that my network was the source for such problems, rather than what’s happening on the ISP’s side of that interconnection). And in fact, that leads me directly to Tactic number 2.
Tactic 2: It’s your network/PC, stupid!
The last time I contacted Time-Warner tech support was last Friday, when my wife expressed her exasperation at not being able to access her Yahoo email or the Russian-language news sites she accesses every day (both of which are outside the Time-Warner umbrella, and thus subject to the “inside-outside” speed differences I describe in the preceding tactic). I spent a very frustrating half-hour chatting with a support associate who put forward the following assertions:
1. Because Time-Warner’s speed test levels were at or above guaranteed service level values, their network was behaving properly. I get the reasoning behind this, but my service is not called “Time-Warner Turbo local network access”; it’s called Time-Warner Turbo Internet, and the Internet is in fact what I want to access.
2. The next questions from the support tech elicited the facts that (a) I had a LAN at my home and (b) Time-Warner was neither providing nor supporting the boundary equipment (I have a D-Link DIR-655 switch/router/firewall/WAP device at the boundary, and a D-Link DWL-2100AP 802.11g WAP hanging off a NetGear GS108 switch in my office. FWIW, both appear to be behaving properly, and the logs from neither device suggest problems with either one).
3. Because my machines were having problems accessing the Internet at higher (or in fact, acceptable) speed levels, the problems lay with my machines and/or my Internet gateway device. We explored the idea that the Web browser caches needed refreshing and that local PC firewalls were causing the problems, but I had to observe that with up to half-a-dozen machines running different versions of Windows 7 and different Web browsers, it was unlikely that all would suffer the same slowdowns because all their environments were different. It’s also not clear to me how a poisoned Web cache or misconfigured browser could deliver normal results for access to the Time-Warner network and slow results for Internet access without indicating the presence of some kind of bottleneck at the boundary between Time-Warner’s internal networks and the Internet, rather than between my network and theirs.
4. This led to the observation that Ethernet is a shared medium and that individual machines on my LAN would thus have only a fraction of total bandwidth available to them for Internet access. This explanation totally overlooks that only one or two nodes are likely to access the Internet at any given moment, and that the speed difference between inside and outside exceeds anything the ratio of simultaneously active machines to overall available bandwidth could explain. And the same observation I made for the preceding item also applies here — namely, if the slowdown/bottleneck straddled the boundary between my network and theirs, then the speed test results for Time-Warner local access and real honest-to-goodness Internet access would have to be much closer together, and service level guarantees would go unmet for the local connection as well.
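To put a number on that last point: even using the worst inside reading from a typical trouble call, and generously assuming two machines hammer the connection at the same moment, each machine’s fair share of the inside bandwidth still comfortably exceeds what the outside test measures. A back-of-the-envelope sketch:

```python
# Figures from a typical trouble call: ~12 Mbps inside, under 3 Mbps outside.
inside_mbps = 12.0
outside_mbps = 3.0

simultaneous_nodes = 2                  # generous; usually it's just one
fair_share = inside_mbps / simultaneous_nodes
print(fair_share)                       # 6.0 Mbps per node
print(fair_share > outside_mbps)        # True: shared-medium contention alone
                                        # cannot account for the outside slowdown
```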
That was the point where I gave up, and resolved to write this blog to document what I’d experienced. I’m going to send it to a local manager at Time-Warner and ask for a comment. I’ll report back here on what happens, and find myself more than ordinarily curious to see what kind of response I’m going to get.
Frankly, I don’t see how my network can be at fault when it can access the ISP’s own network at or above guaranteed performance values. It’s only the next “big hop” onto the Internet where problems manifest, and as far as I can tell, that has to be their problem, not mine. So when their support associate indicated that I should schedule a for-a-fee service call to have a technician come and troubleshoot my network for me, I kind of lost my cool and terminated the support chat session. Upon reflection, I’m reconsidering and may decide to bring in a technician to see what he or she can find — but only if Time-Warner will refund the charges if the network problems turn out to be on their side of the demarc, not on mine, as I strongly believe will turn out to be the case.
In my blog yesterday (“Happy First Birthday Windows 7”) I observed the one-year anniversary of Windows 7’s General Availability release. This morning, I’m starting to see evidence that some vendors are celebrating with special deals on that selfsame operating system. Though this may fall under the heading of “Any excuse for a sale is a good excuse” it could still offer deals for users seeking to acquire licenses for personal or home machines. Here’s a snapshot from the Tiger Direct Website that colorfully illuminates this observation:
Most of the other vendors I can dredge up using Google and Bing shopping search functions land in the same ballpark (within $10–20 of the prices shown here, invariably on the plus rather than the minus side). Nevertheless, it looks as if the anniversary may offer at least some small dividends to those who want to build new machines, install dual- or multi-boot setups (which require a separate license for each Windows OS you boot), or upgrade XP or Vista machines to Windows 7.
Today it’s been exactly one year since Windows 7 was released to the general public — what Microsoft calls “General Availability” or GA. It’s been a busy, hectic year for the latest Windows OS, and it looks like Microsoft is finally starting to back itself out of the hole that it dug for itself with Windows Vista and the terrible driver debacle that went along with it. For me, the past couple of years have been an incredibly busy period of learning and writing activity, as I’ve learned to understand and use Windows 7 more efficiently and effectively on my own systems and networks, and documented its use in those of others who work on a much grander scale.
For some interesting further ruminations on Windows 7’s birthday, see Preston Gralla’s terrific Seeing Through Windows column at ComputerWorld entitled “On Windows 7’s one-year anniversary, Windows XP still rules” where he points out that despite Windows 7’s dramatic gains in market share, XP still enjoys a 4-to-1 advantage in terms of total deployments. He also observes that at Windows 7’s current growth rate, it won’t surpass XP in the field until Q3 2012 — at which point it’s quite likely that Windows 8 will be on the scene. He further observes that Windows XP is a solid, stable OS that “…keeps chugging along” (in terms of the coming Halloween season, I like to think of it as “the OS that wouldn’t die!” ;-).
A more enthusiastic (and less historical) viewpoint is available in Brandon LeBlanc’s entry on the Windows Blog for today entitled “Celebrating Windows 7 at 1 year — More than 240 Million Licenses Sold.” Check this out for all kinds of breathless statistics that should leave you wondering how the world possibly managed to cope without Windows 7 before October 21, 2009. It also offers no fewer than seven lists of favorite Windows 7 highlights (better grab and drink a cup of coffee before you try to plow through this stuff, though some of it is worth looking at and remembering).
At any rate, it is another anniversary for the world to remember, and on the whole rather better than worse as anniversaries go. Happy Birthday Windows 7!
Right now for $149.99 (or $145 at Amazon, if you’d care to bargain hunt) you can pick up a 3-license upgrade pack for Windows 7 Home Premium. This is truly one of those “act fast, supplies are limited” deals, too. Last year, MS offered the same deal and it lasted less than three weeks, which makes it tempting to speculate that they built a certain number of packages for this offering. When they ran out, the offering ended. I’d be very surprised if things didn’t turn out the same way this time around, too.
While those supplies do last, you can pick them up at all the usual e-tailers or at the Microsoft Store. What’s fascinating to me is that Microsoft has to run special, limited-time promotions to push these multipacks out the door. You’d think they’d want to make these available all the time, and get more copies of Windows out there in the world.
Gadzooks! I almost fell out of my chair yesterday when I checked Windows Update just after lunch, as is my usual wont on Patch Tuesdays. There were no fewer than seventeen (17!!) updates waiting for me to download onto my production machine, and as I worked my way through my notebooks (I’m caring for and feeding 6 of them right at this moment, because my Dad and his D630 notebook are visiting right now, and I bought a D620 for him to take back to Virginia for my nephew, Collin) and four desktop PCs, the number of updates ranged from a high of 15 to a low of 12 for the rest of those machines. Of course, one of those items is the standard Windows Malicious Software Removal Tool (October 2010/KB890830), so there is really one fewer actual security update and patch involved in this latest batch.
For the complete details on this enormous batch of items, see the Microsoft Security Bulletin Summary for October 2010. For other interesting coverage, see Emil Protalinski’s October 2010 Patch Tuesday will come with the most bulletins ever at Ars Technica, and SoftPedia’s Patches Released for 49 Vulnerabilities in Windows, IE, Office, and .NET. Of the 16 actual security bulletins, 4 are rated Critical, 10 Important, and 2 Moderate. Looking over the mix of operating systems, platforms, frameworks, and applications involved, IT or security staff will want to look these over quickly, and get all of the Critical and at least some of the Important and perhaps even Moderate items into testing, and ultimately into deployment, with some dispatch. Windows 7 is now definitely into the mainstream with many more of the individual bulletins listing both 32- and 64-bit versions of Windows 7 among the affected platforms (only 4 of 13 show “not applicable” for those bulletins that target Windows OSes).
I’ve been working on revising a couple of books lately — namely, the CISSP Study Guide (going into a 5th edition) and Computer Forensics JumpStart (for a 2nd edition; both books for Sybex/Wiley) — and in that context found this SuperUser.com Q&A extremely interesting. It’s entitled “What’s the best way to completely remove everything from a computer, without re-installing?” and it addresses sanitization of a PC in advance of its sale to a third party.
In a nutshell, the solution posted to this query essentially involves multiple levels of hard disk clean-up, then free-space scrubbing (so-called/claimed “secure erasure”) to remove all traces of files removed during the clean-up effort (see the posting for the details, which may be of interest to individuals seeking to squeeze a few extra bucks out of personal equipment, perhaps to help fund the purchase of replacement gear). This is all well and good for machines that have never played host to anything sensitive, proprietary, or protected under various rules and regulations governing customer, client, or patient confidentiality.
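For the curious, the free-space scrubbing step boils down to filling a volume’s free space with junk data and then deleting the filler, so previously deleted files can’t be recovered from unallocated sectors. Here’s a minimal single-pass sketch of the idea in Python — the function name and the max_mb cap are my own inventions for illustration; in real-world use you’d reach for a purpose-built tool such as Microsoft’s SDelete, and a single pass is the bare minimum:

```python
import os
import secrets

def scrub_free_space(target_dir, chunk_mb=64, max_mb=None):
    """Overwrite free space on target_dir's volume with random data,
    then delete the filler file. max_mb caps how much gets written
    (handy for testing); pass None to run until the disk fills up."""
    filler = os.path.join(target_dir, "_scrub_filler.tmp")
    written_mb = 0
    try:
        with open(filler, "wb") as f:
            while max_mb is None or written_mb < max_mb:
                try:
                    f.write(secrets.token_bytes(chunk_mb * 1024 * 1024))
                except OSError:   # disk full: the free space is now overwritten
                    break
                written_mb += chunk_mb
    finally:
        if os.path.exists(filler):
            os.remove(filler)     # give the space back
    return written_mb
```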
This is not a viable solution for corporate gear. Under no circumstances should hard disks that have been used for business purposes ever be re-sold to a third party. Baldly put, these devices need to be securely destroyed to prevent their contents from getting into the wrong hands. After reading (and in some cases writing) about the kinds of tools and techniques available for data recovery and restoration, and understanding the liability and risk exposures that unauthorized access to such data can pose, I’m convinced the only truly safe way to dispose of hard disks used for business purposes is to subject them to some process that damages the platters sufficiently that the pieces can’t be put back together again for aggressive scanning and recovery efforts.
That means crushing, shredding, or otherwise mangling the devices so that they simply can’t be accessed and read ever again. With storage as cheap as it is today, it even makes sense to remove and replace drives when equipment is slated for donation to schools or charities, as is sometimes the case with corporate equipment retirements. Anything less risks data discovery, and is simply not worth the potential exposure incurred thereby. Make this stipulation part of your security and lifecycle management policies, and you’ll never have cause to regret this decision.
Back in 2005, I had the privilege and the pleasure to work with Mike Chin (the guru behind SilentPCReview.com) and Matt Wright (a PC video maven who writes for sites like MissingRemote.com and HTPCnews.com) on a book entitled Build the Ultimate Home Theater PC. It let me ride a favorite hobby-horse of mine: namely, an intelligent and productive link-up between a PC and a home entertainment/home theater system. Because PCs can do storage, grab stuff off the Web, and organize music, video, and multimedia so nicely, the marriage between a properly equipped PC and a high-end receiver or pre-amp is appealing to me and all kinds of other audio- and videophiles.
Because of that book, and other writings I’ve done for Tom’s Guide and Tom’s Hardware on multimedia and media center PCs, I still occasionally hear from other movers and shakers in that space. About three months ago, a gentleman by the name of Mike Wigle contacted me about his product, ZZcoustics, probably because I’ve been active in the HTPC world for some time and still write about it from time to time. The product is a small box (dimensions of 4×2×1 in/10.2×5.1×2.5 cm) with a male mini-jack on one side and a pair of RCA jacks on the other side (white and red for left and right stereo channels, respectively).
Unless you’ve gone to the trouble of installing a high-end audio card in your PC (such as the Asus Xonar HDAV 1.3 Deluxe or the HT Omega CLARO Plus), you will probably encounter line output issues should you try to connect the output from a headphone jack or from your PC motherboard’s audio connectors. Simply put, the ZZcoustics does a bang-up job of amplifying the output voltage from its input (the headphone jack side) and delivering standard line-in voltage and impedance from its RCA outputs to your home entertainment sound system.
What does this mean? The oscilloscope audio/video demo on the ZZcoustics site does a pretty good job of showing you: weak, puny audio from the headphone jack on a notebook or desktop PC, blazing hot, clear sound from the output of the ZZcoustics box. At $40 a pop, these units are a great investment for anybody seeking great (stereo) audio from their notebook or desktop PC, or from a compact audio device (like an iPod or MP3 player), into a home entertainment system of some kind.
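For reference, the “standard line-in voltage” at issue is easy to pin down: consumer line level is nominally -10 dBV, and decibel-volts convert to RMS voltage by a simple power-of-ten formula. A quick sketch (these are generic audio reference levels, not ZZcoustics’ published specs):

```python
def dbv_to_vrms(dbv):
    """Convert a level in dBV (decibels relative to 1 Vrms) to RMS volts."""
    return 10 ** (dbv / 20)

# Consumer line level is nominally -10 dBV:
print(round(dbv_to_vrms(-10), 3))   # 0.316 Vrms
# A headphone output feeding a line-level input often falls short of this
# level -- the shortfall a box like the ZZcoustics is designed to make up.
```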
As I was looking over the boot/system disk for my Dell D620 laptop this morning, I discovered a directory named $Windows.~Q that contained nearly 3 GB of data. Recognizing the leading $ (dollar) sign as a technique that Windows uses to hide shares and directories from casual display and access, I looked the directory up online to discover that it is created during the upgrade process from Vista to Windows 7 (you can’t upgrade from XP or earlier systems to Windows 7; for those older Windows versions a clean install is required). Looking around further, I discovered another hidden folder named $INPLACE.~TR as well.
A little quick research online (see this interesting article at the HowToGeek’s site, for example) informed me that not only is it safe to delete these folders and their contents, but that the effort would free up at least 1 GB of disk space, and often more than that (2.90 GB for the $Windows.~Q folder and 667 MB for the $INPLACE.~TR folder in my case, for a total of 3.55 GB). In fact, I learned that the Disk Cleanup utility will display checkboxes for these items after you click the “Clean up system files” button following an initial cleanup scan on a system that has been upgraded from Vista to Windows 7, as the following screenshot illustrates:
I realize the number of machines that will be subject to in-place upgrades from Vista to Windows 7 will be small at any given company or organization, but if you do find yourself in the boat of using Windows upgrade to build a new image, or perform the upgrade on one or more PCs, don’t forget to take this additional cleanup step after the upgrade is complete. In fact, a quick post-install batch file will take care of these two directories quickly, even if Microsoft didn’t see fit to remove them on your behalf automatically.
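By way of illustration, here’s roughly what such a post-upgrade cleanup script might look like. I’ve sketched it in Python rather than as a literal batch file so it can total up the reclaimed space as it goes; the folder paths assume the system drive is C:, and on a live system you’d need to run it elevated (and possibly take ownership of the folders first):

```python
import os
import shutil

# Leftovers from an in-place Vista-to-Windows-7 upgrade (assumes drive C:).
UPGRADE_LEFTOVERS = [r"C:\$Windows.~Q", r"C:\$INPLACE.~TR"]

def remove_upgrade_leftovers(folders=UPGRADE_LEFTOVERS):
    """Delete the given folders and return how many bytes were reclaimed."""
    freed = 0
    for folder in folders:
        if not os.path.isdir(folder):
            continue
        # Total up the space we're about to reclaim.
        for root, _dirs, files in os.walk(folder):
            for name in files:
                try:
                    freed += os.path.getsize(os.path.join(root, name))
                except OSError:
                    pass          # file vanished or is unreadable; skip it
        shutil.rmtree(folder, ignore_errors=True)
    return freed
```

That said, the Disk Cleanup route described above remains the officially supported way to reclaim this space.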
I’ve been a big fan of the SysInternals stuff since the mid-90s, so it was with some interest that I noticed a recent update to their “ultimate Windows start-up management tool” Autoruns.exe on September 29, 2010 (it took me a while to get around to checking out the new release, so that’s why I’m writing about it today, and will probably write about it further fairly soon). Here’s a basic screen cap of what the program looks like:
I’ve been writing regularly about a Windows boot-up optimization tool called Soluto (9/13/2010, 9/16/2010, and again on 9/27/2010) and have thus stayed interested in topics related to speeding Windows boot-up and start-up. When I noticed the latest Autoruns.exe had a tab devoted to Drivers, I decided to drop in and take a look at what Windows was loading on my production machine, and found a bunch of items getting loaded that I knew my current runtime environment wasn’t using and would probably never need — most notably, a whole slew of RAID, Serial Attached SCSI (SAS), SCSI, and Fibre Channel (!) device drivers, but also various AMD drivers for this all-Intel machine, along with some PS/2 stuff as well.
If you’re inclined to avoid loading drivers you’re not going to use, I do recommend that you first make an image backup of your system before turning things off willy-nilly. That way if you turn something off that hoses your machine, you’ll be able to get back to operation if all other fast-fix strategies fail (like returning to the Last Known Good Configuration, or booting into Safe Mode to rerun Autoruns and turn things back on). That said, I turned off all the device drivers I knew weren’t in use on my machine and realized a pretty substantial decrease in boot time: from 1:16 to 0:41 (info courtesy of Soluto), for a savings of 0:35 (46%!!). I’m going to go back and turn off some other things I missed on the first round and see what happens next, but this looks like another great way to boost Windows start-up times substantially. [Note added 15 minutes after original posting: turning off all Adaptec and other SCSI-related storage drivers and a few other odds’n’ends dropped my start-up time to 0:38, exactly half of the original 1:16. Way to go, Autoruns!!]
Soluto, are you guys looking into testing for presence of devices for which drivers are loaded as part of your optimization analysis? If not, let me be the first to recommend adding this to your bag of tricks (which would be easy to implement simply by comparing the results of device enumeration in Windows to the drivers and driver classes actually loaded during startup, and recommending turn-off for those not actually in use). I’m even going to copy the Soluto team on this blog so they can let me know what they are doing in this regard.
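In case it helps the Soluto folks (or anybody else) picture the suggestion, the core of it is just a set difference between the hardware that’s actually present and the drivers that actually load at boot. Everything below is hypothetical illustration — the function name, the dict shape, and the sample device classes are mine; a real implementation would pull this data from Windows device enumeration and the boot trace:

```python
def drivers_without_hardware(enumerated_devices, boot_drivers):
    """Flag boot-time drivers whose device class never appears among the
    machine's enumerated hardware -- candidates for turning off."""
    present_classes = {dev["class"] for dev in enumerated_devices}
    return [drv for drv in boot_drivers if drv["class"] not in present_classes]

# Toy example: an all-Intel machine with no SCSI hardware that still loads a
# SCSI storage driver at boot.
devices = [{"class": "Display"}, {"class": "Net"}, {"class": "DiskDrive"}]
drivers = [
    {"name": "nvlddmkm", "class": "Display"},
    {"name": "arcsas",   "class": "SCSIAdapter"},   # no matching hardware
]
print([d["name"] for d in drivers_without_hardware(devices, drivers)])  # ['arcsas']
```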
Count on me to report further on the latest version of Autoruns, as I spend more time with the program and learn further tips and tricks.
[Note added 10/11/2010: After I shared this blog with the folks at Soluto, they responded by saying they, too, were keenly aware of AutoRuns and while they do have future plans to offer options to remove unused and unneeded drivers from the boot-up sequence, they’re not ready to talk about them just yet. I guess we’ll just have to stay tuned…]