Yottabytes: Storage and Disaster Recovery

June 30, 2011  8:02 PM

The Bandwidth vs. Storage Equation Includes Cost, Availability

Sharon Fisher

People sometimes ask me why I still have DVDs when I have Netflix, why I still have CDs when I have iTunes, why I still have a landline when I have a cellphone and Skype, and why I still have books when I can download all that stuff off the Internet. The thing is, I’m a mistrustful old cuss and I don’t like depending on a single source for things.

Consequently, it makes me nervous when people talk about putting everything on the cloud. Yes, I agree, there are specific use cases where that’s preferable. And there’s replication, multiple copies, worldwide access, I can get to it anywhere. Fine.

But it’s not here. It’s that high tech/high touch thing, as John Naisbitt would say.

So that’s why it was particularly interesting to read a couple of different takes on the cloud and how it relates to storage. Supposedly, part of the reason for going to the cloud is to save money — by paying operational expenses to someone to manage your storage instead of having to pay capital expenses and salary to buy and manage your own storage.

But an organization called Backblaze did a truly wonderful infographic comparing the cost of storage with the cost of bandwidth over time — and bandwidth wasn’t winning. The cost of buying a gigabyte of storage crossed below the cost of downloading at a megabit per second around 1995 — and the gap has been widening ever since. In fact, if bandwidth costs in the U.S. had fallen as quickly as storage costs, we’d be getting 985 Mbps for $45 a month.
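The arithmetic behind that divergence is just compound decline. Here’s a minimal sketch with illustrative rates — the 40% and 10% annual declines are stand-ins I chose for the example, not Backblaze’s actual figures:

```python
def price(p0: float, annual_decline: float, years: int) -> float:
    """Price after `years` of compound annual decline."""
    return p0 * (1 - annual_decline) ** years

# Hypothetical starting prices (indexed to 1.0) and decline rates:
storage = price(1.0, 0.40, 15)    # storage $/GB, falling ~40%/yr
bandwidth = price(1.0, 0.10, 15)  # bandwidth $/Mbps, falling ~10%/yr

# After 15 years, the same dollar buys vastly more storage than bandwidth
print(bandwidth / storage)  # roughly 438
```

Even a modest gap in decline rates, compounded over 15 years, leaves local storage hundreds of times cheaper relative to bandwidth — which is the whole point of the infographic.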

A couple of pundits found this infographic very interesting.

“Cloud computing, if anything, depends on the idea that we will have ample and cheap bandwidth that will allow us to access various types of information and services on any kind of device, anywhere. The rapid growth of cloud as outlined by Amazon CTO Werner Vogels at our Structure 2011 conference only underscores the need for more bandwidth. This need only goes up as we start living in an on-demand world, streaming instead of storing information locally,” wrote Om Malik of GigaOm.

And says Tim Worstall at ChannelRegister UK:

“Yes, access from anywhere is lovely, and being able to get at your data as long as you have access to the cloud is cool. Being able to time-share is also pretty good: we don’t all need to have huge computing power at our fingertips all the time and the cloud can provide us with that when we need it. However, part of the basic contention does depend upon the relative prices of local storage and the cost of transporting from remote storage to local use and usability: in short, the cost of bandwidth. If local disk space is falling in price faster than bandwidth, then the economics are moving in favour of local storage, not the cloud.”

But that’s just the cost factor. Steven J. Vaughan-Nichols at ZDNet went on to talk — in the context of cloud-based applications such as Microsoft Office 365, but the principle is the same — about the dangers of relying on access to the cloud to run a business, and how such access was becoming less sure over time, not more. He points out potential problems such as delays in Internet access during busy times and the increasing number of Internet service providers imposing bandwidth caps:

“Wouldn’t that be just wonderful! Locked out of your local high-speed ISP for a year because you spent too much time working on Office 365 and watching The Office reruns.”

I’m thinking I should buy another terabyte of storage for my home office. Maybe two.

June 24, 2011  5:36 PM

Windows 8 to Include Virtualization?

Sharon Fisher

In a move that might let every individual user learn the joys of virtualization, a Windows blogger, Robert McLaws, discovered this week that the forthcoming Windows 8 operating system has a Microsoft Hyper-V hypervisor built into it.

This made a lot of people very excited (more than 12,000 people had hit Robert’s blog by the time I wrote this). Previously, Hyper-V had only been used as a server-side virtualization utility, noted International Business Times.

So, that leaves us with two major questions:

What does this mean for Windows users?

McLaws laid out a dozen new features with the hypervisor, including support for more than four cores and the ability to support up to a 16-TB disk. In addition, the system has the potential to offer much improved emulation support, including for Windows XP and Windows 7, as well as Windows Phone 7. That’s particularly good for developers writing applications for those operating systems, he says. Linux or Apple operating systems are also a possibility.

It would also help organizations that still run legacy software, says Mike Halsey of Windows 8 News and Tips. “[T]he largest problem facing Windows is the need for legacy support, which can account for 80% of the patches and updates delivered to the operating system, and is also a major factor in some older software not working.”

It could also help protect the machines by virtualizing everything. In particular, it could help solve the problem of people using their business computers for personal reasons, said Eric Knorr and Doug Dineley in InfoWorld.

One basic division would be between a “business virtual machine” and a “personal virtual machine” running on the same client. The business virtual machine would be a supersecure environment without any of the personal stuff users download or run; changes to that business virtual machine would be synced to the server when users were online. If the client hardware was lost or stolen or the user’s relationship with the company ended, the virtual machine could be killed by admins remotely.

(InfoWorld’s J. Peter Bruzzese and ZDNet’s Mary Jo Foley were also competing about which of them had first come up with the idea that Microsoft should do this, with Bruzzese pointing to a 2009 column and Foley pointing to a 2010 article that referenced a 2009 blog post).

Not everyone, though, was enamored. Take Kevin Fogarty of ITworld, who expects it to show up only in high-end versions of the operating system:

It would be a great idea if – at least right now – provisioning, storing, launching and managing VMs on a desktop weren’t already too complicated for most users to handle. Rather than reducing support requirements, it might increase them. It would also confuse users, who often can’t tell the difference between the monitor, the computer, the applications and the “cloud,” about what they’re actually working with, making support calls infinitely longer and even more frustrating than they are now.

What does this mean for VMware users?

Hyper-V isn’t the only hypervisor out there, of course, and what will having Hyper-V built into Windows 8 do for people running VMware? As Fogarty says:

[O]nce all those copies of Hyper-V are running on everyone’s desktop, what possible reason could there be to go buy desktop virtualization from VMware, Citrix or elsewhere. Microsoft would put itself back in the game for the virtual desktop by giving away in the operating system all the goodies other vendors rely on for revenue and growth. I wonder if Microsoft has ever done that before?

Needless to say, the VMware support boards were buzzing with the news as well, and while VMware people weren’t talking, they didn’t sound particularly nervous or surprised, either. “Unfortunately NDAs prohibit the release of any information and VMware won’t officially comment, so we will all just have to wait and see,” noted VMRoyale, one of the moderators of the VMware community. “Who knows – maybe the intent is already there??” replied Maishsk, another moderator.

June 18, 2011  10:08 PM

Executives Leery, Disappointed About Virtualization, Cloud

Sharon Fisher

Vendor surveys are always dicey; it’s remarkable how often survey responses just happen to line up perfectly with the vendor’s product line. But Symantec’s survey on Virtualization and Evolution to the Cloud seems on the up-and-up, if only because the results are such a bummer to virtualization and cloud vendors.

Most notably, organizations that have implemented various kinds of virtualization and cloud technology indicated that they were frequently disappointed when the results did not meet their expectations. Server virtualization was actually one of the better ones, with an average expectation gap of 4% overall, including 7% in scalability, 12% in reducing capital expenditures, and 10% in reducing operational expenditures. In fact, more than half of respondents (56%) said storage costs somewhat or significantly increased with server virtualization.

In contrast, private Storage as a Service had an average expectation gap of 37%, including 34% in scalability, 40% in reducing complexity, and 37% in efficiency. Storage virtualization was almost as bad, with an average expectation gap of 34%, including 32% in agility, 35% in scalability, and 32% in reducing operational expenditures. Hybrid/private cloud computing had a 32% average expectation gap, composed of 39% in time to provision new resources, 34% in scalability, and 29% in security. Finally, endpoint/desktop virtualization had an average expectation gap of 26%, including 27% in new endpoint deployment, 30% in application delivery, and 27% in application compatibility.


Symantec attributed the varying gaps to the varying degrees of maturity of the various technologies, noting that server virtualization is more mature than storage-as-a-service, for example. “Expectations are unlikely to be matched by reality until IT organizations gain sufficient experience with these technologies to understand their potential,” the report said. “These gaps are a hallmark of early stage markets where expectations are out of step with reality.”

Similarly, organizations indicated that they were more willing to virtualize business-critical applications than they were to put them on the cloud — probably for the same reason.

“Among those currently implementing hybrid/private cloud computing, the most common concerns regarding placing business-critical applications into the cloud are related to disaster recovery, security and maintaining control over data. Disaster Recovery concerns were expressed by 70 percent of respondents, and more than two-thirds expressed concerns over loss of physical control over data and fear of hijacked accounts or traffic. Other concerns involve performance and compliance issues.”

The survey also noted that executives were much more hesitant about placing business-critical applications into a virtualized or cloud environment than were more IT-specific people such as server groups and application owners, due to concerns about reliability, security, and performance. At the same time, actual implementation results typically met performance goals. Symantec attributed this misperception in the face of reality to a lack of communication between IT and executives, meaning that executives weren’t hearing from IT the degree to which such implementations actually were successful.

That said, of enterprises that are implementing virtualization, more than half (59%) plan to virtualize database applications in the next 12 months, 55% plan to virtualize Web applications, 47% plan to virtualize email and calendar applications and 41% plan to virtualize ERP applications.

About a quarter of the survey’s respondents said their organizations have already implemented some form of virtualization or cloud, with another quarter in the midst of implementing, 20% in pilot projects, and about 20% discussing or planning for it.

The survey was performed in April and covered 3,700 organizations of various sizes in 35 countries, including small, medium, and large enterprises. Respondents represented a wide range of industries and included a mix of C-level (CIO, CISO, etc.) executives (31%), IT management primarily focused on strategic issues (35%), and IT management primarily focused on tactical issues (34%). 60% were 31 to 49 years of age, with the rest split between those younger than 30 (30%) and those older than 50 (10%). 79% were male. The typical respondent had worked in IT for 10 years. 20% said their companies were shrinking in terms of revenue, while 61% reported growth, Symantec said.

June 12, 2011  9:30 PM

Fusion-io IPO Attracts Investment Excitement to Storage Vendors

Sharon Fisher

“Excitement” and “storage” aren’t really words that go together very often. But this week was different, with the first storage IPO in three years sparking a surge of interest not only in the newly public flash vendor Fusion-io (NYSE:FIO) but investments in several other storage companies as well.

Fusion-io has the advantage of a couple of big names associated with it — chief technology officer Steve Wozniak (am I old enough that I have to explain his connection with Apple?) and Facebook, which uses the company’s storage devices. Another couple of big names that helped were LinkedIn and Zynga, not because they use the company’s products but because their successful computer-industry IPOs in the past few weeks paved the way.

Like LinkedIn, Fusion-io raised the planned price of its IPO the day before it went public, to $16 to $18 per share after originally suggesting it would be priced at $13 to $15 — and then actually priced it at $19, raising $233.7 million and giving the company a valuation of $1.8 billion, according to Investor’s Business Daily.

The Debbie Downers at the Wall Street Journal, however, pointed out a number of issues with Fusion-io:

  • The company doesn’t expect to maintain its growth rate
  • It’s never made a profit for an entire year at a time
  • The nine-month period ending March 31 showed a slender profit of $35,000
  • 10 of its clients account for 91% of its revenue
  • Facebook alone accounts for 47% of its revenue
  • Oh, and by the way, that was going to decrease
That said, for the moment, interest in storage startups is high and people are excited. Really! Look!
  • Virsto Software, a virtual machines storage company, raised $12 million in Series B venture capital funding led by InterWest Partners with August Capital and Canaan Partners also participating
  • Virsto also acquired EvoStor, which specializes in storage virtualization technology for VMware environments, for an undisclosed amount
  • Flash array maker Violin Memory raised a $40 million Series C round from public-market investors
  • VeloBit raised an undisclosed amount of Series A funding from Fairhaven Capital and Longworth Venture Partners
Enjoy it while we can.

June 6, 2011  7:52 PM

Disk Storage Systems Sales Continue to Show Good Growth

Sharon Fisher

IDC recently released its Q1 disk storage systems sales figures, and there’s good news and…well, actually, it’s pretty much just good news, unless you’re Dell or a small vendor.

Here are several aspects of the good news:

  • 13.2% growth in external disk storage factory revenues year over year
  • 17.3% growth in open networked disk storage systems
  • 13.4% growth in open SAN
  • 27.1% growth in NAS
  • 23.0% growth in iSCSI SAN
  • 12.1% growth in total disk storage systems
  • Fifth quarter in a row of double-digit growth
  • 46.3% growth in capacity


Broken down by vendor, in terms of market share, things haven’t changed much, relatively speaking. In external disk storage, the top five vendors are EMC, NetApp, IBM, HP, and Fujitsu — NetApp and IBM swapped places compared with a year ago. In the total open networked disk storage market, EMC led NetApp. Broken out into components, Open SAN had EMC, IBM, and HP; NAS had EMC and NetApp; and iSCSI SAN had Dell, followed by HP and EMC tied for second. Finally, in worldwide total disk storage factory revenue, we have EMC, HP, IBM, Dell, and NetApp, the same order as a year ago.

There are, however, a couple of interesting points to be made:

  • We saw a case of “the rich getting richer.” Generally, the market shares of the top vendors increased, while the market share of “other” decreased.
  • The one exception was Dell, which went from 12.7% to 11.4% — and that was *after* IDC started including Compellent in its figures, following the company’s acquisition. Chris Mellor of the Register points out that Dell fell completely out of the top 5 in external disk revenues, being replaced by Hitachi, with which it had tied in the previous quarter. In fact, in total revenues, NetApp may overtake Dell in the next quarter, he adds.
Now, if IDC ranked its vendors by revenue growth, we’d have seen a different order.
  • In external disk storage, ranked by revenue growth, we’d have seen NetApp, Hitachi, EMC, IBM, and HP.
  • In total disk storage, we’d have seen NetApp, EMC, IBM, HP, and Dell. Mellor points out, however, that NetApp’s growth has slowed compared to previous quarters.

Some of HP’s and EMC’s growth is due to acquisitions — in HP’s case, it’s H3C and 3PAR, while in EMC’s case it’s Isilon.

It will be interesting to see how things change in the next quarter.

  • With everyone talking about the cloud, will fewer people be buying drives?
  • Or will the storage sold to all the cloud vendors make up for it?
  • Or, will the Amazon outage send people scrambling to take care of their own storage again?
  • What will happen to disk storage sales as flash becomes more popular?
  • How might acquisitions in the drive manufacturing space change things in the system space?
  • What will happen with Dell?

And what about Naomi?

May 30, 2011  9:15 PM

Expect a Wave of E-Discovery Acquisitions

Sharon Fisher

Two events happened last week that are expected to lead to acquisitions of a quarter of the electronic discovery vendors by 2014 — and one of them even provided a shopping list.

The first event was security vendor Symantec acquiring e-discovery vendor Clearwell. Dave Raffo already talked about the details of the acquisition; what’s interesting about it in this context is that it’s simply the first domino, as predicted by the second event.

The second event was Gartner releasing its first “Magic Quadrant” analysis of the e-discovery marketplace, which, among other things, predicted that a quarter of all e-discovery companies will be consolidated by 2014, with the acquirers likely to be mainstream companies such as Hewlett-Packard, Oracle, Microsoft, and storage vendors.

Symantec’s acquisition of Clearwell fit right into predictions: Clearwell was named to the leaders quadrant, and Symantec had been named to the challengers quadrant, meaning it primarily needed more vision — which Clearwell could provide.

Now it’s likely that in the kind of musical chairs M&A people go through because they don’t want to be the one standing when the music stops, the sorts of vendors Gartner talked about as acquirers — particularly the major vendors in the challengers quadrant, IBM and EMC, as well as Nuix — will start looking at the list of contenders so helpfully provided in the report.

Likely to be up, of course, are the other vendors in the Leaders quadrant — Autonomy, which itself just acquired niche player Iron Mountain’s digital business; FTI Technology; Guidance Software; and kCura. Less attractive, but also likely to be less expensive and, maybe, more desperate, will be the other vendors: AccessData Group, CaseCentral, Catalyst Repository Systems, CommVault, Exterro, Recommind and ZyLAB in the “visionaries” quadrant, and Daegis, Epiq Systems, Integreon, Ipro, and Kroll Ontrack, as well as the e-discovery components of LexisNexis and Xerox Litigation Services.

Anybody placing bets?

May 24, 2011  4:24 PM

What Zombies Can Teach Us About Disaster Recovery

Sharon Fisher

The Centers for Disease Control and Prevention recently issued an emergency preparedness and response circular about…zombies.

“You may laugh now, but when it happens you’ll be happy you read this,” the circular warns. It goes on to describe the zombie threat, and what people can do to be prepared in the event of a zombie apocalypse.

No, it wasn’t issued on April Fool’s Day, and no, it wasn’t a joke. Well, sort of. Does the CDC really expect a zombie apocalypse anytime soon? No, probably not. But its purpose in creating the alert was deadly serious, and it’s something we can all take lessons from in developing disaster recovery plans:

  • It got attention. More than 1,000 articles were published about the Zombie Apocalypse circular, and the CDC got so many Internet hits that its server crashed. Even so, the zombie circular got 60,000 hits in that first day. In contrast, a typical CDC blog post might get between 1,000 and 3,000 hits, and the most traffic on record had been a post that saw around 10,000 visits, a CDC spokesman told Reuters.
  • Disasters are all the same. Seriously, there’s not going to be that much difference between preparing for an earthquake, preparing for a pandemic, and preparing for…well, a zombie apocalypse. The CDC’s suggested list of preparations included getting together food and water supplies, making arrangements to meet with loved ones, etc. — all the same sorts of things you’d do if you were making *any* disaster plan. True, some of it was on-theme (“Plan your evacuation route. When zombies are hungry they won’t stop until they get food (i.e., brains), which means you need to get out of town fast!”) but generally the suggestions were generic and could apply to any disaster.
  • It bypasses the critical censor. As a former resident of the Bay Area, I can testify that people who live in an area prone to disasters can develop a certain kind of blinders. Yes, we all knew there’d be an earthquake sometime, and some of us had even done some preparation, but in general people don’t worry about it all the time. “Human beings are hard-wired to believe in their heart and soul that disasters don’t happen and won’t happen to them,” Dennis Mileti, a retired University of Colorado sociology professor and researcher, told MSN Money. Writing the circular about zombies allowed people to read it and absorb the lessons without getting into the whole “Oh, I know all that, I don’t need that, lalala” reaction that a more realistic disaster could have elicited. (In point of fact, the CDC wrote the circular to help people prepare for hurricanes.)

It seems pretty self-evident that one could apply these same lessons to writing a disaster recovery plan — just write all the same preparations, but wrap it into another event that could get people’s attention and make them laugh a little as they read it.

I hear the next Rapture is scheduled for October 21.

May 18, 2011  11:22 PM

Researcher Files FTC Complaint Against Dropbox

Sharon Fisher

Well, the other Dropbox shoe has, uh, dropped. In response to last month’s revelation that the Dropbox file sharing service can’t actually promise to keep your files secure, but can look at them and will turn them in to law enforcement if requested, a researcher has filed a complaint with the Federal Trade Commission claiming deceptive practices.

The complaint was filed on May 11 by Christopher Soghoian, who was a busy boy this month; as you may recall, he also hit the front pages by breaking the story on May 3 of an unknown perpetrator, which turned out to be Facebook, attempting to smear Google with privacy accusations.

It turns out that the whole reason Dropbox changed its privacy policy and brought up the issue of law enforcement in the first place was due to Soghoian, who did some research on encryption, deduplication, and how Dropbox saves storage space. As it happens, Dropbox checks to see whether it already has a copy of the file being uploaded, and, if so, puts just a pointer to it in the new user’s space. Very efficient.

The problem is, that’s something someone else can see, too. They can upload a file, and, if far less data is transmitted than the file’s size, they know it’s a file Dropbox already has. This is where law enforcement comes in. Writes Soghoian:

What this means, is that from the comfort of their desks, law enforcement agencies or copyright trolls can upload contraband files to Dropbox, watch the amount of bandwidth consumed, and then obtain a court order if the amount of data transferred is smaller than the size of the file.

Last year, the New York Attorney General announced that Facebook, MySpace and IsoHunt had agreed to start comparing every image uploaded by a user to an AG supplied database of more than 8000 hashes of child pornography. It is easy to imagine a similar database of hashes for pirated movies and songs, ebooks stripped of DRM, or leaked US government diplomatic cables.
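The probe Soghoian describes can be sketched in a few lines. This is a hypothetical model of hash-based deduplication — the in-memory store and function names are mine, not Dropbox’s actual implementation — but it shows why the bandwidth side channel exists:

```python
import hashlib

store = {}  # content hash -> blob, standing in for server-side storage

def upload(data: bytes) -> int:
    """Upload a blob; return the number of bytes actually transferred."""
    digest = hashlib.sha256(data).hexdigest()
    if digest in store:
        # The server already has this content: only the short hash
        # crosses the wire, not the file itself.
        return len(digest)
    store[digest] = data
    return len(data)

# A probing uploader can infer whether the service already held a file:
contraband = b"x" * 1_000_000
first = upload(contraband)   # full upload: 1,000,000 bytes transferred
second = upload(contraband)  # deduplicated: only the 64-byte hex digest
assert second < first        # far less data moved => the file was already there
```

Any party who can measure its own upload bandwidth gets the same signal, which is exactly the court-order scenario Soghoian lays out.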

Do you see how this is even worse than simply Dropbox having to cough up a specific user’s data upon request from law enforcement? Law enforcement can now say, we *know* you have this data online, *you* tell *us* who has it.

And think of how this would play with the new PROTECT-IP bill that’s being proposed, which would let a third party shut down a site for hosting a copy of its intellectual property: Viacom, say, uploads a copy of a movie it suspects is available on Dropbox, finds it’s already there, demands to know who owns it, and then shuts down that company’s site — potentially all without ever getting a warrant, because if Dropbox won’t tell, Viacom can shut *it* down for having a copy of the file. And if Dropbox gets shut down, what happens to all its other, innocent users’ files?

Moreover, Soghoian writes in his complaint, users now run the risk of rogue employees or hackers breaking into the Dropbox system to steal files and the stored keys that enable the company to decrypt and deduplicate files.

Recent high profile data breaches experienced by RSA, Comodo, and LastPass demonstrate that hackers are increasingly sophisticated, and are now seeking out high‐value infrastructure targets that can deliver more than just a few million credit card numbers.

(Oddly, Soghoian doesn’t list Epsilon as one of his examples: the email service bureau was broken into in March, in a data breach whose costs could eventually reach $3 billion to $4 billion.)

Soghoian’s not asking for much in return: just that Dropbox tell people it can decrypt their files (by emailing all its users rather than just changing its terms of service), give refunds to anybody who wants one, and never, ever do it again.

While Dropbox has responded to the basic facts of the complaint on its blog, it hasn’t addressed the security hole that lets law enforcement, or another data owner, tell what’s already on the service by sending up another copy of it.

Between this and Facebook/Google, one wonders what Soghoian’s going to do for an encore.

May 12, 2011  11:06 PM

The ‘Lost or Seized Laptop Data’ Case for the Chromebook

Sharon Fisher

A lot of my friends spent the day scoffing at the notion that anybody would spend $28 a month (for a business user), $20 a month (for a student), or almost $500 to outright purchase a Chromebook, a netbook computer that runs Google apps against data stored entirely in the cloud.

Okay, I’ve got geeky friends. Granted.

The thing is, I think my friends are wrong, and that there’s quite the business case to be made for Chromebooks.

Consider. In March alone, there were several incidents of laptops lost with large amounts of sensitive personally identifiable information. And in recent months, the Ponemon Institute has performed studies about the cost involved in data lost through laptops, both in Europe and the U.S. The numbers are astonishing.

“According to the findings, the number of lost or stolen laptops is huge. Participating organizations reported that in a 12 month period 86,455 laptops were lost or missing. The average number of lost laptops per organization was 263.”

That’s in the U.S. In Europe, the figures are 72,789 laptops lost, and 265 laptops per organization. This adds up to $2.1 billion in the U.S., and 1.29 billion euros in Europe.
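A quick back-of-the-envelope from those reported Ponemon figures, with the $28-a-month Chromebook lease thrown in for comparison:

```python
# Totals as reported in the Ponemon studies cited above
us_total_cost = 2.1e9     # dollars
us_laptops_lost = 86_455
eu_total_cost = 1.29e9    # euros
eu_laptops_lost = 72_789

us_per_laptop = us_total_cost / us_laptops_lost  # ~$24,290 per lost laptop
eu_per_laptop = eu_total_cost / eu_laptops_lost  # ~€17,723 per lost laptop

# One lost U.S. laptop vs. the $28/month business Chromebook lease:
chromebook_years = us_per_laptop / (28 * 12)  # ~72 lease-years

print(f"${us_per_laptop:,.0f} per lost laptop in the U.S.")
print(f"One lost laptop buys about {chromebook_years:.0f} Chromebook-lease-years")
```

In other words, the average cost of a single lost laptop would cover a business Chromebook lease for about seven decades.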

That’d lease a lotta Chromebooks.

But even if companies suddenly became much more careful of their laptops, there’s another issue, one over which they don’t have much control, and that’s search and seizure by the U.S. government.

In August 2009, the U.S. government implemented a new policy for the Department of Homeland Security giving the department the right to search laptops in border areas. The problem is, according to Udi Ofer, advocacy director for the New York Civil Liberties Union, in a letter he wrote to the New York Times in August 2010, Border Patrol agents have the right to conduct such seizures within 100 miles of the U.S. border, which covers much more of the United States than it sounds like. In fact, two-thirds of the population of the U.S. lives in one of those areas, he wrote — and people in those areas could be subject to losing their laptops. (Indeed, the Ninth Circuit Court recently ruled that such laptops could be transported more than 100 miles away for a more thorough search.)

In addition to business executives, this makes two other groups very nervous: Attorneys, who are concerned about privileged client information, and photojournalists, who are concerned about having their pictures taken away. This is why, last September, the National Association of Criminal Defense Lawyers (NACDL), the American Civil Liberties Union (ACLU), and the New York Civil Liberties Union (NYCLU) announced they were fighting this law. (The Electronic Frontier Foundation, which had already been following the issue, supported them.)

The advantage of data in the cloud is, it can’t be seized at the border. You might be out a $500 notebook, but not the much more valuable data that would otherwise be on it.

That’s not to say that data can be stored in the cloud with impunity — there are indications that cloud providers, too, are vulnerable to persuasion from law enforcement. But there’s at least some standard of proof required for that.

And yes, as my friends argued, there are other ways to get thin-client, cloud-oriented notebooks than from Google. But Google is making it simple. And considering how many people are managing to lose their laptops these days, simple may be what we need.

May 6, 2011  8:10 PM

Why al-Qaida Hopes Osama bin Laden Did a Backup, and Other Cautionary Tales

Sharon Fisher

Granted, it’s not every IT administrator who has to deal with a C-level executive in a remote office losing confidential company data because an elite armed military force broke into the place he was staying and took it. That said, there are a number of lessons that IT administrators can take away from this week’s news.

It’s one of an IT administrator’s worst nightmares: losing 10 hard drives, five computers and more than 100 thumb drives. But even if the gear is left in the back of a cab, rather than being taken by Navy SEALs, it’s still a problem. So let’s look at some of the issues.

1. Backups. Did bin Laden do a backup? We already know his system wasn’t replicated, because the news articles have all said he didn’t have Internet access at his compound. If he did do a backup, then what? Was it located in the same hideout, and also taken? Or did someone use Sneakernet — or, in this case, Sandalnet — and manually carry backups to another location? If not, al-Qaida may have permanently lost access to this data. Takeaway: Do backups, and make sure copies are stored off-site.

2. Encryption. Was the data on the hard disks and thumb drives encrypted? If so, how hard is it going to be for computer experts in the government to find a key? Sent through plain text in an email message, perhaps? On one of the thumb drives? Or, Allah forbid, on a yellow sticky on the computer like some offices I’ve seen?

Failing that, how hard is it going to be for government computer experts to crack the encryption? Did bin Laden use 128-bit or 256-bit? What method? Security experts had varying opinions as to whether bin Laden practiced safe computing or used one of his wives’ names as the key, like ordinary people do.
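For what it’s worth, the key length is the least of anyone’s worries; brute-forcing even a 128-bit key is hopeless, which is why the sticky note (or the wife’s name) is the real attack surface. A rough sketch of the arithmetic, assuming a hypothetical attacker testing a trillion keys a second:

```python
# Expected brute-force time for a 128-bit key. The guess rate is a
# made-up figure for a very well-funded attacker, not any real
# agency's known capability.
keyspace = 2 ** 128
guesses_per_second = 1e12                 # a trillion guesses a second
seconds_per_year = 60 * 60 * 24 * 365

# On average you find the key halfway through the keyspace
expected_years = (keyspace / 2) / guesses_per_second / seconds_per_year
print(f"{expected_years:.1e} years on average")  # roughly 5.4e18 years
```

Billions of times the age of the universe, even at that fantasy guess rate, so cracking the passphrase or finding the written-down key is the only realistic route in.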

If the data is encrypted, the U.S. government isn’t saying at this point. Officials are saying the drives contained “very valuable information,” which means either it wasn’t encrypted or it used the encryption equivalent of pig Latin. Or, for that matter, the officials could be shining us on as well. What’re they going to say? “All we found is three seasons’ worth of pirated Friends episodes and some goat porn”?

Ironically, according to MSNBC, this sort of data capture has happened before.

“The most notable previous bonanza that has publicly been revealed was uncovered in July 2004, when al-Qaida computer expert Mohammed Naeem Noor Khan was captured in Pakistan. His laptop computer provided a trove of information and more than 1,000 compact disk drives that were found in his apartment.”

You’d think they’d have learned.

Or maybe they did. One hopes that the government computer experts are taking precautions as well. Keep in mind that a number of incidents of malware — including Stuxnet – have been spread using thumb drives, under the theory that even intelligent people will pick up a thumb drive and pop it onto their computer to see what it does. Says writer Wayne Rash:

“This is exactly what happened a couple of years ago in Iran when the Israeli Defense Forces quietly planted some USB memory sticks in places frequented by Iranian nuclear engineers. Like everyone else, they popped the devices into their computers and the rest is history.”

If U.S. government computers start going nuts in a few days, we’ll know why.
