Yottabytes: Storage and Disaster Recovery


November 18, 2014  2:11 PM

Here’s the One Thing to Look At to See If Your Hard Drive Will Fail

Sharon Fisher
Backup, Storage

Anyone who’s had a hard drive fail just as they were about to back it up (honest!) will understand how much we’d all like to know when our hard disks are about to fail.

Some time ago (between 1995 and 2004, depending on how you count), a standard was developed called Self-Monitoring, Analysis and Reporting Technology (SMART, get it?) that was intended to help with this problem.

Unfortunately, like many other technologies, its user experience was not the best. SMART defines — and measures, for those vendors that support it — more than 70 characteristics of a particular disk drive. But while it’s great to know how many High Fly Writes or Free Fall Events a disk has undergone, these figures aren’t much use for actually predicting a hard drive failure.

Part of this is because of the typical problem with standards: Just because two vendors implement a standard, it doesn’t mean they’ve implemented it in the same way. So the way Seagate counts something might not be the way Hitachi counts it. In addition, vendors might not implement all of the standard. Finally, in some cases, even the standard itself is…unclear, as with Disk Shift, or the distance the disk has shifted relative to the spindle (usually due to shock or temperature), where Wikipedia notes, “Unit of measure is unknown.”

That’s not going to be helpful if, for example, one vendor is measuring it in microns and one in centimeters.

There have been various attempts at dealing with this problem of figuring out which of these statistics are actually useful. One in particular was a paper presented at the 2007 USENIX conference by three Google engineers, “Failure Trends in a Large Disk Drive Population.” What was interesting about Google is that it used enough hard drives to be able to develop some useful correlations between these 70-odd (and some of them are very odd) measurements and actual failure.

Now there’s sort of an update to that paper, but it uses littler words and is generally more accessible to people. It’s put out by Brian Beach, an engineer at BackBlaze; we’ve written about them before. Like Google, their insights into commodity hard disk drives are useful, simply because they use so darn many of them.

What BackBlaze has done this time is look at all the drives they have that have failed, and then go back and look at all their SMART statistics, and then correlate them. The company also looked at how different vendors measure these different statistics, so they have a good idea about which statistics are relatively common across vendors. This gives us a better idea of which statistics we should actually be paying attention to.

As it turns out, there’s really just one: SMART 187 – Reported_Uncorrectable_Errors.

“Number 187 reports the number of reads that could not be corrected using hardware [Error Correcting Code] ECC,” BackBlaze explains. “Drives with 0 uncorrectable errors hardly ever fail. Once SMART 187 goes above 0, we schedule the drive for replacement.”
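If you want to watch this attribute on your own drives, the `smartctl` utility (from the smartmontools package) can dump a drive’s SMART attribute table. Here’s a minimal Python sketch that pulls the raw value of attribute 187 out of that output and applies BackBlaze’s replace-on-nonzero rule. It assumes the default `smartctl -A` table layout, where rows begin with the attribute ID and the raw value is the last column; that layout can vary by drive and smartmontools version.

```python
def raw_value(smartctl_output, attr_id):
    """Return the raw value of a SMART attribute from `smartctl -A` text output."""
    for line in smartctl_output.splitlines():
        fields = line.split()
        # Attribute-table rows begin with the numeric attribute ID;
        # RAW_VALUE is the last column in the default layout.
        if fields and fields[0].isdigit() and int(fields[0]) == attr_id:
            return int(fields[-1])
    return None

def should_replace(smartctl_output):
    """BackBlaze's rule of thumb: schedule replacement once SMART 187 exceeds zero."""
    errors = raw_value(smartctl_output, 187)
    return errors is not None and errors > 0
```

In practice you’d feed it the output of something like `smartctl -A /dev/sda`. Keep in mind that some vendors report attribute 187 under slightly different names (often Reported_Uncorrect), and some pack multiple counters into the raw field, so treat this as a starting point rather than a finished monitoring tool.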

Interestingly, this particular statistic isn’t even mentioned in the Google paper, nor is it called out in the Wikipedia entry for SMART as being a potential indicator of imminent electromechanical failure.

BackBlaze also discusses its results with several other statistics, and explains why it doesn’t find them useful. Finally, for the statistics wonks among you, the company also published a complete list of SMART results among its 40,000 disk drives. (And for some, that’s still not enough; in the comments section, people are asking BackBlaze to release the raw data in spreadsheet form.)

In addition to giving us one useful stat to look at rather than 70 un-useful ones, this research will hopefully encourage hardware vendors to work together to report their statistics more meaningfully, and for software vendors to develop better, more useful tools to interpret the statistics.

Disclaimer: I am a BackBlaze customer.

October 31, 2014  10:15 PM

Yes, Cops Can Make You Use Your Fingerprint to Unlock Your Phone

Sharon Fisher
Encryption, privacy, Security, Smartphones

While courts are still arguing back and forth about whether people can be compelled to give up the encryption key for their laptops and other devices, it looks like they may have decided that it’s okay to force you to use your fingerprint to unlock smartphones with that capability.

Judge Steven C. Frucci, of Virginia Beach, Va., ruled that David Baust, who was charged in February with trying to strangle his girlfriend, had to give up his fingerprint so prosecutors could check whether his cellphone had video of the incident. 

The distinction that courts draw in general is that a physical thing, like a key to a lockbox, is not protected by the Fifth Amendment. But the “expression of the contents of an individual’s mind,” such as the combination to a safe, is protected. Courts have been debating for a couple of years now about whether an encryption key is something you have or something you know. A fingerprint, however, is something you have, similar to the way that you can be compelled to give up a blood sample to test for alcohol, ruled the judge.

Phones that include fingerprint detectors include the Apple iPhone 5S and the Samsung Galaxy S5, according to the Wall Street Journal. In fact, when phones with fingerprint capability came out last year, organizations such as the Electronic Frontier Foundation and other legal experts warned that this could happen. “It isn’t hard to imagine police also forcing a suspect to put his thumb on his iPhone to take a look inside,” Brian Hayden Pascal, a research fellow at the University of California Hastings Law School’s Institute for Innovation Law, told the Journal last fall. Ironically, fingerprint scanners were supposed to make the phones more secure.

This also fits in with recent moves from companies such as Apple to make encryption the default on smartphones so the companies can’t be compelled to reveal information on the phones. If the phone is protected only by a fingerprint, then police could use the fingerprint to decrypt data on the phone. “One of the major selling points for the recent generation of smartphones has been that many of them don’t save their data in a way accessible to anyone without the phone itself,” writes Eric Hal Schwartz in In the Capital. “It’s something that has annoyed law enforcement like FBI director James Comey, but it chips away at some of that much-touted privacy if police can get into a phone with your fingerprint without your permission.”

Actually, Frucci made a distinction between Baust giving up his fingerprint, which he could be forced to do, and giving up a password for the phone, which the judge said he could not be forced to do. In other words, if the smartphone were protected by both a fingerprint and a password — say, if the phone had been turned off — prosecutors would still be out of luck. If you’re concerned about this, some people recommend turning off your phone when police approach, or deliberately failing the fingerprint unlock several times, to force the phone to require a password.


October 30, 2014  6:37 PM

The Breakup of the Great Ottoman Storage and E-Discovery Empires

Sharon Fisher
Autonomy, Documentum, E-discovery, ediscovery, EMC, HP, Storage, Symantec

With this being the centennial of the start of World War I, and with what’s going on in the storage industry lately, it isn’t surprising if you’re also being reminded of the decline and fall of the Ottoman Empire.

Well, okay. Maybe only if you’re a history buff.

In case you were dozing in the back row during world history class in tenth grade (or, if, like me, your history teacher was actually a repurposed Latin teacher and you spent all but the last two weeks of the school year on Greece and Rome, meaning you covered a millennium a day those last two weeks), the Ottoman Empire lasted in one way, shape, or form for more than 500 years. It spanned three continents — Europe, Asia, and Africa — and contained 29 provinces and many other states. But it fell during World War I, and nations such as Britain and France carved up the pieces willy-nilly in ways that made sense to them, without paying much attention to cultural boundaries or what the people in those states might actually want to do. (In fact, some of the current conflict in the Middle East dates directly back to those actions. But I digress.)

Any of this ringing a bell yet?

So at this point, in the storage and e-discovery industry that this blog covers, we have not one but three Ottoman empires potentially in the process of dissolving, with a bunch of people on both the outside and the inside watching and speculating about how the pieces might all eventually fit together.

We’ve already talked about EMC, which is under pressure from shareholders to break itself up so the pieces can be worth more — a case of the whole being worth less than the sum of the parts. It isn’t clear yet exactly what’s going to happen with EMC, though there’s been plenty of speculation. (To further complicate things, EMC and Cisco are breaking up their partnership, which resulted in the converged infrastructure joint venture VCE, with EMC taking control of it. More pieces to juggle.)

In the meantime, both HP and Symantec have announced their intentions to split in two. HP’s pieces are going to be one for its printer and PC business, and one for its corporate computer hardware and software business. Symantec’s pieces are going to be one for its security management products and one for its information management products.

And while the Britains and the Frances of the computer industry are arguing over the bigger pieces and how they will best fit together, other people — especially in e-discovery — are talking about some of the other pieces that haven’t gotten as much love lately and how this could all work out for them.

The HP split, for example, could result in new support for Autonomy, which HP bought for what everyone — including HP — agrees was way too much money. Not only was it not great for HP, but it hasn’t been too great for the Autonomy people either, who are kinda HP’s red-headed stepchildren.

The HP split, in fact, is “probably good news for long-suffering customers of the former Autonomy products,” writes Tony Byrne of Real Story Group. “You know why? Because things couldn’t get much worse for them.”

Meanwhile, Gartner pointed out this summer in its e-discovery Magic Quadrant that although it still positioned Symantec in the Leaders quadrant, its Clearwell product — one of the first big acquisitions in the 2011 e-discovery land grab — had languished under Symantec’s control. Or, as Gartner puts it, “The innovation pipeline for the eDiscovery Platform has slowed during Symantec’s acquisition and integration of Clearwell Systems, resulting in the product’s lack of growth and new releases.”

(Keep in mind that Autonomy and Clearwell had both individually been listed in the Leaders quadrant in the original 2011 e-discovery Magic Quadrant. Almost makes you wish that some company that really had a great vision for e-discovery would buy both pieces, integrate them, and really do it right.)

At the same time, some people are looking at some of the less-loved, neglected pieces of EMC, such as Documentum, and thinking that maybe there’s some way these could get involved, too.

“[Documentum] doesn’t seem to play a role in EMC’s survival,” writes Virginia Backaitis in CMSWire, before going on to suggest that HP buy it and integrate it with Autonomy. “In EMC’s quarterly call with investors last week, neither EMC CEO Joe Tucci nor his lieutenants (David Goulden, CEO of EMC Information Infrastructure and CFO Zane Rowe) uttered the name of its spawn at all.”

It remains to be seen how the various pieces of all three companies will combine (hopefully not in some e-discovery version of Iraq, with different factions battling for control). If nothing else, it could mean that next year’s Gartner e-discovery Magic Quadrant, which has been pretty much of a snore the last couple of years, has the potential to be a lot more interesting.


October 29, 2014  9:57 PM

To Heck With Station Wagons. How Much Data Fits on BART?

Sharon Fisher
privacy, Security, Storage

Periodically, people take the new capacity of storage media — not to mention the ever-increasing sizes of motor vehicles — and use it to recalculate that lovely statistic, “what is the bandwidth of a station wagon full of tapes speeding down the highway?” So now we have a new one — how much data goes back and forth to major cities, especially via public transit?

We now have that data courtesy of Mozy, a cloud backup service that describes itself as the “most trusted.” (Exactly how they figured out it was the “most trusted,” they don’t say.) According to the company, when you add up laptops, smartphones, personal hard drives, thumb drives, and so on, you end up with a pretty horrendous amount of data leaving the office every day:

  •  The average commuter takes 470GB of company data home with them at the end of every day – 2,500 times the amount of data they’ll move across the Internet in the same timeframe
  •  Every day, 1.4 exabytes of data moves through New York City alone – that’s more data than will cross the entire Internet in a day
  •  As much as 33.5PB of data will travel over the Oakland Bay Bridge every day
  •  As much as 49 PB of data will travel through the Lincoln Tunnel each day
  •  Up to 328PB of data travels in the London Tube network every day
  •  Up to 69PB of data leaves Munich’s Hauptbahnhof on a daily basis
  •  The Paris Metro carries as much as 138PB of data every day

(There are also some really cool maps showing where the data is coming from.)

There is, however, one flaw in the Mozy description, which is that it refers to this phenomenon as a “data drain.” That’s not really accurate. A “brain drain,” for example, typically refers to people leaving an area. Their brains are therefore gone from the area. But this data isn’t actually leaving the area, in the context of it being gone. Instead, the data is copied. This leads to its own issues, such as version control, security, and simply taking up much more storage space than is really required. (Good thing storage is so cheap these days, amirite?)

And certainly one could quibble with the figure. Mozy doesn’t explain the methodology, but presumably it’s adding up the storage in each of the devices that people carry back and forth. And who knows, really, how much of it is actually corporate data, and how much of it is cat pictures? That said, it’s certainly a fun back-of-the-envelope statistic to calculate.
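For the curious, the back-of-the-envelope arithmetic is easy to sketch: treat the commuter as a data link and divide the bits carried by the travel time. Here’s a quick sketch; the 470GB payload is Mozy’s figure, while the one-hour trip is a made-up assumption for illustration.

```python
def sneakernet_gbps(payload_gb, transit_seconds):
    """Effective bandwidth of physically carrying data: bits moved / travel time."""
    bits = payload_gb * 1e9 * 8          # decimal gigabytes to bits
    return bits / transit_seconds / 1e9  # gigabits per second

# One commuter hauling 470 GB over a one-hour trip works out to roughly 1 Gbps,
# sustained, per person, per trip.
rate = sneakernet_gbps(470, 3600)
```

Multiply that by a few million commuters and the petabyte and exabyte figures above stop looking so implausible, even if latency is, shall we say, terrible.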

Anyway, it’s the security issue that is particularly catching Mozy’s interest. “With 41.33 percent of people having lost a device that stores data in the past 12 months, huge amounts of business data is put at risk every rush hour,” the company writes. “There isn’t a CIO we know who would risk sending massive volumes of data over the internet without protecting it first.”

Well, we have to say, Mozy must not know very many CIOs. That aside, the company has a point: with all the evidence we have of companies and governments behaving badly with personally identifiable data, there’s an awful lot of data at risk every day.

“A thief holding up a New York subway car at rush-hour capacity could walk away with over 100TB of data,” the company notes. (Which actually sounds like an interesting premise for a movie. Starring Denzel Washington? Jeff Goldblum? Sandra Bullock?)

This commuting data is vulnerable in two ways, Mozy notes. First, bad guys could get access to the data. Second, the person with whom the data is riding could lose access to the data, if that data is the only copy. “It’s also the most-critical data; the edits to the contract that we’ve just worked through in today’s meeting, the presentation that we’re giving tomorrow morning, the tax forms that you’re halfway through filling in,” the company writes. “Losing this data can have an immediate impact on a company’s success.”

Mozy, however, doesn’t go far enough. Let’s go to the root cause: Why are people taking so much data home with them? And if this is something we don’t want to have happen, what is the alternative? There’s already been any amount of hand-wringing over the notion of people setting up Dropbox and similar accounts to make copies of corporate data. Is carrying the data on a device more or less secure, or desirable, than saving it to a public cloud service?

Either way, device or cloud, it boils down to the same issue: People are making copies of the corporate data, by and large, because they feel they need to do that to do their jobs. So either there isn’t a reliable way for them to gain access to the corporate data they need any other way, or, if there is, they don’t know about it.

The point being, if people feel they have to do this to do their jobs, then you need to give them a better way. Simply issuing an edict that Thou Shalt Not is not going to work, even if you put teeth in it. Because, ultimately, they’re not as afraid of you as they are of their boss.


October 27, 2014  10:24 PM

Here’s Something Practical You Can Do to Prepare for Ebola

Sharon Fisher
Disaster Recovery

Depending on whom you ask, either everything is fine or we should be locking up the country to protect us all from Ebola. And frankly, at this point we don’t really know for sure. But there is something you can do: Make sure your company is prepared.

Whether it’s Ebola, enterovirus 68, or the flu, it’s possible for an illness to affect your company and the world it operates in. In a sense, preparing for a pandemic of any sort is no different from preparing for any other natural disaster, whether it’s a flood, hurricane, or tornado. (Except that a pandemic can last for weeks or months, while natural disasters are typically over in a few days.)

So, think about the same sorts of preparations you’d make for, say, a blizzard, and adapt them a bit.

  • Make sure staff can work from home, whether it’s because they can’t use the roads or because there’s a quarantine. Do they have the access they need? Computer equipment? Passwords? Is there a virtual private network set up to help protect the company when people are dialing in over a public network? Check with everyone and get these things set up now to make sure they’ll be available at a moment’s notice should you need them. And have everyone test their systems periodically to make sure they can still get online.
  • Make sure that there isn’t any single point of failure in the daily processes. Are there passwords or procedures that only one person knows? What happens if that person gets sick or can’t make it into the office, for whatever reason? The disaster recovery manual — you have one, right? — also needs to be accessible from a remote location.
  • While there’s not really an Ebola vaccine yet, there are vaccines for other illnesses with the potential to become pandemics, such as the flu. To ensure that employees get vaccinated, arrange to have someone from the health department come in to administer vaccines — and if necessary, have the company pay for it, to ensure that everyone gets vaccinated. The cost is minimal compared with the cost of the lost productivity if employees get sick.
  • It never hurts to stock up on hand sanitizer and alcohol wipes, and talk with staff — including the custodial staff — about how to keep from spreading germs. And while you’re at it, make sure people know that they should stay home if they or a family member is sick. If your company doesn’t currently offer paid sick leave, it might be a good time to add it. Again, think you can’t afford it? How well could you afford having half the office sick?

Hopefully, none of these plans will be needed. But in case they are, you’ll want to be ready, and in the meantime, it’ll give you something practical to do that could be useful sometime. As the saying goes, prevention is the best medicine.


September 30, 2014  10:57 PM

Will Only Outlaws Be Able to Have Smartphone Encryption?

Sharon Fisher
Apple, Encryption, Google

Now that Apple and Google have announced that they will incorporate encryption in smartphones by default, the question is how long law-abiding Americans will be allowed to continue to have encryption at all.

In case you missed it, Apple announced on September 17 that future editions of the iPhone would have encryption turned on by default in a way that no longer allows Apple to have access to encrypted data. “Apple’s new move is interesting and important because it’s an example of a company saying they no longer want to be in the surveillance business — a business they likely never intended to be in and want to get out of as quickly as possible,” writes Chris Soghoian, Principal Technologist and Senior Policy Analyst for the American Civil Liberties Union’s Speech, Privacy, and Technology Project. “Rather than telling the government that they no longer want to help the government, they re-architected iOS so they are unable to help the government.”  The following day, Google announced that future versions of the Android smartphone operating system would have encryption turned on by default as well.

Predictably, the FBI and law enforcement had kittens. “The notion that someone would market a closet that could never be opened – even if it involves a case involving a child kidnapper and a court order – to me does not make any sense,” said FBI director James Comey. He also went on to invoke the notion of the terrorism that could surely befall the U.S. when this happens. “Two big tech providers are essentially creating sanctuary for people who are going to do harm,” Ron Hosko, a former assistant director of the FBI’s criminal investigative division, told Marketplace. And pulling out the big guns: “Apple will become the phone of choice for the pedophile,” John J. Escalante, chief of detectives for Chicago’s police department, told the Washington Post. “The average pedophile at this point is probably thinking, I’ve got to get an Apple phone.”

Yep, terrorism, kidnapping, and pedophiles. They got the trifecta.

Company executives at Apple and Google told the New York Times that the government had essentially brought this on itself with incidents like the Snowden revelations, and that it was increasingly difficult for American companies to compete overseas given the perception that the U.S. government had its fingers in everything.

“The head of the FBI and his fellow fear-mongerers are still much more concerned with making sure they retain control over your privacy, rather than protecting everyone’s cybersecurity,” writes Trevor Timm, in the U.K. paper The Guardian, after offering a line-by-line critique of Comey’s statement. Security experts pointed out that the government still had many other options by which it could legally request access to people’s electronic data, that the FBI didn’t cite any examples of cases where encryption would have prevented them from solving a case, and that one case cited by Hosko turned out to be irrelevant.

So the next question becomes, at what point might the federal government attempt to outlaw encryption again? Or mandate a back door?

In case you were born sometime after MTV, at one point in time, anything more powerful than 40-bit encryption was actually classified as a munition – you know, like bombs and missiles — and illegal to export from the U.S., and not all that easy to get hold of even inside the U.S. In fact, a guy named Phil Zimmermann got himself into a peck of trouble when he developed Pretty Good Privacy (PGP), intended to be Everyman’s data encryption. While it was fine inside the U.S., copies of it surfaced internationally, and for several years it looked like Zimmermann might face charges, which made him a cause celebre in the computer community.

In 1993, the Bill Clinton White House went further and proposed the Clipper Chip, an encryption system that included a back door so that law enforcement organizations could still read any data encrypted by the device. Which, of course, they’d only use if you were a bad guy. But by 1996, partly due to the enormous wave of protest against the notion — and partly due to technical issues, such as bugs that were found in it (by a guy named Matt Blaze, who’s still around these days, commenting on the Apple/Google encryption flap) — the government had dropped the project. At the same time, the Clinton White House relaxed the rules on greater-than-40-bit encryption.

These days, encryption is readily available, but generally you have to know about it and how to turn it on. What Apple and Google are doing is selling devices with it already turned on — and, in response to the increasing number of requests from the government for user data, they will no longer even have access to the user’s data.

(This does, of course, mean that if you lose your encryption key, you’re hosed.)

So what other effects can we expect from the Apple/Google decision?

  • Courts are still trying to figure out whether an encryption key is like a key or a combination to a safe — something you have or something you know — so they can decide whether you have to give it up. So law enforcement organizations are still taking people to court to force them to reveal encryption keys, and sometimes they win. Conceivably, with encryption turned on by default, this could happen a lot more often.
  • Having encryption be the default could also eliminate the “why are you encrypting it if you don’t have anything to hide?” presumption.
  • Of course, bad guys have pretty much figured encryption out, even when it’s not the default. To be blunt, Apple and Google’s actions simply mean that regular people will have the same capabilities as the bad guys. And in an era where companies and individuals alike are regularly losing laptops, disk drives, and smartphones with personal data in them — and then getting fined for losing the data and having it not encrypted in the first place — having encryption as the default simply makes sense.

So what would happen if the government were to outlaw encryption, or mandate a back door? The Electronic Frontier Foundation, which lives for this sort of thing, has a nice long list of possible repercussions.

Realistically, though, would outlawing encryption even be practical in this day and age? Look at it this way — if encryption were made illegal, all the personal data on all the devices that get lost or stolen would then be accessible. It would make the Target incident look like a picnic. (The ACLU’s Soghoian pointed out that in 2011, the FBI was encouraging people to encrypt their data to keep it out of the hands of criminals.)

And everyone who believes that bad guys wouldn’t continue to be able to use encryption or would have to have a back door to their communications, please poke out your right eye. In the same way that bad guys continue to get access to illegal firearms today, bad guys would still get access to encryption, one way or another. Sorry, FBI, but that genie is out of the bottle.

It isn’t clear whether the government is going to try to outlaw encryption again, or try to mandate a law enforcement back door. There is some talk about Congress enacting a law, but due to the Edward Snowden revelations, few Congressional representatives want to touch it, according to Bloomberg Businessweek. Still, it’s something we need to watch out for — but it looks like computer vendors are increasingly unlikely to help, according to Iain Thomson in The Register. “It’s unlikely Apple or Google is going to bow down to the wishes of government and install backdoors in their own products,” he writes. “This would be disastrous to sales if found out, and there are increasing signs that the tech sector is gearing up for a fight over the issue.”


September 29, 2014  12:03 AM

Nobody Puts EMC In the Corner

Sharon Fisher
Cisco, Dell, EMC, HP, Oracle, VMware

You may recall that an activist investor called Elliott Management Corp. bought a $1 billion chunk of EMC earlier this summer — about two percent, big enough that it could start throwing its weight around and make suggestions about how EMC could make even more money for it, such as by selling VMware. EMC CEO and chairman Joe Tucci — who’s scheduled to retire in February for the nth time — said he’d meet with the firm but wouldn’t make any promises.

Now it’s starting to look as though the EMC we all know and love may not survive Tucci’s departure.

As nearly everyone suspected, it all started out with Tucci saying he wouldn’t sell VMware, of which EMC owns 80 percent. Why exactly he’s so enamored of the virtualization company isn’t clear, other than the fact that it in turn comprises up to 75 percent of EMC’s value these days.

Instead, it’s looking like Tucci would rather sell EMC itself — or, at least, break it up and sell some of the parts. In the process, the storage company is being partnered (or, as my teenage daughter would say, “shipped,” as in “relationship”) with just about every other major computer vendor out there. And some of them are willing and some of them aren’t.

Cisco. While there was talk about a potential merger with Cisco, Cisco Chief Executive Officer John Chambers himself said there was nothing to it, and if there had been anything, it would have been a year ago. 

HP. Because, after all, HP acquisitions always go so well. Look at Autonomy. And Compaq. Anyway, reportedly the notion of a “merger of equals” with HP went pretty far, with the notion that HP’s Meg Whitman would be CEO and Tucci would be chair, “but there were disagreements over price and the next layer of management,” according to Bloomberg. Incidentally, according to the Wall Street Journal, EMC and HP had been talking for the past year. The merged companies would be bigger than IBM, notes Business Insider. But it isn’t clear what HP would gain from the deal, according to Mike Wheatley of Silicon Angle. “Its storage organization is unlikely to relish the prospect of taking on EMC’s people and products, and EMC’s VNX and VMAX arrays compete with HP’s 3PAR products,” he writes. “In addition, HP’s various object, backup and virtual SAN products all overlap with equivalents from EMC.”

Dell. In addition to HP, EMC had also been talking to Dell about buying at least part of the company, the Journal added. But sites such as Silicon Angle are dubious due to the disparity in size between the two companies, as well as overlap between their product lines. “For example, Dell’s Compellent and EqualLogic storage arrays compete with EMC’s VNX line,” Wheatley writes.

Oracle. Forbes‘ Peter Cohan is pushing Oracle, noting that the company would likely pay more than HP and that Oracle doesn’t have much of a storage presence. This was actually rumored in 2010. But some other analysts now find it unlikely.

IBM or SAP. These companies were mentioned by Forbes as well, but IBM probably doesn’t have the budget for it and SAP probably wouldn’t make such a big non-European acquisition, writes Sarah Cohen.

This is all in the context of whether massive storage array companies like EMC even have a future in an era of cloud computing. Yes, mainframe company IBM survived the PC era, but most of the other mainframe companies didn’t, and it’s speculated that EMC might have similar trouble surviving the cloud era. “The question is why anyone would want EMC gear at all,” Jon Toigo, of IT consultancy Toigo Partners, told Silicon Angle. “With VMware and Microsoft pushing server-side direct attached storage topologies to replace centralized SAN storage, I think of the monolithic storage products of EMC as niche players in a shrinking market.”

Harsh.

In fact, Arik Hesseldahl of Re/code thinks that EMC has already run out of potential suitors, and nothing is going to happen at this point. Bloomberg thinks that EMC should be acquiring more companies, and helpfully provided a laundry list.

In any event, EMC stock has been hitting new one-year highs, which has to make Elliott happy.

Meanwhile, people are once again starting to speculate about which one of EMC’s deep bench is going to take over once Tucci finally makes up his mind to retire.

EMC has traditionally been a fairly drama-free company, but it looks like it’s making up for it now.


September 21, 2014  10:32 PM

Will Cloud Storage be the Pork Bellies of the Future?

Sharon Fisher Sharon Fisher Profile: Sharon Fisher
Storage

It’s said that one of the major advantages of cloud storage is that you can add storage on the fly as you need it. But what if it went further? What if cloud storage was even more disintermediated than it is today?

An electric utility, for example, typically offers power generated several different ways, with each way having a particular cost associated with generating it and a certain amount of time it takes to crank it up and shut it down again. If the utility suddenly needs a lot of power, it can buy power – typically more expensive – on the spot market, but it ensures that the utility doesn’t have a brownout or a blackout. Having a utility also means that individuals and businesses don’t have to worry about generating their own power and making sure they have enough when they need it; they just trust the utility to provide it.

So what if cloud storage was like that? What if you just used it as you needed it, didn’t buy it and hold it, and you contracted with a provider who might get it from Amazon, Microsoft, or some other vendor at any moment, depending on who had it available at a particular time?
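As a thought experiment, the broker side of that model is simple enough to sketch: poll each provider’s current spot price and route the request to the cheapest one. Here’s a minimal Python sketch; the provider names, prices, and function names are all made up for illustration, not any real API:

```python
# Hypothetical spot-market broker: route each storage request to
# whichever cloud provider is cheapest at this moment.
# Provider names and prices are invented for illustration.

def cheapest_provider(spot_prices):
    """Return the (provider, price) pair with the lowest spot price.

    spot_prices: dict mapping provider name -> current $/GB-month.
    """
    return min(spot_prices.items(), key=lambda kv: kv[1])

# A snapshot of (fictional) spot prices at one moment in time.
quotes = {"Amazon": 0.030, "Microsoft": 0.028, "OtherVendor": 0.033}

provider, price = cheapest_provider(quotes)
print(f"Routing storage to {provider} at ${price:.3f}/GB-month")
```

The point of the utility analogy is that the customer contracts with the broker and never needs to care which name comes back from that lookup.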

That’s a notion being discussed in connection with a couple of recent announcements. First is the announcement by Accellion that its kiteworks content connectors would now include Box and Dropbox, as well as Google Drive and Microsoft OneDrive. Kiteworks reportedly provides the management capabilities, such as providing the same interface regardless of which cloud service is being used, as well as security settings and access control. Second is the announcement that cloud company 6Fusion had signed a deal with the Chicago Mercantile Exchange, after originally signing a non-binding agreement last fall, and is expected to offer a beta product later this year.

“If all works out, the deal will mean that buyers and sellers of cloud computing services can do business on a spot exchange and, in a few years, trade derivatives too,” writes Jeff John Roberts at GigaOm. “The exchange will be a place to buy hours of ‘WAC,’ a term invented by 6Fusion that stands for Workload Allocation Cube. The idea behind the WAC is to create a standard unit of cloud computing infrastructure that can be bought and sold by the thousands.”

Basically, 6Fusion hopes that the WAC will become the cloud storage equivalent of the watt of power or the barrel of oil, Roberts writes. That is, of course, predicated on whether the rest of the industry accepts it, he warns.

Eventually, users and organizations might not even know which company is providing their storage, in the same way that we aren’t typically aware of whether our power is coming from hydroelectric, natural gas, coal, solar, or other sources – which will be easier with developments like Accellion’s, which provide the same interface to multiple providers’ storage. Moreover, because 6Fusion is working with the commodities markets, people could invest in “storage futures,” in the same way that they buy pork bellies now.

“The IT infrastructure of a company like JP Morgan could soon consist of private cloud servers for sensitive data, supplemented by public cloud supplies purchased from an ever-changing roster of third party cloud computing providers,” Roberts continues. “At the same time, such purchases of cloud computing ‘by the bushel’ would also mean lower prices as traders, rather than vendors, start to set the price of key ingredients of IT infrastructure.”

6Fusion itself has had a couple of interesting blog posts on the concept since the agreement with the mercantile exchange was first announced in April.

The concept isn’t exactly new; Forbes points out that such “spot markets” have been discussed and tried before, and have failed. Even 6Fusion itself had been talking about the notion publicly since last spring.

Just think – if they expand it to compute power itself, they could reinvent timesharing.


September 16, 2014  1:22 PM

‘Size Matters,’ Storage Vendors Say

Sharon Fisher Sharon Fisher Profile: Sharon Fisher
Hitachi, SanDisk, Storage, western digital

Okay, it’s official: Storage vendors are all a big bunch of size queens.

Just a couple of weeks after Seagate announced an 8TB drive, Western Digital has announced a 10TB drive. And if you prefer an SD card? SanDisk has just announced a 512GB card. That’s half a TB.

Grunt.

(This is probably a good time to reminisce about how my first computer had a hard disk as an add-on that was the size of the computer itself and cost the same amount — for 10MB. Yes, I’m old, whippersnapper, and get off my lawn.)

The Western Digital drive, like the Seagate drive, uses shingled magnetic recording technology, which puts more data in the same space, though it is slower. In fact, Western Digital comes right out and says the drive is mainly intended for “cold storage” facilities.

That means “slow,” because you put stuff in cold storage that you don’t need all the time, and so you spin down the drives rather than keep them running all the time because it saves energy. So every time you retrieve something from cold storage, you have to go kick the drive to start it up again. It’s like keeping the beer in the fridge out in the garage. You can put a lot more beer out there, but you have to traipse out to the garage every time you want a beer.

There’s one big difference between the 8TB Seagate and the Western Digital Ultrastar He10 — well, besides the size. The Western Digital 10TB drive uses the helium technology it first announced last November for its 6TB drive, while the Seagate drive doesn’t. (Hence the He in the product name.)

We don’t know how much it’s going to cost. Western Digital claimed it would have the lowest cost per gigabyte in the industry, but didn’t release actual prices. Some sources, such as ExtremeTech, were dubious about this. That said, the helium-filled 6TB drives cost from $449.99 to $899.99, depending on the source, compared with commodity-level 6TB drives costing $275, according to Backblaze.

(Can I just say how awesome it is to speak of “commodity-level 6TB drives.” What a world we live in.)

Similarly, the 512GB SanDisk Extreme PRO SDXC UHS-I memory card comes at a premium price: $800. (Though you can get it for $729 some places.) Considering that you can get 32GB cards for 50 cents a gigabyte, that may seem a little high. But the card is aimed at high-end audio-video people for things like recording 4K video, so they don’t have to swap storage in and out so often.
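The per-gigabyte arithmetic is worth spelling out. A quick back-of-the-envelope in Python, using the prices quoted in this post:

```python
# Back-of-the-envelope cost-per-gigabyte comparison, using the
# prices mentioned in this post.

def cost_per_gb(price_dollars, capacity_gb):
    """Price per gigabyte for a storage device."""
    return price_dollars / capacity_gb

# 512GB SanDisk Extreme PRO card at its $800 list price...
card = cost_per_gb(800, 512)
# ...versus a commodity 32GB card at roughly 50 cents per gigabyte.
commodity = 0.50

print(f"512GB card: ${card:.2f}/GB vs commodity ${commodity:.2f}/GB")
print(f"Premium: about {card / commodity:.1f}x")
```

That works out to roughly three times the commodity price per gigabyte — the premium you pay for not swapping cards mid-shoot.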

This is not intended for Dad recording the kids opening presents at Christmas, is what I’m saying.

But at some point we do have to raise the question: Just how big should a single storage device be? I’m not talking about “10MB was good enough for me and it should be good enough for you kids!” I’m thinking about things like reliability. How do you back the things up? And, more to the point, how much data are you willing to lose? When’s the last time you accidentally laundered an SD card you left in your pants pocket?

Lossage is particularly an issue for the helium-filled drives. Yes, yes, we know they’re guaranteed for five years, but getting your $499 back is going to be cold comfort if in the process you’ve lost $10,000 worth of data. The 6TB devices have been out for only a few months, and it’s not clear just what the MTBF rate is going to be for them. Even the MIT Technology Review said last year that it would probably take a year before anyone had any idea of how well they’d last.

Sadly, the world of disk drive testing isn’t what it used to be. As we’ve mentioned before, weirdly, some of the best disk drive testing is done by Backblaze, but since these helium-filled drives aren’t commodity items it’s unlikely that Backblaze is testing them yet. We can hope that CNET or ZDNet happened to buy a few of them, set them up in a corner somewhere, and is preparing to give us a great review in a couple of months about just how long we can expect the helium to last, but personally I’m dubious.

One can argue that if the 10TB drives are used in cold storage, you don’t really have to worry that much, because chances are that archived data is stored in multiple places anyway. Well, perhaps, but that raises the question of what the point is. Helium-filled drives cost almost twice as much as commodity non-helium drives, so will a helium-filled drive that’s almost twice as big cost almost four times as much? If so, at what point does the reduction in maintenance and infrastructure make it worthwhile? ExtremeTech points out that according to Western Digital, the power consumption of the drives is 23 percent less than its air-filled drives, and if you have a big enough data center, that would certainly add up, but you’d really want to know how long they’re going to last.
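To get a feel for whether that 23 percent actually adds up, here’s a rough sketch. The 7-watt baseline draw and the $0.10/kWh electricity rate are assumptions for illustration, not Western Digital figures:

```python
# Rough annual power-cost savings per helium-filled drive, based on
# Western Digital's claimed 23 percent power reduction. The 7W
# baseline draw and $0.10/kWh rate are assumptions, not vendor data.

HOURS_PER_YEAR = 24 * 365

def annual_power_cost(watts, dollars_per_kwh=0.10):
    """Yearly electricity cost for a device drawing `watts` continuously."""
    return watts / 1000 * HOURS_PER_YEAR * dollars_per_kwh

air = annual_power_cost(7.0)            # assumed air-filled drive draw
helium = annual_power_cost(7.0 * 0.77)  # 23 percent less

print(f"Per-drive annual savings: ${air - helium:.2f}")
print(f"Across 10,000 drives: ${(air - helium) * 10000:,.0f}")
```

On those assumptions the per-drive savings are only a dollar or two a year, so the power argument only starts to matter at serious data-center scale — which is, of course, exactly where cold storage lives.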

Perhaps the storage vendors are just hoping that storage managers are size queens too.


August 31, 2014  10:54 PM

What We Can Learn From the ‘ISIS Terror Laptop of Doom’

Sharon Fisher Sharon Fisher Profile: Sharon Fisher
Encryption, privacy, Security

Our government loses so many laptops, it’s kind of nice when the tables are turned once in a while. That said, a lot of people are talking about what’s purported to be a laptop belonging to the Sunni jihadist group ISIS.

You may recall in 2011, when Osama bin Laden’s hideout was raided and he was killed, that Navy SEALs retrieved several computers and storage devices. This is similar, except the laptop was captured in January by a moderate rebel group in northern Syria, from an ISIS hideout, according to Foreign Policy, which broke the story.

In fact, according to reporter Harald Doornbos, the laptop — which he dubbed the “terror laptop of doom” — was conveniently neither password-protected nor encrypted, and the material it contained was only nominally protected. “Buried in the ‘hidden files’ section of the computer were 146 gigabytes of material, containing a total of 35,347 files in 2,367 folders,” he writes. And what was in those files? Besides “videos of Osama bin Laden, manuals on how to make bombs, instructions for stealing cars, and lessons on how to use disguises,” it also contained what is said to be detailed information on how to weaponize bubonic plague for biological warfare.

Eek.

Opinions vary on the veracity of the laptop (incidentally, apparently the laptop of choice for biological terrorists is Dell, and it’s black). Some found it evidence that the U.S. should attack, and some — damn few, sadly — found that there are aspects of the story that lack credibility. Such as, really? You capture an enemy laptop in January and don’t look at it until months later, in front of a reporter?

Similarly, the reporter was criticized for looking at the laptop in the first place, as opposed to having it examined forensically, because running it would modify dates and other information that could be useful in determining its veracity, noted one online commenter, who went on to point out that similar “magic laptops” justifying conspiracy theories had been found in Colombia in 2008.

One of the less hysterical reactions — though it didn’t address the veracity issue — is from Outbreak News Today. Specializing in covering infectious diseases, the publication not only points out the long history of attempts to use plague in biological warfare — dating back to the 14th century — but also discusses the difficulties of doing so.

Other commenters noted that none of the information was particularly a smoking gun, that such information is readily available to anybody (Anarchist Cookbook, anyone?), and that perhaps the laptop was a plant intended to encourage a war. “How convenient!” posted one. “Just as the US have troubles coming up with a reasonable justification in international law for air strike operations, a laptop – luckily the one with all the plans – comes up.”

Perhaps we could call it the Zimmermann Laptop, after the Zimmermann Telegram — originally thought to be fake but later demonstrated to be true — which helped propel the U.S. entry into World War I.

