With this being the centennial of the start of World War I, and with what’s going on in the storage industry lately, it isn’t surprising if you’re also being reminded of the decline and fall of the Ottoman Empire.
Well, okay. Maybe only if you’re a history buff.
In case you were dozing in the back row during world history class in tenth grade (or, if, like me, your history teacher was actually a repurposed Latin teacher and you spent all but the last two weeks of the school year on Greece and Rome, meaning you covered a millennium a day those last two weeks), the Ottoman Empire lasted in one way, shape, or form for more than 500 years. It spanned three continents — Europe, Asia, and Africa — and contained 29 provinces and many other states. But it fell during World War I, and nations such as Britain and France carved up the pieces willy-nilly in ways that made sense to them, without paying much attention to cultural boundaries or what the people in those states might actually want to do. (In fact, some of the current conflict in the Middle East dates directly back to those actions. But I digress.)
Any of this ringing a bell yet?
So at this point, in the storage and e-discovery industry that this blog covers, we have not one but three Ottoman empires potentially in the process of dissolving, with a bunch of people on both the outside and the inside watching and speculating about how the pieces might all eventually fit together.
We’ve already talked about EMC, which is under pressure from shareholders to break itself up so the pieces can be worth more — a case of the whole being worth less than the sum of the parts. It isn’t clear yet exactly what’s going to happen with EMC, though there’s been plenty of speculation. (To further complicate things, EMC and Cisco are breaking up the partnership behind VCE, their converged infrastructure joint venture, with EMC taking control of it. More pieces to juggle.)
In the meantime, both HP and Symantec have announced their intentions to split in two. HP’s pieces are going to be one for its printer and PC business, and one for its corporate computer hardware and software business. Symantec’s pieces are going to be one for its security management products and one for its information management products.
And while the Britains and the Frances of the computer industry are arguing over the bigger pieces and how they will best fit together, other people — especially in e-discovery — are talking about some of the other pieces that haven’t gotten as much love lately and how this could all work out for them.
The HP split, for example, could result in new support for Autonomy, which HP bought for what everyone — including HP — agrees was way too much money. Not only was it not great for HP, but it hasn’t been too great for the Autonomy people either, who are kinda HP’s red-headed stepchildren.
The HP split, in fact, is “probably good news for long-suffering customers of the former Autonomy products,” writes Tony Byrne of Real Story Group. “You know why? Because things couldn’t get much worse for them.”
Meanwhile, Gartner pointed out this summer in its e-discovery Magic Quadrant that although it still positioned Symantec in the Leaders quadrant, its Clearwell product — one of the first big acquisitions in the 2011 e-discovery land grab — had languished under Symantec’s control. Or, as Gartner puts it, “The innovation pipeline for the eDiscovery Platform has slowed during Symantec’s acquisition and integration of Clearwell Systems, resulting in the product’s lack of growth and new releases.”
(Keep in mind that Autonomy and Clearwell had both individually been listed in the Leaders quadrant in the original 2011 e-discovery Magic Quadrant. Almost makes you wish that some company that really had a great vision for e-discovery would buy both pieces, integrate them, and really do it right.)
At the same time, some people are looking at some of the less-loved, neglected pieces of EMC, such as Documentum, and thinking that maybe there’s some way these could get involved, too.
“[Documentum] doesn’t seem to play a role in EMC’s survival,” writes Virginia Backaitis in CMSWire, before going on to suggest that HP buy it and integrate it with Autonomy. “In EMC’s quarterly call with investors last week, neither EMC CEO Joe Tucci nor his lieutenants (David Goulden, CEO of EMC Information Infrastructure and CFO Zane Rowe) uttered the name of its spawn at all.”
It remains to be seen how the various pieces of all three companies will combine (hopefully not in some e-discovery version of Iraq, with different factions battling for control). If nothing else, it could mean that next year’s Gartner e-discovery Magic Quadrant, which has been pretty much a snore the last couple of years, has the potential to be a lot more interesting.
Periodically, people take the new capacity of storage media — not to mention the ever-increasing size of motor vehicles — and use it to recalculate that lovely statistic, “what is the bandwidth of a station wagon full of tapes speeding down the highway?” So now we have a new one — how much data goes back and forth to major cities, especially via public transit?
We now have that data courtesy of Mozy, a cloud backup service that describes itself as the “most trusted.” (Exactly how they figured out it was the “most trusted,” they don’t say.) According to the company, when you add up laptops, smartphones, personal hard drives, thumb drives, and so on, you end up with a pretty horrendous amount of data leaving the office every day:
- The average commuter takes 470GB of company data home with them at the end of every day — 2,500 times the amount of data they’ll move across the Internet in the same timeframe
- Every day, 1.4 exabytes of data moves through New York City alone — that’s more data than will cross the entire Internet in a day
- As much as 33.5PB of data will travel over the Oakland Bay Bridge every day
- As much as 49PB of data will travel through the Lincoln Tunnel each day
- Up to 328PB of data travels in the London Tube network every day
- Up to 69PB of data leaves Munich’s Hauptbahnhof on a daily basis
- The Paris Metro carries as much as 138PB of data every day
(There are also some really cool maps showing where the data is coming from.)
There is, however, one flaw in the Mozy description, which is that it refers to this phenomenon as a “data drain.” That’s not really accurate. A “brain drain,” for example, typically refers to people leaving an area; their brains are therefore gone from the area. But this data isn’t actually leaving the area in the sense of being gone. Instead, the data is copied. That leads to its own issues, such as version control, security, and simply taking up much more storage space than is really required. (Good thing storage is so cheap these days, amirite?)
And certainly one could quibble with the figure. Mozy doesn’t explain the methodology, but presumably it’s adding up the storage in each of the devices that people carry back and forth. And who knows, really, how much of it is actually corporate data, and how much of it is cat pictures? That said, it’s certainly a fun back-of-the-envelope statistic to calculate.
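For what it’s worth, the arithmetic behind a figure like Mozy’s is easy to sketch. Everything below (the device mix, the capacities, the ridership) is an illustrative assumption, since Mozy doesn’t publish its methodology:

```python
# Back-of-the-envelope sketch of how a "data commute" figure like Mozy's
# might be derived. All numbers here are illustrative assumptions, not
# Mozy's actual methodology (which the company doesn't disclose).

# Assumed average storage carried per commuter, in GB
DEVICE_CAPACITY_GB = {
    "laptop": 320,
    "smartphone": 32,
    "external_drive": 100,
    "thumb_drive": 16,
}

def data_per_commuter_gb(devices=DEVICE_CAPACITY_GB):
    """Total storage one commuter carries, summed across devices."""
    return sum(devices.values())

def city_data_pb(commuters, per_commuter_gb=None):
    """Total data moving through a city per day, in petabytes."""
    if per_commuter_gb is None:
        per_commuter_gb = data_per_commuter_gb()
    return commuters * per_commuter_gb / 1_000_000  # 1 PB = 1,000,000 GB

print(f"Per commuter: {data_per_commuter_gb()} GB")
# A hypothetical ridership figure, purely for illustration:
print(f"1 million commuters: {city_data_pb(1_000_000):.0f} PB/day")
```

With these made-up capacities, one commuter carries 468GB, in the same ballpark as Mozy’s 470GB figure; multiply by ridership and the petabytes pile up fast.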
Anyway, it’s the security issue that is particularly catching Mozy’s interest. “With 41.33 percent of people having lost a device that stores data in the past 12 months, huge amounts of business data is put at risk every rush hour,” the company writes. “There isn’t a CIO we know who would risk sending massive volumes of data over the internet without protecting it first.”
Well, we have to say, Mozy must not know very many CIOs. That aside, the company has a point: with all the evidence we have of companies and governments behaving badly with personally identifiable data, there’s an awful lot of data at risk every day.
“A thief holding up a New York subway car at rush-hour capacity could walk away with over 100TB of data,” the company notes. (Which actually sounds like an interesting premise for a movie. Starring Denzel Washington? Jeff Goldblum? Sandra Bullock?)
This commuting data is vulnerable in two ways, Mozy notes. First, bad guys could get access to the data. Second, the person with whom the data is riding could lose access to the data, if that data is the only copy. “It’s also the most-critical data; the edits to the contract that we’ve just worked through in today’s meeting, the presentation that we’re giving tomorrow morning, the tax forms that you’re halfway through filling in,” the company writes. “Losing this data can have an immediate impact on a company’s success.”
Mozy, however, doesn’t go far enough. Let’s go to the root cause: Why are people taking so much data home with them? And if this is something we don’t want to have happen, what is the alternative? There’s already been any amount of hand-wringing over the notion of people setting up Dropbox and similar accounts to make copies of corporate data. Is carrying the data on a device more or less secure, or desirable, than saving it to a public cloud service?
Either way, device or cloud, it boils down to the same issue: People are making copies of the corporate data, by and large, because they feel they need to do that to do their jobs. So either there isn’t a reliable way for them to gain access to the corporate data they need any other way, or, if there is, they don’t know about it.
The point being, if people feel they have to do this to do their jobs, then you need to give them a better way. Simply issuing an edict that Thou Shalt Not is not going to work, even if you put teeth in it. Because, ultimately, they’re not as afraid of you as they are of their boss.
Depending on whom you ask, either everything is fine or we should be locking up the country to protect us all from Ebola. And frankly, at this point we don’t really know for sure. But there is something you can do: Make sure your company is prepared.
Whether it’s Ebola, enterovirus 68, or the flu, it’s possible for an illness to affect your company and the world it operates in. In a sense, preparing for a pandemic of any sort is no different from preparing for any other natural disaster, whether it’s a flood, hurricane, or tornado. (Except that a pandemic can last for weeks or months, while natural disasters are typically over in a few days.)
So, think about the same sorts of preparations you’d make for, say, a blizzard, and adapt them a bit.
- Make sure staff can work from home, whether it’s because they can’t use the roads or because there’s a quarantine. Do they have the access they need? Computer equipment? Passwords? Is there a virtual private network set up to help protect the company when people are dialing in over a public network? Check with everyone and get these things set up now to make sure they’ll be available at a moment’s notice should you need them. And have everyone test their systems periodically to make sure they can still get online.
- Make sure that there isn’t any single point of failure in the daily processes. Are there passwords or procedures that only one person knows? What happens if that person gets sick or can’t make it into the office, for whatever reason? The disaster recovery manual — you have one, right? — also needs to be accessible from a remote location.
- While there’s not really an Ebola vaccine yet, there are vaccines for other illnesses with the potential to become pandemics, such as the flu. To ensure that employees get vaccinated, arrange to have someone from the health department come in to administer vaccines — and if necessary, have the company pay for it. The cost is minimal compared with the cost of the lost productivity if employees get sick.
- It never hurts to stock up on hand sanitizer and alcohol wipes, and talk with staff — including the custodial staff — about how to keep from spreading germs. And while you’re at it, make sure people know that they should stay home if they or a family member is sick. If your company doesn’t currently offer paid sick leave, it might be a good time to add it. Again, think you can’t afford it? How well could you afford having half the office sick?
Hopefully, none of these plans will be needed. But in case they are, you’ll want to be ready, and preparing now gives you something practical to do in the meantime. As the saying goes, prevention is the best medicine.
Now that Apple and Google have announced that they will incorporate encryption in smartphones by default, the question is how long law-abiding Americans will be allowed to continue to have encryption at all.
In case you missed it, Apple announced on September 17 that future editions of the iPhone would have encryption turned on by default in a way that no longer allows Apple to have access to encrypted data. “Apple’s new move is interesting and important because it’s an example of a company saying they no longer want to be in the surveillance business — a business they likely never intended to be in and want to get out of as quickly as possible,” writes Chris Soghoian, Principal Technologist and Senior Policy Analyst for the American Civil Liberties Union’s Speech, Privacy, and Technology Project. “Rather than telling the government that they no longer want to help the government, they re-architected iOS so they are unable to help the government.” The following day, Google announced that future versions of the Android smartphone operating system would have encryption turned on by default as well.
Predictably, the FBI and law enforcement had kittens. “The notion that someone would market a closet that could never be opened – even if it involves a case involving a child kidnapper and a court order – to me does not make any sense,” said FBI director James Comey. He also went on to invoke the notion of the terrorism that could surely befall the U.S. when this happens. “Two big tech providers are essentially creating sanctuary for people who are going to do harm,” agreed Ron Hosko, a former assistant director of the FBI’s criminal investigative division, speaking to Marketplace. And pulling out the big guns: “Apple will become the phone of choice for the pedophile,” John J. Escalante, chief of detectives for Chicago’s police department, told the Washington Post. “The average pedophile at this point is probably thinking, I’ve got to get an Apple phone.”
Yep, terrorism, kidnapping, and pedophiles. They got the trifecta.
Company executives at Apple and Google told the New York Times that the government had essentially brought this on itself with incidents like the Edward Snowden revelations, and that it was increasingly difficult for American companies to compete overseas given the perception that the U.S. government had its fingers in everything.
“The head of the FBI and his fellow fear-mongerers are still much more concerned with making sure they retain control over your privacy, rather than protecting everyone’s cybersecurity,” writes Trevor Timm in the U.K. paper The Guardian, after offering a line-by-line critique of Comey’s statement. Security experts pointed out that the government still had many other options by which it could legally request access to people’s electronic data, that the FBI didn’t cite any examples of cases where encryption would have prevented them from solving a case, and that one case cited by Hosko turned out to be irrelevant.
So the next question becomes, at what point might the federal government attempt to outlaw encryption again? Or mandate a back door?
In case you were born sometime after MTV, at one point in time, anything more powerful than 40-bit encryption was actually classified as a munition — you know, like bombs and missiles — and illegal to export from the U.S., and not all that easy to get hold of even inside the U.S. In fact, a guy named Philip Zimmermann got himself into a peck of trouble when he developed Pretty Good Privacy (PGP), intended to be Everyman’s data encryption. While it was fine inside the U.S., copies of it surfaced internationally, and for several years it looked like Zimmermann might face charges, which made him a cause célèbre in the computer community.
In 1993, the Bill Clinton White House went further and proposed the Clipper Chip, an encryption system that included a back door so that law enforcement organizations could still read any data encrypted by the device. Which, of course, they’d only use if you were a bad guy. But by 1996, partly due to the enormous wave of protest against the notion — and partly due to technical issues, such as bugs that were found in it (by a guy named Matt Blaze, who’s still around these days, commenting on the Apple/Google encryption flap) — the government had dropped the project. At the same time, the Clinton White House relaxed the rules on greater than 40-bit encryption.
These days, encryption is readily available, but generally you have to know about it and how to turn it on. What Apple and Google are doing is selling devices with it already turned on — and, in response to the increasing number of requests from the government for user data, they will no longer even have access to the user’s data.
(This does, of course, mean that if you lose your encryption key, you’re hosed.)
So what other effects can we expect from the Apple/Google decision?
- Courts are still trying to figure out whether an encryption key is like a key or a combination to a safe — something you have or something you know — so they can decide whether you have to give it up. So law enforcement organizations are still taking people to court to force them to reveal encryption keys, and sometimes they win. Conceivably, with encryption turned on by default, this could happen a lot more often.
- Having encryption be the default could also eliminate the “why are you encrypting it if you don’t have anything to hide?” presumption.
- Of course, bad guys have pretty much figured encryption out, even when it’s not the default. To be blunt, Apple and Google’s actions simply mean that regular people will have the same capabilities as the bad guys. And in an era where companies and individuals alike are regularly losing laptops, disk drives, and smartphones with personal data in them — and then getting fined for losing the data and having it not encrypted in the first place — having encryption as the default simply makes sense.
So what would happen if the government were to outlaw encryption, or mandate a back door? The Electronic Frontier Foundation, which lives for this sort of thing, has a nice long list of possible repercussions.
Realistically, though, would outlawing encryption even be practical in this day and age? Look at it this way — if encryption were made illegal, that would mean that all the personal data on all the devices that get lost or stolen would then be accessible. It would make the Target incident look like a picnic. (The ACLU’s Soghoian pointed out that in 2011, the FBI was encouraging people to encrypt their data to keep it out of the hands of criminals.)
And everyone who believes that bad guys wouldn’t continue to be able to use encryption or would have to have a back door to their communications, please poke out your right eye. In the same way that bad guys continue to get access to illegal firearms today, bad guys would still get access to encryption, one way or another. Sorry, FBI, but that genie is out of the bottle.
It isn’t clear whether the government is going to try to outlaw encryption again, or try to mandate a law enforcement back door. There is some talk about Congress enacting a law, but due to the Edward Snowden revelations, few Congressional representatives want to touch it, according to Bloomberg Businessweek. Still, it’s something we need to watch out for — but it looks like computer vendors are increasingly unlikely to help, according to Iain Thomson in the Register. “It’s unlikely Apple or Google is going to bow down to the wishes of government and install backdoors in their own products,” he writes. “This would be disastrous to sales if found out, and there are increasing signs that the tech sector is gearing up for a fight over the issue.”
You may recall that an activist investor called Elliott Management Corp. bought a $1 billion chunk of EMC earlier this summer — about two percent, big enough that it could start throwing its weight around and make suggestions about how EMC could make even more money for it, such as by selling VMware. EMC CEO and chairman Joe Tucci — who’s scheduled to retire in February for the nth time — said he’d meet with Elliott but wouldn’t make any promises.
Now it’s starting to look as though the EMC we all know and love may not survive Tucci’s departure.
As nearly everyone suspected, it all started out with Tucci saying he wouldn’t sell VMware, of which EMC owns 80 percent. Why exactly he’s so enamored of the virtualization company isn’t clear, other than the fact that it in turn accounts for up to 75 percent of EMC’s value these days.
Instead, it’s looking like Tucci would rather sell EMC itself — or, at least, break it up and sell some of the parts. In the process, the storage company is being partnered (or, as my teenage daughter would say, “shipped,” as in “relationship”) with just about every major other computer vendor out there. And some of them are willing and some of them aren’t.
Cisco. While there was talk about a potential merger with Cisco, Cisco Chief Executive Officer John Chambers himself said there was nothing to it, and if there had been anything, it would have been a year ago.
HP. Because, after all, HP acquisitions always go so well. Look at Autonomy. And Compaq. Anyway, reportedly the notion of a “merger of equals” with HP went pretty far, with the notion that HP’s Meg Whitman would be CEO and Tucci would be chair, “but there were disagreements over price and the next layer of management,” according to Bloomberg. Incidentally, according to the Wall Street Journal, EMC and HP had been talking for the past year. The merged companies would be bigger than IBM, notes Business Insider. But it isn’t clear what HP would gain from the deal, according to Mike Wheatley of Silicon Angle. “Its storage organization is unlikely to relish the prospect of taking on EMC’s people and products, and EMC’s VNX and VMAX arrays compete with HP’s 3PAR products,” he writes. “In addition, HP’s various object, backup and virtual SAN products all overlap with equivalents from EMC.”
Dell. In addition to HP, EMC had also been talking to Dell about buying at least part of the company, the Journal added. But sites such as Silicon Angle are dubious due to the disparity in size between the two companies, as well as overlap between their product lines. “For example, Dell’s Compellent and EqualLogic storage arrays compete with EMC’s VNX line,” Wheatley writes.
Oracle. Forbes‘ Peter Cohan is pushing Oracle, noting that the company would likely pay more than HP and that Oracle doesn’t have much of a storage presence. This was actually rumored in 2010. But some other analysts now find it unlikely.
IBM or SAP. These companies were also mentioned by Forbes, but IBM probably doesn’t have the budget for it and SAP probably wouldn’t make such a big non-European acquisition, writes Sarah Cohen.
This all takes place in the context of the degree to which massive storage array companies like EMC even have a future in an era of cloud computing. Yes, mainframe company IBM survived the PC era, but most of the other mainframe companies didn’t, and it’s speculated that EMC might have similar troubles surviving the cloud era. “The question is why anyone would want EMC gear at all,” Jon Toigo, of IT consultancy Toigo Partners, told Silicon Angle. “With VMware and Microsoft pushing server-side direct attached storage topologies to replace centralized SAN storage, I think of the monolithic storage products of EMC as niche players in a shrinking market.”
In fact, Arik Hesseldahl of Re/code thinks that EMC has already run out of potential suitors, and nothing is going to happen at this point. Bloomberg thinks that EMC should be acquiring more companies, and helpfully provided a laundry list.
In any event, EMC stock has been hitting new one-year highs, which has to make Elliott happy.
EMC has traditionally been a fairly drama-free company, but it looks like it’s making up for it now.
It’s said that one of the major advantages of cloud storage is that you can add storage on the fly as you need it. But what if it went further? What if cloud storage was even more disintermediated than it is today?
An electric utility, for example, typically offers power generated several different ways, with each way having a particular cost associated with generating it and a certain amount of time it takes to crank it up and shut it down again. If the utility suddenly needs a lot of power, it can buy power – typically more expensive – on the spot market, but it ensures that the utility doesn’t have a brownout or a blackout. Having a utility also means that individuals and businesses don’t have to worry about generating their own power and making sure they have enough when they need it; they just trust the utility to provide it.
So what if cloud storage was like that? What if you just used it as you needed it, didn’t buy it and hold it, and you contracted with a provider who might get it from Amazon, Microsoft, or some other vendor at any moment, depending on who had it available at a particular time?
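As a toy illustration of that utility model, a storage broker would simply buy from whichever provider quotes the cheapest spot price at the moment. This is a minimal sketch with entirely hypothetical provider names and prices, not any real exchange’s API:

```python
# Toy sketch of utility-style storage brokering: buy from whichever
# provider has the cheapest spot price right now. Providers and prices
# are entirely hypothetical; a real spot exchange would be far more
# involved (contracts, SLAs, settlement, and so on).

spot_prices_per_tb_hour = {  # hypothetical spot quotes, in $/TB-hour
    "ProviderA": 0.045,
    "ProviderB": 0.039,
    "ProviderC": 0.051,
}

def cheapest_provider(quotes):
    """Return the (provider, price) pair with the lowest spot price."""
    return min(quotes.items(), key=lambda kv: kv[1])

provider, price = cheapest_provider(spot_prices_per_tb_hour)
print(f"Buy from {provider} at ${price}/TB-hour")
```

The point of the analogy is that the buyer never cares which name comes back; the broker, like the power utility, just delivers capacity at the going rate.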
That’s a notion being discussed in connection with a couple of recent announcements. First is the announcement by Accellion that its kiteworks content connectors would now include Box and Dropbox, as well as Google Drive and Microsoft OneDrive. Kiteworks reportedly provides the management capabilities, such as providing the same interface regardless of which cloud service is being used, as well as security settings and access control. Second is the announcement that cloud company 6Fusion had signed a deal with the Chicago Mercantile Exchange, after originally signing a non-binding agreement last fall, and is expected to offer a beta product later this year.
“If all works out, the deal will mean that buyers and sellers of cloud computing services can do business on a spot exchange and, in a few years, trade derivatives too,” writes Jeff John Roberts at GigaOm. “The exchange will be a place to buy hours of ‘WAC,’ a term invented by 6Fusion that stands for Workload Allocation Cube. The idea behind the WAC is to create a standard unit of cloud computing infrastructure that can be bought and sold by the thousands.”
Basically, 6Fusion hopes that the WAC will become the cloud storage equivalent of the watt of power or the barrel of oil, Roberts writes. That is, of course, predicated on whether the rest of the industry accepts it, he warns.
Eventually, users and organizations might not even know what company is providing their storage, in the same way that we aren’t typically aware of whether our power is coming from hydroelectric, natural gas, coal, solar, or other sources – which will be easier with developments like Accellion’s, which provide the same interface to multiple providers’ storage. Moreover, because 6Fusion is working with the commodities markets, people could invest in “storage futures,” in the same way that they buy pork bellies now.
“The IT infrastructure of a company like JP Morgan could soon consist of private cloud servers for sensitive data, supplemented by public cloud supplies purchased from an ever-changing roster of third party cloud computing providers,” Roberts continues. “At the same time, such purchases of cloud computing ‘by the bushel’ would also mean lower prices as traders, rather than vendors, start to set the price of key ingredients of IT infrastructure.”
The concept isn’t exactly new; Forbes points out that such “spot markets” have been discussed and tried before, and failed. Even 6Fusion itself has been talking about the notion publicly since last spring.
Just think – if they expand it to compute power itself, they could reinvent timesharing.
Okay, it’s official: Storage vendors are all a big bunch of size queens.
(This is probably a good time to reminisce about how my first computer had a hard disk as an add-on that was the size of the computer itself and cost the same amount — for 10MB. Yes, I’m old, whippersnapper, and get off my lawn.)
Western Digital’s new 10TB drive, like Seagate’s recent 8TB drive, uses shingled magnetic recording technology, which puts more data in the same space though it is slower. In fact, Western Digital comes right out and says the drive is mainly intended for “cold storage” facilities.
That means “slow,” because you put stuff in cold storage that you don’t need all the time, and so you spin down the drives rather than keep them running all the time because it saves energy. So every time you retrieve something from cold storage, you have to go kick the drive to start it up again. It’s like keeping the beer in the fridge out in the garage. You can put a lot more beer out there, but you have to traipse out to the garage every time you want a beer.
There’s one big difference between the 8TB Seagate and the Western Digital Ultrastar He10 — well, besides the size. The Western Digital 10TB drive uses the helium technology it first announced last November for its 6TB drive, while the Seagate drive doesn’t. (Hence the He in the product name.)
We don’t know how much it’s going to cost. Western Digital claimed it would have the lowest cost per gigabyte in the industry, but didn’t release actual prices. Some sources, such as ExtremeTech, were dubious about this. That said, the helium-filled 6TB drives cost from $449.99 to $899.99, depending on the source, compared with commodity-level 6TB drives costing $275, according to Backblaze.
(Can I just say how awesome it is to speak of “commodity-level 6TB drives.” What a world we live in.)
Similarly, the 512GB SanDisk Extreme PRO SDXC UHS-I memory card comes at a premium price: $800. (Though you can get it for $729 some places.) Considering that you can get 32GB cards for 50 cents a gigabyte, that may seem a little high. But the card is aimed at high-end audio-video people for things like recording 4K video, so they don’t have to swap storage in and out so often.
This is not intended for Dad recording the kids opening presents at Christmas, is what I’m saying.
But at some point we do have to raise the question: Just how big should a single storage device be? I’m not talking about “10MB was good enough for me and it should be good enough for you kids!” I’m thinking about things like reliability. How do you back the things up? And, more to the point, how much data are you willing to lose? When’s the last time you accidentally laundered an SD card you left in your pants pocket?
Lossage is particularly an issue for the helium-filled drives. Yes, yes, we know they’re guaranteed for five years, but getting your $499 back is going to be cold comfort if in the process you’ve lost $10,000 worth of data. The 6TB devices have been out for only a few months, and it’s not clear just what the MTBF is going to be for them. Even the MIT Technology Review said last year that it would probably take a year before anyone had any idea of how well they’d last.
Sadly, the world of disk drive testing isn’t what it used to be. As we’ve mentioned before, weirdly, some of the best disk drive testing is done by Backblaze, but since these helium-filled drives aren’t commodity items it’s unlikely that Backblaze is testing them yet. We can hope that CNET or ZDNet happened to buy a few of them, set them up in a corner somewhere, and is preparing to give us a great review in a couple of months about just how long we can expect the helium to last, but personally I’m dubious.
One can argue that if the 10TB drives are used in cold storage, you don’t really have to worry that much, because chances are that archived data is stored in multiple places anyway. Well, perhaps, but that raises the question of what the point is. Helium-filled drives cost almost twice as much as commodity non-helium drives, so will a helium-filled drive that’s almost twice as big cost almost four times as much? If so, at what point does the reduction in maintenance and infrastructure make it worthwhile? ExtremeTech points out that according to Western Digital, the power consumption of the drives is 23 percent less than its air-filled drives, and if you have a big enough data center, that would certainly add up, but you’d really want to know how long they’re going to last.
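To put that 23 percent in perspective, here’s a rough back-of-the-envelope sketch. Only the 23 percent reduction comes from Western Digital’s claim; the drive wattage and electricity price are assumptions for illustration:

```python
# Rough sketch of the helium-vs-air power trade-off. Only the 23%
# reduction is Western Digital's claim; the wattage and electricity
# price below are illustrative assumptions.

AIR_DRIVE_WATTS = 9.0                               # assumed draw of an air-filled drive
HELIUM_DRIVE_WATTS = AIR_DRIVE_WATTS * (1 - 0.23)   # WD's claimed 23% savings
KWH_PRICE = 0.10                                    # assumed $/kWh
HOURS_PER_YEAR = 24 * 365

def annual_power_cost(watts, kwh_price=KWH_PRICE):
    """Electricity cost of running one drive continuously for a year."""
    return watts * HOURS_PER_YEAR / 1000 * kwh_price  # Wh -> kWh -> $

savings = annual_power_cost(AIR_DRIVE_WATTS) - annual_power_cost(HELIUM_DRIVE_WATTS)
print(f"Per-drive savings: ${savings:.2f}/year")
print(f"Across 10,000 drives: ${savings * 10_000:,.0f}/year")
```

Under these assumptions the per-drive savings are on the order of a couple of dollars a year, which only matters at data-center scale, and only if the drives actually last.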
Perhaps the storage vendors are just hoping that storage managers are size queens too.
Our government loses so many laptops, it’s kind of nice when the tables are turned once in a while. That said, a lot of people are talking about what’s purported to be a laptop belonging to the Sunni jihadist group ISIS.
You may recall that in 2011, when Osama bin Laden’s hideout was raided and he was killed, Navy SEALs retrieved several computers and storage devices. This is similar, except the laptop was captured in January by a moderate rebel group in northern Syria, from an ISIS hideout, according to Foreign Policy, which broke the story.
In fact, according to reporter Harald Doornbos, the laptop — which he dubbed the “terror laptop of doom” — was conveniently neither password-protected nor encrypted, and the material it contained was only nominally protected. “Buried in the ‘hidden files’ section of the computer were 146 gigabytes of material, containing a total of 35,347 files in 2,367 folders,” he writes. And what was in those files? Besides “videos of Osama bin Laden, manuals on how to make bombs, instructions for stealing cars, and lessons on how to use disguises,” it also contained what is said to be detailed information on how to weaponize bubonic plague for biological warfare.
Opinions vary on the veracity of the laptop (incidentally, apparently the laptop of choice for biological terrorists is a Dell, and it’s black). Some saw it as evidence that the U.S. should attack, and some — damn few, sadly — noted that aspects of the story lack credibility. Such as, really? You capture an enemy laptop in January and don’t look at it until months later, in front of a reporter?
Similarly, the reporter was criticized for looking at the laptop in the first place, as opposed to having it examined forensically, because running it would modify dates and other information that could be useful in determining its veracity, noted one online commenter, who went on to point out that similar “magic laptops” justifying conspiracy theories had been found in Colombia in 2008.
One of the less hysterical reactions — though it didn’t address the veracity issue — came from Outbreak News Today. Specializing in covering infectious diseases, the publication not only points out the long history of attempts to use plague in biological warfare — dating back to the 14th century — but also discusses the difficulties of actually pulling it off.
Other commenters noted that none of the information was particularly a smoking gun, that such information is readily available to anybody (Anarchist Cookbook, anyone?), and that perhaps the laptop was a plant intended to encourage a war. “How convenient!” posted one. “Just as the US have troubles coming up with a reasonable justification in international law for air strike operations, a laptop – luckily the one with all the plans – comes up.”
Perhaps we could call it the Zimmermann Laptop, after the Zimmermann Telegram — originally thought to be fake but later demonstrated to be true — which helped propel the U.S. entry into World War I.
As you may recall, a number of vendors are assembling monstrous amounts of storage by buying a lot of commodity hardware, stripping off everything extraneous, and then stuffing boxes full of the result. In addition, many of these vendors are also being nice and sharing information with us about their experiences.
Along with companies such as Facebook and Google, one of these companies is the online backup provider BackBlaze, which creates giant storage “pods” out of commodity disk drives. It both uses them itself and sells them to other companies, such as Netflix. BackBlaze not only publishes details on how to build your own, but also reveals data about how well the various commodity disk drives work. This can be valuable best practices information for any company.
Consequently, BackBlaze periodically creates a new pod with a new type of disk drive just to check it out, and that’s where we are today. As of February 2013, the company was building its pods with 4TB disk drives, which meant the pods could store up to 180TB. Recently, however, the company has started testing pods with 6TB drives, which not only means a pod can now store 270TB — half as much again, in the same space — but also gives the company a chance to check out the new models of 6TB drives for when prices drop enough later. So far there’s one pod with Western Digital drives, and the company is planning to build another with Seagate drives.
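The capacity math is straightforward once you know that a BackBlaze pod holds 45 drives (a figure from the company’s published pod designs); a quick sketch:

```python
# Raw pod capacity is just drives-per-pod times drive size.
# 45 drives per pod is the figure from BackBlaze's published pod designs.

DRIVES_PER_POD = 45

def pod_capacity_tb(drive_size_tb: int) -> int:
    """Raw capacity in TB of a pod filled with one drive size."""
    return DRIVES_PER_POD * drive_size_tb

print(pod_capacity_tb(4))  # the 180TB pods built since February 2013
print(pod_capacity_tb(6))  # the new 270TB test pods
print(pod_capacity_tb(8))  # a 360TB pod, once 8TB drives go in
```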
There are a couple of interesting nuances to the BackBlaze experiment, which, as usual, it details in a blog post.
First, let’s talk about electricity. The 6TB drives use less power than the 4TB ones. Moore’s Law FTW. However, BackBlaze pays a flat rate per rack for electricity rather than using metered electricity, so the savings won’t show up directly on its bill; what it gets instead is a more regular expense flow. This might be something for other businesses to consider, if their power companies offer it.
Second, let’s talk about cost. BackBlaze explains that it typically upgrades to a new size of disk drive when the price differential between the two sizes drops to half a cent per gigabyte. Currently, the differential between 4TB and 6TB disk drives is a nickel per gigabyte, which is why the company is only testing the 6TB drives and not switching to them. However, due to its experience in the industry, it has a fairly good idea of how storage prices change, and that they decrease on a fairly regular curve, meaning the company can already predict that it’s likely to be able to switch to 6TB drives in early 2015.
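That upgrade rule reduces to a simple calculation. The drive prices below are made up for illustration; only the half-cent threshold and the current nickel-per-gigabyte gap come from BackBlaze’s post.

```python
# BackBlaze's stated rule of thumb: switch to the bigger drive once the
# price-per-gigabyte differential drops to half a cent ($0.005/GB).
# Drive prices here are hypothetical, for illustration only.

UPGRADE_THRESHOLD = 0.005  # dollars per gigabyte

def price_per_gb(price_dollars: float, size_tb: int) -> float:
    """Cost per gigabyte, treating 1TB as 1000GB."""
    return price_dollars / (size_tb * 1000)

def should_upgrade(old_price: float, old_tb: int,
                   new_price: float, new_tb: int) -> bool:
    """True once the per-GB premium of the new drive is within threshold."""
    diff = price_per_gb(new_price, new_tb) - price_per_gb(old_price, old_tb)
    return diff <= UPGRADE_THRESHOLD

# A hypothetical $140 4TB drive ($0.035/GB) vs. a $510 6TB drive
# ($0.085/GB): a nickel-per-GB gap, so keep testing, don't switch yet.
print(should_upgrade(140, 4, 510, 6))
```

Once the hypothetical 6TB price falls to around $240 (a 0.5-cent gap against that 4TB drive), the same function flips to True — the early-2015 crossover the company is predicting from its pricing curves.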
In the meantime, BackBlaze is testing various vendors’ 6TB drives so that by the time prices do reach that point, it will know which disk drives are faster, use less power, fail less often, and so on.
On the other hand, Seagate is shipping 8TB drives this month, which means BackBlaze already has a new kind of disk drive to test — one that would make a 360TB pod. No word yet on how much that will cost.
Disclaimer: I am a BackBlaze customer.
Now we’re finding it’s true of email too.
(Stipulated: Child pornography is bad. Moving on.)
John Henry Skillern, of Texas, was arrested earlier this month for child pornography in his Gmail account, after Google alerted police. Police then came with a warrant, searched his home, found other evidence, and arrested him.
Wait a minute. How did Google know? Doesn’t the company talk all the time about how it doesn’t really look at the content of our email? That it’s just looking for keywords so it can sell ads?
It works like this. Really, there aren’t that many newly generated child porn images out there; old ones keep getting sent around. As we’ve written about before, companies such as Dropbox, Facebook, and LinkedIn have a database of known child pornography images that have been hashed — reduced by an algorithm to a much smaller fingerprint. This is helped by an organization called the Internet Watch Foundation, which is co-funded by Google. Those companies compare the hashes of files being sent or stored against the database of child pornography hashes and look for a match. It saves time, it saves space, and it means the companies don’t need to keep a database of eeeeevil pictures around for comparison.
It’s also why you don’t get arrested for sending a picture of your kid in the bathtub — because that picture isn’t in the database.
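A minimal sketch of how that matching works, using a plain cryptographic hash for simplicity. Real systems use perceptual hashes (such as Microsoft’s PhotoDNA) that still match after an image is resized or re-encoded, where a plain SHA-256 only matches byte-identical files.

```python
import hashlib

# The provider keeps only hashes of known-bad images, never the images
# themselves. (The bytes below are placeholders, obviously.)
known_bad_hashes = {
    hashlib.sha256(b"known contraband image bytes").hexdigest(),
}

def flag_attachment(file_bytes: bytes) -> bool:
    """Return True if the file's hash matches the known-bad database."""
    return hashlib.sha256(file_bytes).hexdigest() in known_bad_hashes

print(flag_attachment(b"known contraband image bytes"))      # True: in the database
print(flag_attachment(b"photo of your kid in the bathtub"))  # False: not in it
```

Note that the lookup never inspects what a picture depicts — it only asks whether the fingerprint is already on the list.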
Turns out that Google is doing this searching with Gmail as well. It claims, in fact, that it is required by U.S. law to do so. Sort of. It is required by law to notify the National Center for Missing and Exploited Children if it finds people sending child pornography. It is less clear whether it is required to search for them doing so.
Either way, what’s the problem? If you don’t send out or receive child pornography in your Gmail, you don’t have to worry, right?
First of all, this incident raises the question of what else Google (and, presumably, other email providers such as Microsoft and Yahoo!, according to CNN) looks for in our email. What else might they be willing to turn over to the police or other government agencies?
Google claims that it doesn’t do this for anything other than child pornography. “[I]f you’re Gchatting with a friend about buying marijuana, Google doesn’t want you to worry about being turned in,” writes CNN. But according to the legal expert CNN consulted, there was no reason Google couldn’t do that — it’s right in the terms of service.
“This kind of search technique can’t be easily translated to other crimes,” Business Insider reassures us blithely. “It’s not the same as a keyword search looking for words like ‘murder,’ ‘killed,’ ‘stolen’ or ‘bomb.’ Think how many times people use those words innocently.”
On the other hand, as you may recall, Dropbox used a similar method of storing files to eliminate duplicates — by hashing them to see if a file was already stored online, and if it was, putting in a pointer rather than another copy of the file. But that would also make it easy for a law enforcement organization to determine whether a person was storing copyrighted material, such as movies, in their accounts — just create a similar hash database of popular movies and television programs. The same could be done for music files.
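That deduplication scheme is easy to sketch. This is an illustration of the general technique, not Dropbox’s actual implementation:

```python
import hashlib

# Hash-based deduplication: store one physical copy per unique file,
# and give each user file a "pointer" (the hash) into that store.

blob_store = {}   # hash -> file bytes (one physical copy per unique file)
user_files = {}   # (user, filename) -> hash (per-user pointer)

def upload(user: str, filename: str, data: bytes) -> bool:
    """Store a file; return True if it was a duplicate (no new blob kept)."""
    digest = hashlib.sha256(data).hexdigest()
    duplicate = digest in blob_store
    if not duplicate:
        blob_store[digest] = data
    user_files[(user, filename)] = digest
    return duplicate

upload("alice", "movie.mp4", b"popular movie bytes")
print(upload("bob", "same-movie.mp4", b"popular movie bytes"))  # True: only a pointer stored
print(len(blob_store))                                          # still one physical copy
```

Swap in a hash database of popular movies and the very same lookup would flag who is storing copyrighted files — which is exactly the concern.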
Not to mention anything considered to be terrorist activity, which is right up there with child pornography in terms of the throw-your-civil-liberties-out-the-window card. Or, as GigaOm suggests, fraud or illegal drugs.
Second, just how automated is this process? If someone receives a child pornography file through email that they didn’t want and didn’t ask for, how likely is it that the email provider is going to turn them in? What a great way to take care of your enemies!
Third, what other online products do this? If someone Googles “child pornography,” is this going to come back to haunt them later? The CNN piece indicated that it applied to search as well. What if they’re doing research for an article or a blog post?
Hypothetically, of course.