Now that Apple and Google have announced that they will incorporate encryption in smartphones by default, the question is how long law-abiding Americans will be allowed to continue to have encryption at all.
In case you missed it, Apple announced on September 17 that future editions of the iPhone would have encryption turned on by default in a way that no longer allows Apple to have access to encrypted data. “Apple’s new move is interesting and important because it’s an example of a company saying they no longer want to be in the surveillance business — a business they likely never intended to be in and want to get out of as quickly as possible,” writes Chris Soghoian, Principal Technologist and Senior Policy Analyst for the American Civil Liberties Union’s Speech, Privacy, and Technology Project. “Rather than telling the government that they no longer want to help the government, they re-architected iOS so they are unable to help the government.” The following day, Google announced that future versions of the Android smartphone operating system would have encryption turned on by default as well.
Predictably, the FBI and law enforcement had kittens. “The notion that someone would market a closet that could never be opened – even if it involves a case involving a child kidnapper and a court order – to me does not make any sense,” said FBI director James Comey. He went on to invoke the specter of the terrorism that could surely befall the U.S. when this happens. “Two big tech providers are essentially creating sanctuary for people who are going to do harm,” Ron Hosko, a former assistant director of the FBI’s criminal investigative division, told Marketplace. And pulling out the big guns, “Apple will become the phone of choice for the pedophile,” John J. Escalante, chief of detectives for Chicago’s police department, told the Washington Post. “The average pedophile at this point is probably thinking, I’ve got to get an Apple phone.”
Yep, terrorism, kidnapping, and pedophiles. They got the trifecta.
Company executives at Apple and Google told the New York Times that the government had essentially brought this on itself with incidents like the Edward Snowden revelations, and that it was increasingly difficult for American companies to compete overseas given the perception that the U.S. government had its fingers in everything.
“The head of the FBI and his fellow fear-mongerers are still much more concerned with making sure they retain control over your privacy, rather than protecting everyone’s cybersecurity,” writes Trevor Timm, in the U.K. paper The Guardian, after offering a line-by-line critique of Comey’s statement. Security experts pointed out that the government still had many other options by which it could legally request access to people’s electronic data, that the FBI didn’t cite any examples of cases where encryption would have prevented them from solving a case, and that one case cited by Hosko turned out to be irrelevant.
So the next question becomes, at what point might the federal government attempt to outlaw encryption again? Or mandate a back door?
In case you were born sometime after MTV, at one point in time, anything more powerful than 40-bit encryption was actually classified as a munition — you know, like bombs and missiles — and illegal to export from the U.S., and not all that easy to get hold of even inside the U.S. In fact, a guy named Philip Zimmermann got himself into a peck of trouble when he developed Pretty Good Privacy (PGP), intended to be Everyman’s data encryption. While it was fine inside the U.S., copies of it surfaced internationally, and for several years it looked like Zimmermann might face charges, which made him a cause célèbre in the computer community.
In 1993, the Bill Clinton White House went further and proposed the Clipper Chip, an encryption system that included a back door so that law enforcement organizations could still read any data encrypted by the device. Which, of course, they’d only use if you were a bad guy. But by 1996, partly due to the enormous wave of protest against the notion — and partly due to technical issues, such as bugs that were found in it (by a guy named Matt Blaze, who’s still around these days, commenting on the Apple/Google encryption flap) — the government had dropped the project. At the same time, the Clinton White House relaxed the rules on greater-than-40-bit encryption.
These days, encryption is readily available, but generally you have to know about it and how to turn it on. What Apple and Google are doing is selling devices with it already turned on — and, in response to the increasing number of requests from the government for user data, they no longer will even have access to the user’s data.
(This does, of course, mean that if you lose your encryption key, you’re hosed.)
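To make that concrete, here is a toy Python sketch of why losing the key means losing the data. Emphatically, this is not real disk encryption (devices use vetted ciphers like AES); the homemade XOR keystream below is just an illustrative stand-in for the principle that ciphertext without the key is noise.

```python
import hashlib
import os

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudo-random stream from the key and a per-message nonce.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = os.urandom(16)
    stream = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ s for p, s in zip(plaintext, stream))

def toy_decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:16], blob[16:]
    stream = _keystream(key, nonce, len(ciphertext))
    return bytes(c ^ s for c, s in zip(ciphertext, stream))

blob = toy_encrypt(b"the right key", b"my contacts and photos")
print(toy_decrypt(b"the right key", blob))  # b'my contacts and photos'
print(toy_decrypt(b"a wrong key", blob))    # unreadable bytes; the data is gone
```

With the wrong key (or no key), all anyone recovers is gibberish, which is exactly the property the FBI is objecting to.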
So what other effects can we expect from the Apple/Google decision?
- Courts are still trying to figure out whether an encryption key is like a key or a combination to a safe — something you have or something you know — so they can decide whether you have to give it up. So law enforcement organizations are still taking people to court to force them to reveal encryption keys, and sometimes they win. Conceivably, with encryption turned on by default, this could happen a lot more often.
- Having encryption be the default could also eliminate the “why are you encrypting it if you don’t have anything to hide?” presumption.
- Of course, bad guys have pretty much figured encryption out, even when it’s not the default. To be blunt, Apple and Google’s actions simply mean that regular people will have the same capabilities as the bad guys. And in an era where companies and individuals alike are regularly losing laptops, disk drives, and smartphones with personal data in them — and then getting fined for losing the data and having it not encrypted in the first place — having encryption as the default simply makes sense.
So what would happen if the government were to outlaw encryption, or mandate a back door? The Electronic Frontier Foundation, which lives for this sort of thing, has a nice long list of possible repercussions.
Realistically, though, would outlawing encryption even be practical in this day and age? Look at it this way — if encryption were made illegal, that would mean that all the personal data on all the devices that get lost or stolen would then be accessible. It would make the Target incident look like a picnic. (The ACLU’s Soghoian pointed out that in 2011, the FBI was encouraging people to encrypt their data to keep it out of the hands of criminals.)
And everyone who believes that bad guys wouldn’t continue to be able to use encryption or would have to have a back door to their communications, please poke out your right eye. In the same way that bad guys continue to get access to illegal firearms today, bad guys would still get access to encryption, one way or another. Sorry, FBI, but that genie is out of the bottle.
It isn’t clear whether the government is going to try to outlaw encryption again, or try to mandate a law enforcement back door. There is some talk about Congress enacting a law, but due to the Edward Snowden revelations, few Congressional representatives want to touch it, according to Bloomberg Businessweek. Still, it’s something we need to watch out for — but it looks like computer vendors are increasingly unlikely to help, according to Iain Thomson in the Register. “It’s unlikely Apple or Google is going to bow down to the wishes of government and install backdoors in their own products,” he writes. “This would be disastrous to sales if found out, and there are increasing signs that the tech sector is gearing up for a fight over the issue.”
You may recall that an activist investor called Elliott Management Corp. bought a $1 billion chunk of EMC earlier this summer — about two percent, big enough that it could start throwing its weight around and make suggestions about how EMC could make even more money for it, such as by selling VMware. EMC CEO and chairman Joe Tucci, who’s scheduled to retire in February for the nth time, said he’d meet with Elliott but wouldn’t make any promises.
Now it’s starting to look as though the EMC we all know and love may not survive Tucci’s departure.
As nearly everyone suspected, it all started out with Tucci saying he wouldn’t sell VMware, of which EMC owns 80 percent. Why exactly he’s so enamored of the virtualization company isn’t clear, other than the fact that it in turn accounts for up to 75 percent of EMC’s value these days.
Instead, it’s looking like Tucci would rather sell EMC itself — or, at least, break it up and sell some of the parts. In the process, the storage company is being partnered (or, as my teenage daughter would say, “shipped,” as in “relationship”) with just about every other major computer vendor out there. And some of them are willing and some of them aren’t.
Cisco. While there was talk about a potential merger with Cisco, Cisco Chief Executive Officer John Chambers himself said there was nothing to it, and if there had been anything, it would have been a year ago.
HP. Because, after all, HP acquisitions always go so well. Look at Autonomy. And Compaq. Anyway, reportedly the notion of a “merger of equals” with HP went pretty far, with the notion that HP’s Meg Whitman would be CEO and Tucci would be chair, “but there were disagreements over price and the next layer of management,” according to Bloomberg. Incidentally, according to the Wall Street Journal, EMC and HP had been talking for the past year. The merged companies would be bigger than IBM, notes Business Insider. But it isn’t clear what HP would gain from the deal, according to Mike Wheatley of Silicon Angle. “Its storage organization is unlikely to relish the prospect of taking on EMC’s people and products, and EMC’s VNX and VMAX arrays compete with HP’s 3PAR products,” he writes. “In addition, HP’s various object, backup and virtual SAN products all overlap with equivalents from EMC.”
Dell. In addition to HP, EMC had also been talking to Dell about buying at least part of the company, the Journal added. But sites such as Silicon Angle are dubious due to the disparity in size between the two companies, as well as overlap between their product lines. “For example, Dell’s Compellent and EqualLogic storage arrays compete with EMC’s VNX line,” Wheatley writes.
Oracle. Forbes’ Peter Cohan is pushing Oracle, noting that the company would likely pay more than HP and that Oracle doesn’t have much of a storage presence. This was actually rumored in 2010. But some other analysts now find it unlikely.
IBM or SAP. These companies, too, were mentioned by Forbes, but IBM probably doesn’t have the budget for it and SAP probably wouldn’t make such a big non-European acquisition, writes Sarah Cohen.
All of this is in the context of whether massive storage array companies like EMC even have a future in an era of cloud computing. Yes, mainframe company IBM survived the PC era, but most of the other mainframe companies didn’t, and it’s speculated that EMC might have similar trouble surviving the cloud era. “The question is why anyone would want EMC gear at all,” Jon Toigo, of IT consultancy Toigo Partners, told Silicon Angle. “With VMware and Microsoft pushing server-side direct attached storage topologies to replace centralized SAN storage, I think of the monolithic storage products of EMC as niche players in a shrinking market.”
In fact, Arik Hesseldahl of Re/code thinks that EMC has already run out of potential suitors, and nothing is going to happen at this point. Bloomberg thinks that EMC should be acquiring more companies, and helpfully provided a laundry list.
In any event, EMC stock has been hitting new one-year highs, which has to make Elliott happy.
EMC has traditionally been a fairly drama-free company, but it looks like it’s making up for it now.
It’s said that one of the major advantages of cloud storage is that you can add storage on the fly as you need it. But what if it went further? What if cloud storage were even more disintermediated than it is today?
An electric utility, for example, typically offers power generated several different ways, with each way having a particular cost associated with generating it and a certain amount of time it takes to crank it up and shut it down again. If the utility suddenly needs a lot of power, it can buy power – typically more expensive – on the spot market, but it ensures that the utility doesn’t have a brownout or a blackout. Having a utility also means that individuals and businesses don’t have to worry about generating their own power and making sure they have enough when they need it; they just trust the utility to provide it.
So what if cloud storage were like that? What if you just used it as you needed it, rather than buying it and holding it, and you contracted with a provider who might get it from Amazon, Microsoft, or some other vendor at any moment, depending on who had it available at a particular time?
That’s a notion being discussed in connection with a couple of recent announcements. First is the announcement by Accellion that its kiteworks content connectors would now include Box and Dropbox, as well as Google Drive and Microsoft OneDrive. Kiteworks reportedly provides the management capabilities, such as presenting the same interface regardless of which cloud service is being used, as well as security settings and access control. Second is the announcement that cloud company 6Fusion had signed a deal with the Chicago Mercantile Exchange, after originally signing a non-binding agreement last fall, and is expected to offer a beta product later this year.
“If all works out, the deal will mean that buyers and sellers of cloud computing services can do business on a spot exchange and, in a few years, trade derivatives too,” writes Jeff John Roberts at GigaOm. “The exchange will be a place to buy hours of ‘WAC,’ a term invented by 6Fusion that stands for Workload Allocation Cube. The idea behind the WAC is to create a standard unit of cloud computing infrastructure that can be bought and sold by the thousands.”
Basically, 6Fusion hopes that the WAC will become the cloud storage equivalent of the watt of power or the barrel of oil, Roberts writes. That is, of course, predicated on whether the rest of the industry accepts it, he warns.
Eventually, users and organizations might not even know which company is providing their storage, in the same way that we aren’t typically aware of whether our power is coming from hydroelectric, natural gas, coal, solar, or other sources – which will be easier with developments like Accellion’s, which provide the same interface to multiple providers’ storage. Moreover, because 6Fusion is working with the commodities markets, people could invest in “storage futures,” in the same way that they buy pork bellies now.
“The IT infrastructure of a company like JP Morgan could soon consist of private cloud servers for sensitive data, supplemented by public cloud supplies purchased from an ever-changing roster of third party cloud computing providers,” Roberts continues. “At the same time, such purchases of cloud computing ‘by the bushel’ would also mean lower prices as traders, rather than vendors, start to set the price of key ingredients of IT infrastructure.”
The concept isn’t exactly new; Forbes points out that such “spot markets” have been discussed and tried before, and failed. Even 6Fusion itself had been talking about the notion publicly since last spring.
Just think – if they expand it to compute power itself, they could reinvent timesharing.
Okay, it’s official: Storage vendors are all a big bunch of size queens.
(This is probably a good time to reminisce about how my first computer had a hard disk as an add-on that was the size of the computer itself and cost the same amount — for 10MB. Yes, I’m old, whippersnapper, and get off my lawn.)
The Western Digital drive, like the Seagate drive, uses shingled magnetic recording technology, which puts more data in the same space, though it is slower. In fact, Western Digital comes right out and says the drive is mainly intended for “cold storage” facilities.
That means “slow,” because you put stuff in cold storage that you don’t need all the time, and so you spin down the drives rather than keep them running all the time because it saves energy. So every time you retrieve something from cold storage, you have to go kick the drive to start it up again. It’s like keeping the beer in the fridge out in the garage. You can put a lot more beer out there, but you have to traipse out to the garage every time you want a beer.
There’s one big difference between the 8TB Seagate and the Western Digital Ultrastar He10 — well, besides the size. The Western Digital 10TB drive uses the helium technology it first announced last November for its 6TB drive, while the Seagate drive doesn’t. (Hence the He in the product name.)
We don’t know how much it’s going to cost. Western Digital claimed it would have the lowest cost per gigabyte in the industry, but didn’t release actual prices. Some sources, such as ExtremeTech, were dubious about this. That said, the helium-filled 6TB drives cost from $449.99 to $899.99, depending on the source, compared with commodity-level 6TB drives costing $275, according to Backblaze.
(Can I just say how awesome it is to speak of “commodity-level 6TB drives.” What a world we live in.)
Similarly, the 512GB SanDisk Extreme PRO SDXC UHS-I memory card comes at a premium price: $800. (Though you can get it for $729 some places.) Considering that you can get 32GB cards for 50 cents a gigabyte, that may seem a little high. But the card is aimed at high-end audio-video people for things like recording 4K video, so they don’t have to swap storage in and out so often.
This is not intended for Dad recording the kids opening presents at Christmas, is what I’m saying.
But at some point we do have to raise the question: Just how big should a single storage device be? I’m not talking about “10MB was good enough for me and it should be good enough for you kids!” I’m thinking about things like reliability. How do you back the things up? And, more to the point, how much data are you willing to lose? When’s the last time you accidentally laundered an SD card you left in your pants pocket?
Lossage is particularly an issue for the helium-filled drives. Yes, yes, we know they’re guaranteed for five years, but getting your $499 back is going to be cold comfort if in the process you’ve lost $10,000 worth of data. The 6TB devices have been out for only a few months, and it’s not clear just what the MTBF is going to be for them. Even the MIT Technology Review said last year that it would probably take a year before anyone had any idea of how well they’d last.
Sadly, the world of disk drive testing isn’t what it used to be. As we’ve mentioned before, weirdly, some of the best disk drive testing is done by BackBlaze, but since these helium-filled drives aren’t commodity items it’s unlikely that BackBlaze is testing them yet. We can hope that CNET or ZDNet happened to buy a few of them, set them up in a corner somewhere, and is preparing to give us a great review in a couple of months about just how long we can expect the helium to last, but personally I’m dubious.
One can argue that if the 10TB drives are used in cold storage, you don’t really have to worry that much, because chances are that archived data is stored in multiple places anyway. Well, perhaps, but that raises the question of what the point is. Helium-filled drives cost almost twice as much as commodity non-helium drives, so will a helium-filled drive that’s almost twice as big cost almost four times as much? If so, at what point does the reduction in maintenance and infrastructure make it worthwhile? ExtremeTech points out that according to Western Digital, the power consumption of the drives is 23 percent less than its air-filled drives, and if you have a big enough data center, that would certainly add up, but you’d really want to know how long they’re going to last.
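To see how that 23 percent claim might add up, here is a back-of-the-envelope sketch. Every input below except the 23 percent reduction (which is Western Digital's claim) is an assumption chosen purely for illustration: the drive count, the per-drive wattage, and the electricity price are all hypothetical.

```python
# Back-of-envelope: what a 23% drive power saving is worth at data-center scale.
drives = 10_000           # assumed number of drives in the data center
watts_per_drive = 7.0     # assumed draw of a conventional air-filled drive
reduction = 0.23          # Western Digital's claimed reduction for helium drives
price_per_kwh = 0.10      # assumed electricity price, USD

watts_saved = drives * watts_per_drive * reduction            # 16,100 W
kwh_per_year = watts_saved * 24 * 365 / 1000                  # convert to kWh
dollars_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:,.0f} kWh/year, about ${dollars_per_year:,.0f}/year")
```

Roughly $14,000 a year on these made-up numbers, and that's before counting the cooling load that those watts would otherwise generate. Whether that beats the price premium depends entirely on how long the drives actually last.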
Perhaps the storage vendors are just hoping that storage managers are size queens too.
Our government loses so many laptops, it’s kind of nice when the tables are turned once in a while. That said, a lot of people are talking about what’s purported to be a laptop belonging to the Sunni jihadist group ISIS.
You may recall in 2011, when Osama bin Laden’s hideout was raided and he was killed, that Navy SEALs retrieved several computers and storage devices. This is similar, except the laptop was captured in January by a moderate rebel group in northern Syria, from an ISIS hideout, according to Foreign Policy, which broke the story.
In fact, according to reporter Harald Doornbos, the laptop — which he dubbed the “terror laptop of doom” — was conveniently neither password-protected nor encrypted, and the material it contained was only nominally protected. “Buried in the ‘hidden files’ section of the computer were 146 gigabytes of material, containing a total of 35,347 files in 2,367 folders,” he writes. And what was in those files? Besides “videos of Osama bin Laden, manuals on how to make bombs, instructions for stealing cars, and lessons on how to use disguises,” it also contained what is said to be detailed information on how to weaponize bubonic plague for biological warfare.
Opinions vary on the veracity of the laptop (incidentally, apparently the laptop of choice for biological terrorists is Dell, and it’s black). Some found it evidence that the U.S. should attack, and some — damn few, sadly — found that there are aspects of the story that lack credibility. Such as, really? You capture an enemy laptop in January and don’t look at it until months later, in front of a reporter?
Similarly, the reporter was criticized for booting up the laptop in the first place, as opposed to having it examined forensically, because running it would modify dates and other information that could be useful in determining its veracity, noted one online commenter, who went on to point out that similar “magic laptops” justifying conspiracy theories had been found in Colombia in 2008.
One of the less hysterical reactions — though it didn’t address the veracity issue — is from Outbreak News Today. Specializing in covering infectious diseases, the publication not only points out the long history of attempts to use plague in biological warfare — dating back to the 14th century — but also talks about the difficulties of actually pulling it off.
Other commenters noted that none of the information was particularly a smoking gun, that such information is readily available to anybody (Anarchist Cookbook, anyone?), and that perhaps the laptop was a plant intended to encourage a war. “How convenient!” posted one. “Just as the US have troubles coming up with a reasonable justification in international law for air strike operations, a laptop – luckily the one with all the plans – comes up.”
Perhaps we could call it the Zimmermann Laptop, after the Zimmermann Telegram — originally thought to be fake but later demonstrated to be true — which helped propel the U.S. entry into World War I.
As you may recall, a number of vendors are building monstrously large systems by buying a lot of commodity hardware, stripping off everything extraneous, and then stuffing boxes full of it. In addition, many of these vendors are also being nice and sharing information with us about their experiences.
Along with companies such as Facebook and Google, one of these companies is the online backup provider BackBlaze, which creates giant storage “pods” out of commodity disk drives. It both uses them itself and sells them to other companies, such as Netflix. BackBlaze not only publishes details on how to build your own, but also reveals data about how well the various commodity disk drives work. This can be valuable best practices information for any company.
Consequently, BackBlaze periodically creates a new pod with a new type of disk drive just to check it out, and that’s where we are today. As of February 2013, the company was building its pods with 4TB disk drives, which meant the pods could store up to 180TB. Recently, however, the company has started to test pods with 6TB drives, which not only means a pod can now store 270TB — half as much again, in the same space — but gives the company a chance to check out the new models of 6TB drives, for when prices drop enough. So far there’s one pod with Western Digital drives, and the company is planning to build another pod with Seagate drives.
There’s a couple of interesting nuances to the BackBlaze experiment, which, as usual, it details in a blog post.
First, let’s talk about electricity. The 6TB drives use less power than the 4TB ones. Moore’s Law FTW. However, BackBlaze pays a flat rate per rack for electricity rather than using metered electricity, so that it has a more regular expense flow. This might be something for other businesses to consider, if their power companies offer it.
Second, let’s talk about cost. BackBlaze explains that it typically upgrades to a new size of disk drive when the price differential between the two sizes drops to half a cent per gigabyte. Currently, the differential between 4TB and 6TB disk drives is a nickel per gigabyte, which is why the company is only testing the 6TB drives and not switching to them. However, due to its experience in the industry, it has a fairly good idea of how storage prices change, and that they decrease on a fairly regular curve, meaning the company can already predict that it’s likely to be able to switch to 6TB drives in early 2015.
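That upgrade rule reduces to a one-line price-per-gigabyte comparison. The drive prices below are hypothetical stand-ins, not Backblaze's actual figures; only the half-cent threshold and the current "nickel per gigabyte" gap come from the post.

```python
def price_per_gb(price_usd: float, capacity_tb: float) -> float:
    # Using the drive-vendor convention of 1 TB = 1000 GB.
    return price_usd / (capacity_tb * 1000)

UPGRADE_THRESHOLD = 0.005  # Backblaze upgrades at half a cent per GB

# Hypothetical prices chosen to reproduce the nickel-per-GB gap in the post.
p4 = price_per_gb(160.0, 4)  # $0.04/GB for a 4TB drive
p6 = price_per_gb(540.0, 6)  # $0.09/GB for a 6TB drive

differential = p6 - p4
print(f"differential: {differential * 100:.1f} cents/GB")  # differential: 5.0 cents/GB
print("upgrade" if differential <= UPGRADE_THRESHOLD else "keep testing")
```

Since storage prices fall on a fairly predictable curve, plugging the curve into this comparison is how Backblaze can already forecast an early-2015 crossover.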
In the meantime, BackBlaze is testing various vendors’ 6TB drives so that by the time prices do reach that point, it will know which disk drives are faster, use less power, fail less often, and so on.
On the other hand, with Seagate shipping 8TB drives this month, BackBlaze already has a new kind of disk drive to test — which would make for a 360TB pod. No word yet on how much it will cost.
Disclaimer: I am a BackBlaze customer.
Now we’re finding it’s true of email too.
(Stipulated: Child pornography is bad. Moving on.)
John Henry Skillern, of Texas, was arrested earlier this month for child pornography in his Gmail account, after Google alerted police. Police then came with a warrant, searched his home, found other evidence, and arrested him.
Wait a minute. How did Google know? Doesn’t the company talk all the time about how it doesn’t really look at the content of our email? That it’s just looking for keywords so it can sell ads?
It works like this. Really, there aren’t that many newly generated child porn images out there; old ones keep getting sent around. As we’ve written about before, companies such as Dropbox, Facebook, and LinkedIn have a database of known child pornography images that have been hashed, or reduced by algorithm to a much smaller signature. This is helped by an organization called the Internet Watch Foundation, which is co-funded by Google. Those companies compare the hashes of files being sent or stored with the database of child pornography hashes, and look for a match. It saves time, it saves space, and it means the companies don’t need to keep a database of eeeeevil pictures around for comparison.
It’s also why you don’t get arrested for sending a picture of your kid in the bathtub — because that picture isn’t in the database.
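The matching step described above amounts to a set-membership test on hashes. The sketch below uses SHA-256 purely for simplicity; real systems use perceptual hashes (such as Microsoft's PhotoDNA) that also match resized or re-encoded copies, and the hash list itself is supplied by organizations like the Internet Watch Foundation, not built in-house.

```python
import hashlib

# Hypothetical known-bad hash set. This entry happens to be sha256(b"test"),
# standing in for a hash supplied by a clearinghouse organization.
known_bad_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_flagged(attachment: bytes) -> bool:
    """Hash the attachment and check it against the known-bad set."""
    digest = hashlib.sha256(attachment).hexdigest()
    return digest in known_bad_hashes

print(is_flagged(b"test"))             # True: this content is in the database
print(is_flagged(b"kid in bathtub"))   # False: novel content never matches
```

Note that the provider never "looks at" the image in any human sense; it only learns whether the hash is on the list, which is also why a brand-new photo can't trigger a match.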
Turns out that Google is doing this searching with Gmail as well. It claims, in fact, that it is required by U.S. law to do so. Sort of. It is required by law to notify the National Center for Missing and Exploited Children if it finds people sending child pornography. It is less clear whether it is required to search for them doing so.
Either way, what’s the problem? If you don’t send out or receive child pornography in your Gmail, you don’t have to worry, right?
First of all, this incident raises the question of what else Google (and, presumably, other email providers, such as Microsoft and Yahoo!, according to CNN) looks for in our Gmail. What else might they be willing to turn over to the police or another government agency?
Google claims that it doesn’t do this for anything other than child pornography. “If you’re Gchatting with a friend about buying marijuana, Google doesn’t want you to worry about being turned in,” writes CNN. But according to the legal expert CNN consulted, there was no reason Google couldn’t do that — it’s right in the terms of service.
“This kind of search technique can’t be easily translated to other crimes,” Business Insider reassures us blithely. “It’s not the same as a keyword search looking for words like ‘murder,’ ‘killed,’ ‘stolen’ or ‘bomb.’ Think how many times people use those words innocently.”
On the other hand, as you may recall, Dropbox used a similar method of storing files to eliminate duplicates — by hashing them to see if a file was already stored online, and if it was, putting in a pointer rather than another copy of the file. But that would also make it easy for a law enforcement organization to determine whether a person was storing copyrighted material, such as movies, in their accounts — just create a similar hash database of popular movies and television programs. The same could be done for music files.
Not to mention anything considered to be terrorist activity, which is right up there with child pornography in terms of the throw-your-civil-liberties-out-the-window card. Or, as GigaOm suggests, fraud or illegal drugs.
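The Dropbox-style deduplication that makes such matching trivial can be sketched in a few lines. The storage layout below is a simplification for illustration, not Dropbox's actual design: one blob store keyed by content hash, and per-user entries that are just pointers to it.

```python
import hashlib

store: dict[str, bytes] = {}     # blob store keyed by content hash
user_files: dict[str, str] = {}  # "user/filename" -> content hash (the pointer)

def upload(name: str, data: bytes) -> bool:
    """Store a file; return True if these exact bytes were already on the server."""
    digest = hashlib.sha256(data).hexdigest()
    already_stored = digest in store
    if not already_stored:
        store[digest] = data     # first copy: actually keep the bytes
    user_files[name] = digest    # every upload just records a pointer
    return already_stored

movie = b"...two gigabytes of some popular film..."
print(upload("alice/movie.mkv", movie))  # False: first copy is actually stored
print(upload("bob/movie.mkv", movie))    # True: dedup hit, only a pointer is kept
print(len(store))                        # 1: the bytes exist once on the server
```

The privacy rub is visible right in the design: the same lookup that saves Bob's upload bandwidth tells the operator, for any hash on a watch list, exactly which accounts hold that file.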
Second, just how automated is this process? If someone receives a child pornography file through email that they didn’t want and didn’t ask for, how likely is it that the email provider is going to turn them in? What a great way to take care of your enemies!
Third, what other online products do this? If someone Googles “child pornography,” is this going to come back to haunt them later? The CNN piece indicated that it applied to search as well. What if they’re doing research for an article or a blog post?
Hypothetically, of course.
Another judge has told Microsoft that it must release data stored on one of its servers to a U.S. government agency, even though the data in question is stored in Ireland, outside the U.S., setting the stage for a massive worldwide confrontation over just who has the right to access what data, and where.
The agency is unnamed, but the Wall Street Journal identifies it as the Justice Department, and several media reports indicate that the case has something to do with drugs.
This particular ruling is just another stepping stone on the path; according to the New York Times, the judge in question, Judge Loretta A. Preska of the United States District Court for the Southern District of New York, agreed to an immediate stay pending the next step in the appeals process, meaning nobody has to do anything yet except the lawyers.
And the lawyers are very busy.
“It is a question of control, not a question of the location of that information,” Preska said, according to newspaper reports. Because Microsoft could make a copy of the information in the Irish server from the United States, it doesn’t violate Ireland’s sovereignty, she ruled.
“She cited a 1984 case that held that a court may require a company to disclose its business records no matter where in the world they are, and that the disclosure did not require the consent of the country in which they are stored,” writes the Washington Post. “She said that Congress was aware of the 1984 case when it passed [the Electronic Communications Privacy Act], and so the law implicitly authorized the overseas reach.”
Microsoft is the point person here, but in a there-but-for-the-grace-of-God-go-I move, several other companies that have worldwide data centers have filed briefs with the court supporting the company, including Apple, AT&T, Cisco, and Verizon, according to the Times and the Guardian, as well as the nonprofit Electronic Frontier Foundation.
Aside from keeping several law firms afloat, the case could have other ramifications. First, it puts companies in the position of violating either European Union or U.S. law. “All of [this] puts Microsoft in a very difficult position,” explains Forbes. “If they obey the US order then they’re in breach of EU law, if they stick to EU law then they’re going to be in breach of this US order (for however long the order survives the appeals process).”
Second, it could scare worldwide companies away from using U.S. companies’ data centers to store their data, which could have a significant impact on those companies’ finances. Companies are worried they could lose billions of dollars in revenue to foreign competitors if customers fear their data is subject to seizure by US investigators anywhere in the world, the Guardian writes.
Third, it gives other countries the opportunity to come up with reasons why they should have access to data stored in the U.S. based on their laws. The U.K., for example, recently created the Data Retention and Investigatory Powers (DRIP) Act, which “requires internet and phone companies to collect their customers’ personal communication data, tracking their phone and internet use, and store it for 12 months to give access to the police, security services and up to 600 public bodies on request,” according to the Guardian.
The U.K. government also added a clause making it clear that foreign firms holding data on U.K. citizens can also be served with a warrant to hand over information, the Guardian writes.
Similarly, Microsoft’s attorney said that authorities in China had appeared at Microsoft offices there demanding a password to gain access to material that the company stores in the United States, according to the Associated Press.
All in all, it has the potential to create a terrible mess.
Black Hat is always a fun time to find out what new security vulnerabilities there might be to keep you up at night — particularly if you attend and get your system infected while you’re there — and this year is no exception. The conference will be held in Las Vegas next week and the online world is already atwitter, so to speak, about one of the presentations.
This is all according to the German security organization SR Labs, which is offering a presentation called “Bad USB — On Accessories That Turn Evil.” The organization released a preview of its presentation on its website.
According to the presentation, it’s possible to insert malware into the firmware of USB devices — that is, any USB device, including keyboards, cameras, and mice — to reprogram them and essentially turn them into a different kind of USB device. This would allow attackers to:
- Emulate a keyboard and issue commands on behalf of the logged-in user to steal files or install malware; such malware, in turn, can infect the controller chips of other USB devices connected to the computer
- Spoof a network card and change the computer’s DNS setting to redirect traffic
- Load a small virus at startup, which infects the computer’s operating system before it even boots
- Replace the computer’s BIOS
- Though the researchers don’t mention this one, presumably it could turn on the camera and spy on you or anything else in the room
Naturally, none of this is detectable. Virus scanners don’t work because they don’t look at device firmware. Beyond that, once a computer is infected, you can basically never trust it again, SR Labs researchers say, because any USB device that has ever been plugged into it could itself be infected, even if you reinstall the operating system.
The organization says it will be releasing unspecified “tools” on August 7, but whether these are tools to prevent this sort of attack or enable it, they don’t say. The session description, however, does seem to indicate that the researchers will be speaking about how to protect against such attacks, at least theoretically.
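SR Labs hasn’t said what form its protections take, but one defense often suggested against rogue USB peripherals is OS-level authorization: refuse to attach drivers to newly connected devices until someone explicitly approves them. A minimal sketch using the Linux kernel’s built-in USB authorization knobs in sysfs (the paths are real kernel interfaces; treat the specific policy shown here as a hypothetical illustration, not a complete fix):

```shell
# Run as root. For each USB host controller, stop authorizing
# newly attached devices automatically (0 = deny by default).
for host in /sys/bus/usb/devices/usb*; do
    echo 0 > "$host/authorized_default"
done

# Later, after physically verifying a device you trust (say it
# enumerated at bus path 1-2, per `lsusb -t`), allow it by hand:
echo 1 > /sys/bus/usb/devices/1-2/authorized
```

This doesn’t make infected firmware detectable; it only narrows the window in which an unexpected device — say, a flash drive that suddenly presents itself as a keyboard — can attach without anyone noticing.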
A Reuters article on the presentation attributes the vulnerability to a “bug,” but the SR Labs preview doesn’t make it sound like a bug is involved — simply that the firmware isn’t protected from being rewritten.
Karsten Nohl, chief scientist at SR Labs and one of the co-presenters, also told Reuters that he wouldn’t be surprised if organizations such as the NSA were already using this technique; the NSA declined to comment to Reuters.
Reuters said Nohl had demonstrated the attack with Google’s Android as well as with firmware on chips from Phison; Phison representatives said they didn’t think it was possible. Nohl also said he believed it would work with any vendor’s chips, not just Phison’s.
We’ve said before that it’s really not a good idea to pick up strange USB sticks and use them; it sounds like that’s particularly true now.
Particularly at Black Hat.
More than one person thought it was from the Onion: Police have trained dogs to smell out child pornography. But the truth is no laughing matter.
First of all, the dogs can’t smell pornography, child or otherwise. For heaven’s sake, writers. Have some credibility.
What the dogs (publicized in Connecticut and Rhode Island, thus far) have reportedly been trained to do is smell out storage devices, such as hard drives, memory cards, and USB sticks. Similarly, dogs have also been trained to find cell phones in prisons. And in response to media piracy, dogs were trained in 2006 to find DVDs and other recorded media, which the police would then seize and search to determine whether they were legal.
So the “pornography-sniffing dog” works like this: Police think a perp has child pornography on storage devices, bring in the dog, the dog finds storage devices “hidden” in a suspicious way, and that gives the police probable cause to seize the storage and search it. Because after all, if there was nothing creepy on it, why’d you hide it, punk?
(Slashdot commenters had fun with this story, suggesting ways to defeat the dog. “Get a lot of old flash drives, sd cards, and the like, the old super cheap ones of course, and stick them everywhere,” writes one. “Under the carpet, taped to the bottom of the drawers, in the hem of the curtain, etc. After 30 or 40 of them, somebody is going to get sick of playing that game, and it might be the dog.”)
Stipulated: child pornography is bad, and we don’t want people to do it. Stipulated also, most police officers and prosecutors genuinely want to just catch bad guys and be on the right side of the law. That said, we’ve already written about how child pornography seems to be a Get Out of the Fourth Amendment Free card for some people. And this is a particularly egregious example.
Let’s start with the fact that drug-sniffing dogs, from which this is the logical extension, and their handlers have been implicated in some pretty interesting Fourth Amendment cases. In February 2013, the Supreme Court ruled that searches based on drug-sniffing dog alerts were legal even if what was found wasn’t related to what the dog detected. (Though in more recent cases, the Supreme Court has ruled that home searches, specifically, based on a drug-sniffing dog are illegal.)
“The U.S. Supreme Court has given police ‘probable cause’ to search your vehicle if a police dog detects drugs, typically by sitting, digging or barking,” explains the Las Vegas Review-Journal in an extensive article about drug-sniffing dog flaws. “That is an extraordinary power — officers working without dogs need ‘a reasonable belief that a person has committed a crime’ for such searches. Mere suspicion is not enough, and criminal cases resulting from searches that don’t meet the ‘probable cause’ standard can be, and are, tossed out in court.”
Drug-sniffing dog alerts consequently give police practically carte blanche to search whatever they want. And note that it’s been reported that some 90 percent of U.S. currency carries traces of cocaine. In numerous cases, for example, people traveling with large amounts of cash have had it seized by virtue of its being “contaminated” with drugs.
Some people have also criticized the fact that the storage-sniffing dogs are being trained and rewarded with food. “This is how he eats every day,” according to the dog’s trainer. But other dog experts say that training a dog with food is a bad idea. “Offering a sniffer dog food in exchange for a ‘find’ opens the way for an abuse of the system — if it’s hungry enough it will take food from anybody, not just its handler and therefore defeats the object of the search,” Maggie Gwynne, of Sniffer Dogs UK & International, told the BBC.
(On the other hand, one wonders what one of these storage-sniffing dogs would do in a room full of dog biscuits.)
There have also been cases where dogs’ “detection” of drugs appears to be based primarily on the reactions of their handlers, a sort of drug-sniffing Clever Hans. The police want to find drugs in your car? Son of a gun, the dog detects something — simply because the handler believes that it’s there. That gives police probable cause to search. And chances are, something, somewhere in your car, has been touched by an illegal drug, sometime.
Now, how many of us don’t have some sort of data storage device lying around somewhere?
So, now the cops have “found” your data storage, which they declare was “hidden,” and thus suspicious, which gives them the right to search it, and who knows what they might find during that fishing expedition?
Well, you say, not a problem, I’ll encrypt it. Except that, as we’ve seen — typically also under the aegis of “protecting the children” — people are being forced to reveal their encryption keys. The Massachusetts Supreme Judicial Court ruled on another of these cases just last month, saying that because the suspect had acknowledged that the computer was his, that he had encrypted it, and that he held the key, he had given up his Fifth Amendment protection against self-incrimination.
What could be worse is if — after the storage-sniffing dog finds the microSD card under the dresser that the cat knocked off last month and the police decide that means you were hiding it — it isn’t encrypted but the police decide that it is and you’re lying. In some countries, particularly the U.K., people have gone to jail for refusing to reveal an encryption key. And as we’ve suggested before, it’s going to be an interesting legal case when someone goes to jail for refusing to reveal a key they don’t have.