Here we go again. Is an encryption key more like a physical key or the combination to a safe?
Courts have been going back and forth on the issue for several years now and, most recently, have decided that a phone password is more like the combination to a safe.
It matters because something that is the expression of one’s mind, like the combination to a safe, is protected under your Fifth Amendment rights not to incriminate yourself. A physical key, something you possess, is something you can be forced to produce.
This all came up when the Securities and Exchange Commission (SEC) began investigating Bonan and Nan Huang (who are not related to each other) for insider trading, writes Orin Kerr in the Washington Post.
“The two worked at the credit card company Capital One as data analysts,” Kerr writes. “According to the complaint, the two allegedly used their jobs as data analysts to figure out sales trends at major U.S. companies and to trade stocks in those companies ahead of announced company earnings. According to the SEC, they turned a $150,000 investment into $2.8 million.
“Capital One let its employees use company-owned smartphones for work. Every employee picked his own passcode, and for security reasons did not share the passcode with Capital One. When Capital One fired the defendants, the defendants returned their phones. Later, as part of the investigation, Capital One turned over the phones to the SEC. The SEC now wants to access the phones because it believes evidence of insider trading is stored inside them.”
But the SEC has been thwarted by Judge Mark Kearney, who ruled that the passwords were indeed protected by the Fifth Amendment. Exactly why is a very long how-many-angels-dance-on-the-head-of-a-pin discussion that lawyers love to have. But it boils down to whether the SEC actually wants the password itself or access to the documents. And since it wants access to the documents, the proper way to proceed is to have the defendants enter the password themselves, providing access to the documents without revealing the password, Kearney writes.
And for people debating between company-provided cellphones and BYOD, that angle is involved, too: Is a password to a company-provided cell phone considered a corporate record? If it were, then the Fifth Amendment wouldn’t apply, but Kearney doesn’t believe it is.
Indeed, because Capital One specifically told the analysts to keep their passwords secret and not write them down, that made them products of the mind and not corporate records, Kearney writes.
As with other cases of this ilk, it’s likely that, eventually, the Supreme Court is going to need to rule on the issue.
To add an additional wrinkle, recall that a suspect can be forced to give up a fingerprint, if that’s what’s being used to secure the phone. That’s because a fingerprint is something you have, similar to the way that you can be compelled to give up a blood sample to test for alcohol. (Consequently, what you’d want in an ideal world is to protect a phone through both encryption and a fingerprint, but not all phones can do that.)
So all that business about not writing down your password? Turns out it was more right than you knew.
Those of us of a certain age have fond memories of hunching by the radio with a mike and our portable cassette recorder, waiting for them to play our favorite song and please-God-don’t-let-the-DJ-talk-over-the-intro-this-time.
Guess what: The cassette is coming back.
In the same way that some retro purists have brought back vinyl, some artists, such as Arcade Fire and Transviolet, are actually still issuing music on cassette tape. Cassettes began outselling records in 1983 and remained the most popular medium until 1991, when the CD overtook them, writes Zach Sokol in Motherboard.
As it turns out, there’s at least one factory – National Audio Co., in Springfield, Mo. — that still manufactures cassette tapes, and it says business is better than ever: Last year was the best year it’s had since it opened in 1969, writes Jeniece Pettitt in Bloomberg. “The profitable company produced more than 10 million tapes in 2014 and sales are up 20 percent this year,” she writes.
Why? It’s hard to say. In the case of vinyl, there is an argument to be made that it sounds “better,” though any quality improvement might be wasted on a generation that grew up listening to MP3s. And as with vinyl, there are those who claim the analog sound of tape is preferable to digital recordings. Also, in a day and age when so much of our content is digital, some people really like having something tangible, Pettitt writes.
“Certain kinds of music sound good on cassette,” wrote Nick Sylvester in Pitchfork on the eve of Cassette Store Day in 2013. “The public perception is that tape is ‘warm’ and ’fat,’ but not all tape is equal, and recording to 2-inch tape on an old Studer is very different from playing a cassette in a car stereo. In the cassette heyday, people weren’t exactly seeking out cassette releases for their sonic character. Mastering engineers did everything they possibly could to ‘beat’ the cassette, to make the music sound pretty damn close to the original recording despite the ways tape stock can roll off the highs, stuff the low-mids, and hiss above 1khz.”
“Nostalgia’s a potent drug, and the music industry has changed abruptly enough that even twenty-somethings like me feel wistful for the lost ‘90s,” Zach Schoenfeld wrote in Newsweek during the first Cassette Store Day. “Though I’m not yet 30, I can recall my very first cassette (Red Hot Chili Peppers’ much-maligned One Hot Minute) far more easily than I can name my first CD or MP3 or Spotify stream.”
As we’ve mentioned, part of the problem with recording data on old media is finding a way to read it later, and indeed that is a problem for some people who didn’t save their Walkmans. (Walkmen?) And there’s the occasional story about Kids These Days who mistake the cassette player in an older car for an iPhone dock, with hilarious results.
Count your blessings, though: So far, nobody’s talking about bringing back the 8-track.
We have always been at war with Eastasia.
In an era where people have to resort to smartphones and the Internet just to look up phone numbers, people are warning that the next wave of hacking might not take information, but add or change it instead.
“Most of the public discussion regarding cyber threats has focused on the confidentiality and availability of information; cyber espionage undermines confidentiality, whereas denial of service operations and data deletion attacks undermine availability,” wrote Director of National Intelligence James Clapper, in testimony presented to the House Subcommittee on Intelligence earlier this month. “In the future, however, we might also see more cyber operations that will change or manipulate electronic information in order to compromise its integrity (i.e., accuracy and reliability) instead of deleting it or disrupting access to it. Decision making by senior government officials (civilian and military), corporate executives, investors, or others will be impaired if they cannot trust the information they are receiving.”
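The integrity attacks Clapper describes are, in principle, detectable: if records carry a cryptographic authentication tag, a silent modification no longer verifies. A minimal sketch using Python’s standard-library `hmac` module (the key and record format here are invented for illustration; real systems need proper key management):

```python
import hashlib
import hmac

SECRET_KEY = b"example-key"  # hypothetical key; in practice, stored and rotated securely

def sign(record: bytes) -> bytes:
    # Compute an HMAC-SHA256 authentication tag over the record.
    return hmac.new(SECRET_KEY, record, hashlib.sha256).digest()

def verify(record: bytes, tag: bytes) -> bool:
    # compare_digest does a constant-time comparison to avoid timing leaks.
    return hmac.compare_digest(sign(record), tag)

record = b"dosage=50mg;patient=12345"
tag = sign(record)

assert verify(record, tag)                              # untouched record passes
assert not verify(b"dosage=500mg;patient=12345", tag)   # tampered record fails
```

The catch, of course, is that this only helps if the attacker can’t also recompute the tags — which is why the key, not the hash algorithm, is what has to stay out of reach.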
In particular, hackers or terrorists could wreak havoc by changing data about infrastructure, postulates Patrick Tucker, technology editor for Defense One. Remember that as far back as Die Hard 2, the bad guys were crashing planes by feeding them incorrect data on their actual altitude.
Clapper isn’t the only one to suggest this. For example, when the Office of Personnel Management revealed earlier this year that it had been hacked, some speculated that more could be involved than simply taking information. “For those of us who wear tinfoil hats – what if records were not only taken, but some were added as well?” writes Steve Ragan in CSO Online. “Would the OPM be able to tell?”
As it turns out, Clapper has actually been saying this for some time; articles quoting him talking about hackers who could “change or manipulate” information have been published since at least February, when he testified to the Senate Armed Services Committee. “[Clapper] described future attacks which will change or manipulate [there’s that phrase again] electronic information in order to compromise its integrity,” Business Korea wrote in March. “In the future, hackers may launch more clandestine cyber espionage programs that manipulate data so victims lose credibility.”
What might it have done, for example, if at some point someone had added data to government records to make it appear that President Obama actually had been born in Kenya?
People have always added fake people to rosters to get additional paychecks and other benefits – remember M*A*S*H’s “Captain Tuttle”? – but doing it through the computer can make it a lot easier. “A doctor pulls up your electronic medical records to discover that they have been changed and you have been receiving the wrong dosage of a lifesaving medicine,” writes Rep. John Ratcliffe (R-Texas), who chairs the Homeland Security Subcommittee on Cybersecurity, Infrastructure Protection and Security Technologies, and sits on the Judiciary Committee, in The Hill. “Now imagine this happening at every hospital in the United States.”
Think the whole notion of messing things up by changing information is farfetched? How many “Han shot first” arguments have you seen? And that’s with film that millions of people have seen – not to mention someone who actually admits that they changed something.
Of course, if you really want to drive yourself crazy, you can remind yourself that this is the age of Edward Snowden. Maybe Clapper is warning us to beware of hackers changing data because he wants us to be suspicious of our data. Maybe he’s going to be changing the data – and is laying the groundwork now to blame it on hackers.
If you need me, I’ll be hiding under the bed.
“Joe, I know how we can make a mint. We just get a database full of personal information for a bunch of really gullible guys with something to hide, and then we can sell it!”
“That’s great, Ron, but how do we get the database of personal information in the first place?”
Which is what gave me the idea for a database that will actually be the most elaborately designed honeypot in history.
Okay, work with me here.
First of all, we find a bunch of guys who admit to strangers that they’re looking to cheat on their wives. They even provide their contact information.
Heck, they’re even willing to pay for the privilege!
Meanwhile, we create a bunch of fake female profiles. And not only do these guys not realize the women are fake, but they have conversations with them! We just send out messages periodically from these fake women so the guys think there are real women in this database interested in cheating with them.
Who knows. Maybe we can even convince the guys to pay more to talk to these fake women.
Oh, sure, some guys are going to figure it out, or feel guilty, and drop out. But the ones who keep on – we know we’ve got ‘em.
And the ones who do drop out, and feel guilty, and want us to delete all their information so their wives never find out? We tell them they have to pay more to delete their information! And most of them pay it!
And then we don’t delete it all!
After all, data is valuable.
Then, when the database gets big enough, we tell them it’s been hacked! And all their information has been stolen! And to make that more plausible, we’ll use a really simple encryption technique that would make it easy for someone to hack it.
(Not that we need to worry much about that. They’ll end up picking really common, easy passwords.)
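Weak hashing plus common passwords is exactly what makes such a database trivially crackable: unsalted fast hashes can be reversed with a simple lookup table. A toy illustration (the usernames and passwords here are invented; real sites should use salted, deliberately slow hashes such as bcrypt):

```python
import hashlib

def md5_hex(password: str) -> str:
    # Fast, unsalted MD5 -- exactly what you should NOT use for passwords.
    return hashlib.md5(password.encode()).hexdigest()

# Hypothetical leaked table: username -> unsalted MD5 of the password.
leaked = {
    "user1": md5_hex("123456"),
    "user2": md5_hex("password"),
}

# Precompute hashes of the most common passwords (a tiny "rainbow table").
common = ["123456", "password", "qwerty", "abc123"]
lookup = {md5_hex(pw): pw for pw in common}

# Any leaked hash that appears in the lookup table is instantly cracked.
cracked = {user: lookup[h] for user, h in leaked.items() if h in lookup}
print(cracked)  # {'user1': '123456', 'user2': 'password'}
```

With a salt and a slow hash, the same attack would require a separate brute-force run per user instead of one shared lookup.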
That’s when we can sell the data. We can market it as “Gullible guys with lots of disposable income who won’t want to go to the police.”
We can even sell the database to blackmailers. Sure, none of these guys actually cheated – but how would they convince their wives of that? Just the fact that they signed up to be in a database of cheaters is damning enough, isn’t it?
And yes, some of the guys might be kind of upset. There might be some collateral damage. We have to be prepared for that.
But think of the money we’d make! Plus, we can do it again! All the publicity will probably cause even more gullible guys to join!
Still. Maybe we don’t want to do it for real. Maybe we should just write a movie script about it.
Naah. Nobody’d believe it.
First, lightning struck the utility grid used by one of Google’s three data centers in St. Ghislain, Belgium, a small town about 50 miles southwest of Brussels, knocking the center offline and losing some data.
You know how they say lightning never strikes twice? Well, it hit the grid that Google uses four times.
Naturally, there were backup systems, but they failed too, writes Yevgeniy Sverdlik in Data Center Knowledge. “Besides failover systems that switch to auxiliary power when primary power source goes offline, servers in Google data centers have on-board batteries for extra backup,” he writes. “But some of the servers failed anyway because of ‘extended or repeated battery drain,’ according to the incident report.”
The storage in question was part of the Google Compute Engine (GCE) disks, which allow customers to run cloud-based virtual machines, according to Mike Brown in the International Business Times. “It’s not the first time GCE has had issues,” Brown writes. “In February, GCE experienced a global outage that lasted for nearly two hours affecting businesses that depend on GCE for their day-to-day operations. GCE is seen as a competitor to Amazon AWS and Microsoft Azure for dominance of the cloud, but instances like these will shake consumer confidence in the GCE brand as they look for the most stable cloud services possible.”
Brown noted that “To be sure, AWS and Azure have also had their share of outages,” such as Virginia thunderstorms in 2012 that took out major Internet services such as Netflix, Pinterest, and Instagram.
Altogether, Google servers had problems for about five days, with a resultant loss of 0.000001 percent of data, Sverdlik writes. (How many bytes that is, Google didn’t say.) It’s not known which clients were affected, or what type of data was lost, according to the BBC.
“Having worked in data recovery, that’s a remarkable achievement and a definite feather in Google’s bow,” commented one reader.
Google staff apparently had to tweak the servers some to retrieve data as well, wrote the company in its incident report. “In almost all cases the data was successfully committed to stable storage, although manual intervention was required in order to restore the systems to their normal serving state.” The company also pointed out that users needed to make additional copies of data in case of such incidents. “GCE instances and persistent disks within a zone exist in a single Google data center and are therefore unavoidably vulnerable to data center-scale disasters,” Google wrote, recommending GCE snapshots and Google Cloud Storage.
Next, a failed chilled water pipe caused the air conditioning system to fail in a CenturyLink data center in Weehawken, N.J. This data center provides facilities for a number of companies, including education company Pearson, Thomson Reuters, and trading companies BATS Global Markets and Investment Technology Group.
As a precaution, CenturyLink reportedly shut down its systems, meaning that the companies went offline as well. Incidentally, this was happening at the same time the stock market was tanking last week.
And this is just in August.
By the way, the hurricane season typically enters its heaviest phase on September 1. We’re already up through Fred.
“OMG, did you hear? You can now mail a hard drive in to Google to store it on the cloud! Squee!”
Not exactly. Hold your horses, people.
Yes, it’s true that being able to mail a hard disk device for import into a cloud storage service helps a lot when you have a lot of data. Nobody’s arguing with that. As Google points out, trying to upload just a terabyte can take more than 100 days. And as we all know from the station-wagon-full-of-backup-tapes calculation, sometimes that’s just the fastest way.
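Google’s 100-plus-days figure is just bandwidth arithmetic. A quick back-of-the-envelope helper (assuming decimal terabytes and a sustained link speed, which real connections rarely deliver) shows that 1 TB over roughly a 1 Mbps effective link does indeed take about three months:

```python
def upload_days(size_tb: float, link_mbps: float) -> float:
    """Days to move size_tb terabytes over a sustained link_mbps connection."""
    bits = size_tb * 1e12 * 8          # decimal terabytes -> bits
    seconds = bits / (link_mbps * 1e6) # megabits/sec -> bits/sec
    return seconds / 86_400            # seconds per day

print(round(upload_days(1, 1)))    # 1 TB at a sustained 1 Mbps: ~93 days
print(round(upload_days(1, 100)))  # same terabyte at 100 Mbps: ~1 day
```

Which is the whole point of the station-wagon calculation: past a certain size, a courier beats the wire.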
But this isn’t really a new thing. Sorry.
So, what actually is new here? Well, there’s the name: Offline Media Import/Export.
- Instead of mailing in your device to Google directly, you now do it through third-party service providers. So far, that list is exactly one vendor long: Iron Mountain, which performs the service in North America. Providers for EMEA and APAC are reportedly coming. You can, however, also do it through any other provider you like, Google writes. Meanwhile, it looks like the direct-to-Google service, which was only experimental anyway, might be kaput.
- Previously, you could send in data only on a hard drive, which Google would upload for you for $80 apiece. Now you can also send in data on tapes or thumb drives. Google doesn’t specify what kinds of devices or formats it supports; presumably that’s up to the third-party vendor. (Can you send in that box of 8” floppy disks you found in the closet? Who knows?) Incidentally, Amazon also supports storage devices other than hard drives; Microsoft supports only hard drives, and only up to 6TB.
What happens to the storage device afterwards? That’s up to you, writes Ben Chong, product manager, in a blog post describing the service. “Once data upload is complete, Iron Mountain can send the hard drive back to you, store it within their vault or destroy it.”
How much will it cost? Google didn’t say, because you now pay the third-party provider, not Google. “Neither Google nor Iron Mountain call out pricing for the service, but it’s likely competitive with Amazon’s rates of $80 per storage device and then $2.49 per hour it actually takes to upload to the cloud,” writes Matt Weinberger in Business Insider. Google’s previous service cost an $80 flat fee with no per-hour charge, plus Google supported drives of up to 400TB; Microsoft charges $80 per disk drive.
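Using the rates quoted above, comparing the per-hour model with the old flat-fee model is simple arithmetic. These helpers just encode the article’s published numbers; actual Iron Mountain pricing is unknown:

```python
def amazon_cost(devices: int, upload_hours: float) -> float:
    # Amazon's cited rates: $80 per storage device plus $2.49 per hour of upload time.
    return devices * 80 + upload_hours * 2.49

def google_legacy_cost(devices: int) -> float:
    # Google's discontinued direct service: $80 flat per drive, no hourly charge.
    return devices * 80.0

# Example: one drive that takes 20 hours to load into the cloud.
print(round(amazon_cost(1, 20), 2))  # 129.8
print(google_legacy_cost(1))         # 80.0
```

So a per-hour rate mostly matters for large, slow-to-ingest devices; for a quick load, the two models land within a few dollars of each other.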
Google’s previous service not only supported encryption, it required it. The new service definitely supports encryption, but it isn’t clear whether it’s required. The service’s product page doesn’t even mention encryption, though Chong’s blog post does. “Save and encrypt your data to the media of your choice (hard drives, tapes, etc.) and ship them to the third party service provider through your preferred courier service,” he writes. “The encrypted data will be uploaded to Google Cloud Storage using high speed infrastructure.”
So there you go. It’s always nice to have a new option, but it’s not really the Great! New! Thing! that some — including Google, apparently — would have you believe.
Now in its 73rd official year, science fiction fandom is grappling with a very present-day problem: How to archive its history in a way that future generations can reference.
“Archiving for the Future,” a panel session held at this week’s World Science Fiction Convention (Worldcon) in Spokane, Wash., included several science fiction historians as well as archiving professionals who discussed the aging of the fandom population, the lack of a clear repository for the history, and the fact that there’s so much material that no single site could actually serve as such a repository.
In the same way that some comic books became scarce because everyone’s moms threw them out, some irreplaceable science fiction fandom material, such as fanzines, has been lost because it was considered ephemera and discarded, panelists lamented. The problem is, many well-known science fiction authors started out as fans and their early work was included in those fanzines. “You have to save everything because you don’t know who that person will become,” one noted. “Some of those people became Harlan Ellison.”
Moreover, material on paper is vulnerable to a variety of ills ranging from moisture to fire. When you get access to material, share it, panelists were told: You never know when your house is going to burn down or be hit by a hurricane.
Even now, in an era when material is born digital, some of it is considered ephemeral but is actually historically significant, noted participant Leslie Johnston, whose day job is Director of Digital Preservation at the National Archives.
For example, when the Library of Congress announced in 2010 that it would archive Twitter – a project still under some criticism — some people didn’t understand why they’d want to bother saving details of what people had for breakfast, she said. But Twitter has become the first place a number of historical events and reactions to them were documented, such as the death of Osama bin Laden. “Twitter is today’s diaries,” she said.
And when such material does manage to make it to a collection rather than being thrown out, it’s often missing much of the context that gives it value, panelists said, citing cases of getting hundreds of photographs “from Worldcon” but with none of the participants, or even which Worldcon it was, identified. “It always comes down to the metadata,” said panelist Pierre Pettinger Jr., whose particular specialty is costuming. People have had to resort to such techniques as identifying venues based on the woodwork and carpeting shown in the pictures, panelists reported.
It was suggested, though, that crowdsourcing could help with some of that identification. Crowdsourcing has been used by a variety of libraries, from the New York Public Library to the British Library, to help identify and verify material ranging from maps to menus.
While some people may think that the preservation problem is solved once material is scanned or otherwise digitized, that’s no panacea, either, Johnston said. “Digitization is not preservation,” she said. “It’s creating a whole new set of materials that need to be preserved.”
What’s the issue? First of all, some of the digitized formats themselves are vulnerable. “CDs make me crazy,” Johnston said, because of their fragility, and thumb drives aren’t much better, relating the case of one that went through the wash.
Second, as time goes on, the hardware and software required to read material in particular formats can become hard to find, no matter how popular it once was, Johnston said. For example, the industry stopped manufacturing slide projectors three years ago, which will make it more difficult to look at slides going forward. She praised organizations such as the National Audio-Visual Conservation Center in Culpeper, Va., which holds a large archive of such hardware and software.
This loss of data isn’t just with old files, Johnston cautioned, noting that even some more recent material, which used early versions of cutting-edge storage formats, is now inaccessible.
Another issue, particularly with photographs, is that of the rights, panelists reported. Pettinger noted that he often posts images online that are of a lower quality than others he has because of concerns that people will appropriate them.
Similarly, panelists discussed the conflicting rights among the people who owned a picture vs. the people who might appear in it. Few of those subjects ever signed model releases, said fan history specialist Joe Siclari, who added that he always takes down images on request from the people in them. Fandom needs a better education in rights and how those rights can be transferred and archived, panelists said.
What’s needed now is for members of fandom to take responsibility for identifying and organizing the material they have, while they’re still around to do it, panelists said. In addition, fandom should set up a collaborative collection, where it’s accepted that a repository for one kind of material, such as costuming, will be located at one institution, with other institutions acting as repositories for other kinds of material.
Finally, that information also needs to be made available to fandom by creating repository directories, because there’s so much material that no one institution can take it all, Johnston said. That way, aging fans, and their descendants, know the value of the material and the process to follow for donating it.
In addition, there needs to be a canonical list of the hardware and software available to read the different file formats, and where to find them. That way, archivists will be able to find out how to retrieve past material, panelists said.
Ideally, fans of the future would be able to see material in the same way that the writer originally did, Johnston said, citing the example of professor Salman Rushdie’s archive at Emory University.
Meanwhile, people are on their own. “How do we find the place that wants the stuff we have before we croak?” summarized one session attendee.
As you may recall, a year or so ago, activist investor Elliott Management Corp. took a large position in EMC stock, with the goal of “releasing shareholder value” – in other words, selling or reorganizing some of the pieces of EMC and VMware to make more money for stockholders. EMC CEO Joe Tucci has largely been resisting that effort, but a deadline is coming up that may mean something will happen – ranging from EMC buying VMware to VMware buying EMC.
How large is large? Reportedly it was more than $1 billion, which would amount to about 2 percent of its value, and also make it EMC’s fifth-largest shareholder.
So if this is something that’s been going on for a year, why the pressure now? It’s because in January, Elliott and EMC made a “standstill agreement,” which basically means that Elliott would not publicly pressure the company into divesting its holdings in VMware, in return for getting two people on the board of directors, writes Martin Blanc in Bidness Etc. However, that agreement is set to expire in September, writes Anne Shields in Market Realist.
Moreover, Tucci’s on-again, off-again retirement is on again, Shields writes. “EMC’s CEO, Joe Tucci, is also under tremendous pressure to get EMC on the right track before he retires,” she notes. “David Goulden, CEO of EMC’s information infrastructure unit, as well as Patrick Gelsinger, VMware’s present CEO, are seen as equal contenders for EMC’s future CEO position.”
It might sound weird for the subsidiary VMware to buy out the parent EMC, but it makes sense because VMware stock is worth more than EMC stock, writes Blanc. “The move would likely be backed by Elliot Management as it will unlock more value for investors,” he writes. “Secondly, VMware already makes up for 73% of EMC’s entire market capitalization, so it makes more financial sense.”
Also, in some ways, VMware is the stronger company, with EMC facing pressure from flash drive manufacturers, commodity storage manufacturers, and other sources. “EMC would emerge weaker than before,” writes Arik Hesseldahl in Re/Code, which started this whole speculation. “An EMC-minus-VMware scenario leaves the parent with a value of about $11 a share, or less than half what it’s trading for now.”
A VMware acquisition would work like this, according to Hesseldahl: “VMware would issue somewhere between $50 billion and $55 billion worth of new shares,” he writes. “A portion of those shares — about $30 billion — would be used to cancel EMC’s 80 percent stake in VMware, which currently has a market value of $38.5 billion. The remaining new VMware shares would be issued to current EMC shareholders, who will also get some cash generated from the issuance of about $10 billion in new debt.”
Putting VMware in charge would also make the merged company more forward-looking. “Inverting the company to make VMware the pinnacle would send a message that says storage hardware is not the future and virtualization/cloud (whatever that means) is where the world is headed,” agrees analyst Chris Evans. It would probably also play better with the companies’ various partners, he adds.
Ultimately, some sort of acquisition between the two companies wouldn’t have much effect in the long run on how they operate, writes Chris Mellor in The Register UK. “Not much would have changed fundamentally, on the good ship EMC, apart from the deck chair arrangement and signage,” he notes.
One big change? Integrating the two companies could reduce their operational expenses by almost $1 billion, writes Shields. And indeed, the most recent EMC earnings call hinted at such a possibility, with the company promising $850 million in savings by the end of 2016, though it didn’t say how.
That said, the stock market wasn’t necessarily thrilled about the potential merger news, particularly from the VMware side, writes Shields. “EMC shares rose more than 3%, whereas VMware shares fell more than 5% on August 5, 2015,” she notes.
Companies that collect large amounts of user data, such as Facebook, Google, and Twitter, may have a tougher time fighting government requests for that information after a recent court case.
New York prosecutors had filed 381 warrants in 2013 to get photos and private information from Facebook on hundreds of public employees suspected of Social Security fraud. A Manhattan-based state appeals court has unanimously ruled that the warrants could be challenged only by defendants in criminal cases moving to suppress the evidence they produced, according to Reuters.
This is the third time Facebook has lost on this ruling, and it had already provided the requested data to prosecutors.
Reportedly, “Facebook pages showed public employees who claimed to be disabled riding jet skis, playing golf and participating in martial arts events,” Reuters writes. Using the Facebook data, the government has recovered nearly $25 million from those people.
It’s not the first time that people have been fired or lost insurance due to pictures on Facebook. What was new in this case was the government using warrants to gather information about the people from Facebook, some of whom were September 11 first responders. It also used private messages, not just information available publicly.
This appeal arose from the largest set of search warrants that Facebook had ever received, according to the brief on the case. It noted that of the 381 warrants, only 62 of the targeted Facebook users were charged with any crime. (Eventually, 134 users had charges filed.)
“The warrants also contained broad gag provisions barring Facebook from informing its users what the Government was forcing it to do,” the brief continues. “The Government’s bulk warrants, which demand ‘all’ communications and information in 24 broad categories from the 381 targeted accounts, are the digital equivalent of seizing everything in someone’s home. Except here, it is not a single home but an entire neighborhood of nearly 400 homes. The vast scope of the Government’s search and seizure here would be unthinkable in the physical world.”
Facebook’s objections were primarily to the fishing expedition aspect of the warrants, noting that only a fraction of the information requested had anything to do with proving Social Security fraud, and that there was no provision for the government to return the data to the users.
Throwing a sop, the court agreed that Facebook had a point. “Our holding today does not mean that we do not appreciate Facebook’s concerns about the scope of the bulk warrants issued here or about the district attorney’s alleged right to indefinitely retain the seized accounts of the uncharged Facebook users,” the five-judge panel wrote, according to NBC.
Facebook also pointed out that as the holder of the data, it had to do all the work to collect it for the police, compared with a typical search warrant where the police are doing the searching.
Ultimately, though, that wasn’t enough. “If the cops show up at your door with a warrant to search your house, you have to let them search,” writes Orin Kerr in the Volokh Conspiracy legal blog. “You can’t stop them if you have legal concerns about the warrant. And if a target who is handed a warrant can’t bring a pre-enforcement challenge, then why should Facebook have greater rights to bring such a challenge on behalf of the targets, at least absent legislation giving them that right?”
While this particular action happened to target Facebook, there were amici curiae briefs from companies including Google, Microsoft, Pinterest, Twitter, and Yelp (as well as the New York Civil Liberties Union), because it could have just as easily been them. (Similarly, Microsoft is carrying the water for a case concerning the right of the U.S. government to seize data stored offshore, with Apple, AT&T, Cisco, and Verizon backing it up.) Tumblr, Foursquare, Kickstarter, and Meetup also filed a brief, arguing that “the lower court’s decision was especially troubling for startup online platforms like themselves” because smaller companies often lacked the financial resources to challenge warrants.
Part of the problem, the companies acknowledged, is that their business models are predicated on people being willing to share information about themselves online, which is hard to do when users feel the government could come in and snap up anything they post without the company even being able to warn them. Or, in lawyer talk, “Here that freeze also threatens the willingness of users to participate in online platforms — fora for speech of all kinds — that small and mid-size companies offer, for fear that their private information will be obtained improperly and without their knowledge,” the brief said.
Part of the problem, too, is that at least some of these people actually did appear to be committing fraud. In the same way that fighting for people's right to encrypt their data and withhold the key from the government can mean ending up defending child pornographers, it can be more challenging to support legal principles when crooks go free in the process.
Facebook is reportedly considering whether to appeal the decision.
Legislation that had allowed law enforcement and intelligence agencies in the U.K. to force communications providers to store records of their customers’ activities has been shot down by the country’s highest court, but the government has nine months, until March 2016, to rewrite the law to make it more palatable.
Plus, the U.K. has already put forward another bill that could be even worse.
The Data Retention and Investigatory Powers Act (DRIPA) had been challenged by Members of Parliament David Davis and Tom Watson on the grounds that it lacked sufficient privacy and data-protection safeguards, Politico writes. “This is the first time a British national court has struck down primary legislation in the country, and the first time that a member of parliament has brought a successful judicial review against the government,” the site adds.
What was wrong with the law? “The MPs complained that use of communications data was not limited to cases involving serious crime, that individual notices of data retention were kept secret, and that no provision was made for those under obligation of professional confidentiality, in particular lawyers and journalists,” writes the Guardian. “Nor, they argued, were there adequate safeguards against communications data leaving the EU.”
Critics also said it had been rushed through Parliament, which is what led to the unusual judicial challenge, the BBC writes. “Normally it would be scrutinized in Parliament, but the two MPs say that because the Data Retention and Investigatory Powers Act was rushed through in days, there was no time for proper parliamentary scrutiny and that this judicial review was their only option.” Legislation in the UK usually takes months to pass, but the government claimed it needed the bill right away to protect British citizens against terrorism.
The law governed the gathering of information about whom suspects contact by telephone or email, according to the BBC, and allowed the data to be stored for up to a year. “This does not include content but does include the fact that calls and emails are made, by whom, to whom and how often,” the BBC writes. “Some half a million requests are made each year for this data.”
As with similar laws in the U.S., DRIPA supporters said the law was important to save lives in cases such as kidnapping and potential suicides.
The U.K. bill followed a similar one for the European Union as a whole, which was invalidated by the Court of Justice of the European Union in April 2014. “The court struck down the directive largely because of poor access controls, although it was also concerned that citizens were not being informed about who was holding their data, and that some of the data might unlawfully leave the EU,” Politico explains. The MPs also drew on a number of EU laws in their arguments against the law.
DRIPA wasn’t just an issue for residents of the U.K. The law also had a clause making it clear that foreign firms holding data on U.K. citizens could be served with a warrant to hand over information. Anyone providing a “communication service” to customers in the U.K., regardless of where that service is provided from, needed to comply, writes Lexology. “This was previously considered to be a grey area, and this clarification has significant ramifications for those providing communication services in the U.K. from overseas,” Lexology adds.
Exactly how the law could be rewritten is now being discussed. It could include more time to allow proper scrutiny of the proposed measures, writes the Media Policy Project blog of the London School of Economics.
The UK government has already said it plans to appeal the ruling. “I do think there is a risk here of giving succour to the paranoid liberal bourgeoisie whose peculiar fears are placed ahead of the interests of the people,” Security Minister John Hayes reportedly told BBC Radio 4’s The World at One.
But Parliament is already slated to see next month another bill that could be even worse: the Investigatory Powers Bill, writes the Huffington Post. “Revealed during the Queen’s Speech as a replacement for the emergency bill, the Investigatory Powers Bill has potentially far greater reach than even DRIPA with some of the preliminary wording suggesting that if fully approved it would allow the Government powers to ban encrypted communications services such as WhatsApp, iMessage and Facebook Messenger,” the Post writes.