If you’re a politics fan and have some time on your hands, there are some new rabbit holes to go down that give you a great opportunity to compare two Presidents.
First, the Internet Archive has set up an archive of more than 500 hours of Trump footage dating back almost ten years. Think that what he’s saying doesn’t mesh with what he’s said before? Afraid he’s going to pull a 1984 and say we’ve always been at war with Eastasia? Well, the Internet Archive has saved it all.
“The Trump Archive launches today with 700+ televised speeches, interviews, debates, and other news broadcasts related to President-elect Donald Trump, created using the Internet Archive’s TV News Archive,” writes Nancy Watzman in the Internet Archive blog. “A work in progress, the growing collection now includes more than 520 hours of Trump video. The earliest excerpt dates from December 2009, and the collection continues through the present. It includes more than 500 video statements fact checked by FactCheck.org, PolitiFact, and The Washington Post’s Fact Checker covering such controversial topics as immigration, Trump’s tax returns, Hillary Clinton’s emails, and health care.”
What’s more, it’s all freely available. “Reporters, researchers, Wikipedians, and the general public are invited to quote, compare and contrast televised statements made by Trump,” Watzman writes, offering suggestions such as using clips in articles and videos and creating supercuts on topics like Trump’s perspectives on the US press. Moreover, she asked technical people to “help us enhance search and discovery by collaborating in experiments to apply artificial intelligence-driven facial recognition, voice identification, and other video content analysis approaches.”
Doesn’t that sound fun.
This is just a start, Watzman notes. “We’ll explore the idea of creating curated collections for Trump’s nominees to head federal agencies; members of Congress of both parties (for example, perhaps the Senate and House majority and minority leadership); Supreme Court nominees, and so on.”
The whole effort is similar to Politwoops, an effort that started in 2012 to keep track of politicians’ deleted Tweets – including those of President-Elect Trump.
At the same time, much of the content created under President Barack Obama’s administration has also been archived. “The Obama White House website – which includes press articles, blog posts, videos, and photos – will be available at ObamaWhiteHouse.gov, a site maintained by the National Archives and Records Administration (NARA), beginning on January 20, 2017,” according to a post on the aforementioned site. “If you are looking for a post or page on the Obama administration’s WhiteHouse.gov from 2009 through 2017, you can find it by changing the URL to ObamaWhiteHouse.gov.”
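The URL change NARA describes is mechanical enough to script. Here’s a minimal sketch; the helper name and the handling of the `www.` prefix are my own illustration, since the post only documents the hostname swap:

```python
from urllib.parse import urlsplit, urlunsplit

def obama_archive_url(url: str) -> str:
    """Rewrite a 2009-2017 whitehouse.gov URL to the archived
    obamawhitehouse.gov host, keeping the rest of the URL intact."""
    parts = urlsplit(url)
    if parts.netloc.lower() not in ("whitehouse.gov", "www.whitehouse.gov"):
        return url  # not a White House URL; leave it alone
    return urlunsplit((parts.scheme, "obamawhitehouse.gov", parts.path,
                       parts.query, parts.fragment))

print(obama_archive_url("https://www.whitehouse.gov/blog/some-2016-post"))
# -> https://obamawhitehouse.gov/blog/some-2016-post
```
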
Given that President Trump’s administration is already removing content from the White House web pages, this is likely to be useful going forward.
That link also lists all the social media archives from the Obama administration, as well as new social media links for many former White House officials. “From tweets to snaps, all of the material we’ve published online will be preserved with NARA just as previous administrations have done with records ranging from handwritten notes to faxes to emails,” wrote Kori Schulman, Special Assistant to the President and Deputy Chief Digital Officer, in October, explaining how the digital transition would work. “Second, wherever possible, we are working to ensure these materials continue to be accessible on the platforms where they were created, allowing for real time access to the content we’ve developed.”
While this is a laudable goal, they kind of had to, writes Lisa Vaas in Naked Security. “The White House didn’t have much choice, given that accessibility of federal government communications is required by the Freedom of Information Act (FOIA),” she writes. “All those tweets and Facebook posts need to be retained and available to the public on request, in a ‘future-proof’ format.” In other words, the material will continue to be available even if one of the platforms goes belly-up, she writes (as, for example, Vine already has).
There is also a searchable archive of social media posts spanning eight platforms, writes Alex Byer in Politico. While timing was certainly a factor, notes Ian Bogost at The Atlantic, Obama genuinely embraced social media.
As well as giving heartsick Democrats something to cry over during the dark days ahead, the archive will also be studied by researchers, such as those at the University of Texas School of Information.
Like it or not, social media and the Internet will be an indelible part of the Presidency going forward, just as radio and television were.
This is January, and you know what that means: Trend stories. Specifically, storage trends.
Yes, for some reason a completely arbitrary line on a calendar turns everyone into prognosticators. Actually, the reason is pretty simple: Nobody announces anything in December and early January, and we’ve got to write about something.
Actually, with the consolidation and commoditization of the storage and e-discovery industries, there haven’t been as many predictions and retrospectives as there used to be. Other than worrying about what the Donald Trump administration is going to do, of course.
That’s what was interesting about a recent report from Kroll Ontrack about 2017 storage and security trends. While it was largely based on what the staff was seeing in its own business, it still provided a useful snapshot of what was going on.
The use of flash/solid state drives is increasing. Hold the presses. But Kroll’s evidence for this was interesting: “We have seen a 239 percent increase in the number of hybrid drives needing data recovery since 2014,” the company reports. That’s certainly one indication of, if nothing else, the reliability of such drives – or, perhaps, the lack of it.
The downside of hyper-converged storage. Vendors have been pushing hyper-converged storage and networking recently, billing it as easier to use. While that’s true, it comes with a downside: Vendor lock-in. And Kroll is running into similar issues.
“We are seeing that recovery from these complex systems often requires a custom solution because data is fully integrated into the unit making it difficult to gain sector-level access to the disks,” Kroll reports. Moreover, because hyper-converged devices are simpler, simpler people are using them, which can lead to problems. “Organizations are employing less specialized individuals to operate hyper-converged storage systems – employees who may not have the depth of knowledge needed to solve more complex problems,” the company writes. “This presents new challenges when backups need to be verified or when data loss occurs.”
Tape is still around. In a separate survey of 819 IT administrators, Kroll found that many of them still haven’t gotten tape backup working right. “Nearly half of the companies surveyed (49 percent) confirmed they run two or three different backup solutions, with an additional seven percent running four or more parallel solutions,” the company reports. “Nearly one third (27 percent) of all participating companies reported they do not have clear insight into what specific information is stored on their legacy tapes.” In addition, more than half of respondents (56 percent) said they use different versions of their backup solutions (for example, different versions of the same backup format at each company site).
Consequently, the cost of keeping legacy data accessible, depending on company size, ranges from $10,000 to over $1 million annually. The primary costs are due to storage (70%), maintenance (69%), staff (52%), security (42%), and licenses (38%). And in an ominous sign, up to 40 percent of companies said they intend to terminate their maintenance contracts due to cost.
Security is hard. Kroll notes that stories of ransomware and associated data loss were rampant in 2016. “Hospitals, corporations, individuals and government entities were all exposed or lost data in these attacks,” the company writes. “Wearable technology is especially vulnerable as there can be little to no real security on your device.” For that reason, Kroll predicts a wider use of encryption – assuming a Trump administration will allow us to use it.
Here’s to 2017 and our new Soviet overlords.
Prosecutors investigating a murder have submitted a warrant to Amazon, trying to get data that the suspect’s Amazon Echo smart home assistant might have recorded at the time.
If you don’t have an Echo (introduced in November 2014) or similar device, it works like this: Unless the mike is shut off, the device is constantly “on,” listening for its name. Once it hears it, it records a snippet of audio so that the statement can be sent to the cloud, analyzed by servers, and then responded to.
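That wake-word gating can be sketched in a few lines. This is strictly a toy model of the behavior described above (word lists standing in for audio, and the `process_stream` helper is my own invention, not Amazon’s code): everything is “heard,” but only the snippet following a wake word is recorded.

```python
WAKE_WORDS = {"alexa", "amazon"}  # the Echo's selectable wake words

def process_stream(words, snippet_len=5):
    """Toy model of wake-word gating: every word is 'heard', but only
    the short snippet after a wake word is recorded (sent to the cloud)."""
    recorded = []
    i = 0
    while i < len(words):
        if words[i].lower() in WAKE_WORDS:
            # record the next few words as the "snippet"
            recorded.append(words[i + 1:i + 1 + snippet_len])
            i += 1 + snippet_len
        else:
            i += 1  # heard, discarded, never stored
    return recorded

stream = "we watched the game then Alexa play some music please thanks".split()
print(process_stream(stream))  # only the words after the wake word
```

Background conversation before the wake word never makes it into `recorded`, which is exactly why the warrant discussed below may come up empty.
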
In this particular case, on November 22, 2015, James Bates had several people over to watch a football game, including Victor Collins. Collins’ body was found in the hot tub the next day, and Bates was accused of killing him. Bates’ next hearing is scheduled for March 17.
But because Bates owns an Amazon Echo, and reportedly was using it the night of the murder to play music, prosecutors wanted to know if it had heard anything. “The search warrant, signed by a judge in August, requests all ‘audio recordings, transcribed records, text records and other data’ from Bates’ Echo speaker,” writes Jill Bleed for the Associated Press. “So far, authorities have obtained only basic subscriber and account information.”
Not to say the recordings might not be there. “Amazon keeps all of the recordings of you asking Alexa to play WNYC or of you setting a timer for 20 minutes. You can jump into the Alexa companion app and hear all of your requests again if you want to see just how bored you sound when talking to your home voice robot,” writes Jake Swearingen in New York. “I’ve heard from a professor who works in voice research that Amazon deletes all voice data after six months — but Amazon has no stated policy about how long it holds onto that data. Still, if any of this has you feeling uneasy about your Amazon Echo, you can always head to amazon.com/myx, find your Echo, and delete out all of your old voice recordings.”
This isn’t the first time the subject of what listening devices can hear has come up. A couple of years back, when voice-activated devices first started becoming more widely available, some people started freaking out that their voice-activated TVs might be “listening” to them. Some people were also concerned about children’s toys that had intelligent recording devices in them.
Nobody appears to be suggesting that the suspect asked, “Alexa, how do I hide a body?” or anything along those lines. It appears that prosecutors hope that someone giving Alexa a command might have caused the system to record some background noises relevant to the case. “Police did not specify what data they expected to find on Bates’s Echo — nor is it clear what the device could have captured that would have been relevant to the case,” writes Amy Wang in the Washington Post. “Only if someone happened to have triggered his device with its wake word would it have begun recording any audio. Even then, it seems unlikely that audio would be conclusive evidence of an alleged murder.”
And, frankly, it’s possible that prosecutors don’t exactly understand how Amazon Echo works, and are assuming that it’s always on and always recording, Wang writes. “At least part of the search warrant indicated police may not have had a full understanding of how the Echo worked,” she writes. “That allegation — that the Echo is possibly recording at all times without the ‘wake word’ being issued — is incorrect, according to an Amazon spokesperson. The device is constantly listening but not recording, and nothing is streamed to or stored in the cloud without the wake word being detected.”
The search warrant itself gives some evidence of this: “The Amazon Echo device is constantly listening for the ‘wake’ command of ‘Alexa’ or ‘Amazon,’ and records any command, inquiry, or verbal gesture given after that point, or possibly at all times without the ‘wake word’ being issued,” the warrant reads.
Amazon, for its part, appears to regard the whole thing as a fishing expedition, and has thus far refused to comply with the warrant. “In a statement, Amazon said it ‘objects to overbroad or otherwise inappropriate demands as a matter of course,’” writes Kathryn Varn in the Tampa Bay Times.
One thing is clear: This isn’t over. “As we connect more things to the Internet in our houses, these devices will become involved in more crime investigations,” predicts Alina Selyukh, a tech blogger for National Public Radio. She points out, for example, that prosecutors used information from a smart meter to note that Bates had used a lot of water that evening, potentially to wash away evidence of a murder.
Moreover, it’s not out of the question that manufacturers of devices such as the Amazon Echo could become obliged to make, keep, and provide such audio records to law enforcement.
And the case could also become a precedent for the Internet of Things. “The Arkansas slaying could be a test case for how evidence rules apply to information from home appliances connected to the internet such as water meters, thermostats and lighting systems, said Nuala O’Connor, president of the Center for Democracy & Technology, a nonprofit group that works on privacy and civil-liberties issues,” Bleed writes.
The problem with being a storage nerd is it makes it pretty difficult to enjoy the movies. In this day and age, too many movie plots hinge on computers and data, and moviemakers typically aren’t geeks, so it’s way too easy to lose your suspension of disbelief based on bonehead errors.
Take Rogue One. And yes, here be spoilers; as a good nerd, I saw it opening night, but waited until after Christmas to write this so that most reasonable people had already seen it.
Sneakernet is alive and well. Yes, here we are in a galaxy far, far away, with space travel and holograms and planet destroyers, and we’re still exchanging data using tape cartridges, CD-ROMs, and USB sticks, and watching a progress meter as we upload data? And tape cartridges have a handy-dandy loop on them so you can hook them to your belt? Not to mention the fact that, although all the plans are in electronic data storage, there’s no way to gain access to the file other than flying out to that base.
Encryption isn’t a thing? Okay, maybe the reason you couldn’t gain access to the file other than by flying out to the base was for physical security. I’ll buy that. But then they have that entire library stored out there, with all the seeekrit plans for the Death Star, in unencrypted files? See what happens, FBI, when you outlaw encryption?
Data centers powered by renewable energy. Perhaps the Scarif data center was also protected by a moat. With a waterfall. But perhaps the waterfall was there to power the library using renewable energy. It’s nice to know that even the Empire wants to be green.
User interfaces still need work. Okay, it’s not “It’s Unix! I know this!” but apparently all the user interface designers got killed off early in the war because all the technology seems incredibly hard to use. That tape library is pretty snazzy – but there’s no instructions or intuitive interface, and access to the tapes needs to be done manually? And uploading data means you need to walk out on a catwalk to manually adjust the satellite dish? Or hook up a manual data transmission with a big, fat (yet, still incredibly flexible) cable that nonetheless requires you to walk out to a control panel to flip a master switch? (And has a port that just anybody can walk up and plug said cable into?) Not to mention, how screwed are you if, while using those instructionless mechanical hands, you happen to drop the tape cartridge down twenty stories?
We still don’t have a good system for filenames. Admittedly, Erso couldn’t call the blueprints Secret_Plan_to_Destroy_Death_Star, but really, we’re reduced to having Jyn read the filenames manually until she finds the one that’s her nickname? So if Jyn hadn’t been around, the Alliance never would have stumbled on the plan? Knowing a secret about the designer worked to find the back door in WarGames, but shouldn’t we have advanced beyond that by now?
Back doors don’t work. If nothing else, perhaps Rogue One will point out to law enforcement and the federal government why encryption back doors are a bad idea. Erso’s back door allows a couple of rebels with bombs and teeny planes to destroy an $852 quadrillion investment. At least it’s reassuring to find out, after almost 40 years, that someone had created that flaw in the design on purpose.
We still don’t have backups? Rebels are at the Scarif archive? No problem, says Governor Tarkin; we’ll just blow it up – a decision that caused no small amount of hand-wringing among librarians. “Did the Empire have a data backup plan?” worries Gabriel McKee, librarian for collections and services for New York University’s Institute for the Study of the Ancient World, who still hasn’t gotten over the destruction of the library at Alexandria. “What else was stored there, and was any of that data backed up elsewhere? Did Tarkin have authorization to, for lack of a better word, deaccession the entire archive? And could anything of Scarif’s archive have survived such apocalyptic weeding?”
If nothing else, perhaps Rogue One can be used as a cautionary tale of how not to set up a storage archive.
John Gilmore, co-founder of the Electronic Frontier Foundation, is known for saying, “The Internet interprets censorship as damage and routes around it.” With Donald Trump’s election as President, the Internet is getting a little help.
Concerned that a Trump administration might delete decades of weather data in an effort to make it more difficult to demonstrate climate change, scientists are reportedly frantically making copies of weather databases in Canada, writes Brady Dennis in the Washington Post.
“Something that seemed a little paranoid to me before all of a sudden seems potentially realistic, or at least something you’d want to hedge against,” Nick Santos, an environmental researcher at the University of California at Davis who started copying government climate data onto a nongovernment server after the election — where it will remain available to the public – told Dennis. “Doing this can only be a good thing. Hopefully they leave everything in place. But if not, we’re planning for that.”
Paranoid? Maybe not, writes Weston Williams in the Christian Science Monitor. “This would not be the first time access to climate research was restricted by a US president,” he writes. “During President George W. Bush’s administration, many Environmental Protection Agency libraries were shut down, and there were multiple accusations that government publications on climate change had been edited to change their meaning.”
With Trump, activities such as the appointment of climate change deniers, the attempt to find climate change supporters in federal agencies, and the suggestion that NASA should no longer do weather research lead scientists to believe that a Trump administration could try to alter or dismantle parts of the federal government’s repository of data on everything from rising sea levels to the number of wildfires in the country, Dennis writes.
“To be clear, neither Trump nor his transition team have said the new administration plans to manipulate or curtail publicly available data,” Dennis notes. “The transition team did not respond to a request for comment. But some scientists aren’t taking any chances.”
It all started with a Twitter crowdsourcing request from meteorologist Eric Holthaus, Dennis writes. “What are the most important .gov climate assets? Scientists: Do you have a US .gov climate database that you don’t want to see disappear?”
Within hours, responses flooded in from around the country – enough that Holthaus created a Google spreadsheet to keep track of them all. In addition, investors offered money, attorneys offered legal help, and database specialists offered expertise and storage, Dennis writes. There’s now a GitHub repository as well.
“Within two days, more than 50 key data sets had been identified, and six of them have already been archived on publicly available nongovernment servers,” reports Holthaus. “Complementary efforts at the University of Pennsylvania and the University of Toronto are merging resources to attempt to avoid duplication of effort, and the Penn Program in the Environmental Humanities put the data refuge online Tuesday afternoon. On Twitter, the most common response to the project was, ‘I can’t believe it’s come to this.’”
In some cases, they’re even making a party out of it. At the University of Toronto, researchers held a “guerrilla archiving” event to catalogue key federal environmental data ahead of Trump’s inauguration, Dennis writes.
The work is associated with the End of Term Presidential Harvest 2016, an effort by the Internet Archive to ensure that useful federal government data isn’t lost during the transition between Presidents. “With the arrival of any new president, vast troves of information on government websites are at risk of vanishing within days,” writes Jim Dwyer in the New York Times. “The fragility of digital federal records, reports and research is astounding.”
The Presidential Harvest project – “a volunteer, collaborative effort by a small group of university, government and nonprofit libraries to find and save valuable pages now on federal websites,” Dwyer writes – began before the 2008 elections and returned in 2012.
Moreover, the Internet Archive, which purports to make a copy of everything on the Internet, is also hoping to set up a copy of itself in Canada lest something happen to its American data, writes Michael Hiltzik in the Los Angeles Times. The Internet Archive includes copies of 279 billion web pages, 2.2 million films and videos, 2.5 million audio recordings and 3 million digital books, as well as software and television programs, he writes.
“The Internet Archive has stepped up its plans to back up its entire data hoard in Canada, out of reach of what might be efforts under a Trump administration to block public access to the material,” Hiltzik writes. Internet Archive founder and chairman Brewster Kahle is seeking donations to cover the estimated $5-million cost of the project by Jan. 20, Inauguration Day, he adds. Kahle is doing this because he is concerned by some of the things Trump said while campaigning.
On the other hand, the EFF is calling for the opposite tactic: The computer civil liberties organization ran a full-page ad in Wired encouraging sysadmins to, among other things, delete log files so they couldn’t be used in the future against people. “EFF’s open letter outlines four major ways the technology community can help: using encryption for every user transaction; practicing routine deletion of data logs; revealing publicly any government request to improperly monitor users or censor speech; and joining the fight for user rights in court, in Congress, and beyond,” the organization explains in a press release. “President-Elect Trump has promised to increase surveillance, undermine security, and suppress the freedom of the press. But he needs your servers to do this. Join us in securing civil liberties in the digital world, before it’s too late.”
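The “routine deletion of data logs” part is easy to automate. Here’s a minimal sketch, demonstrated on a throwaway directory so it’s safe to run as-is; the directory names and the seven-day retention window are my own illustration, not EFF’s guidance. In practice you’d point it at a real log directory and schedule it from cron.

```shell
# Routine log deletion, sketched on a scratch directory (safe to run).
LOG_DIR=$(mktemp -d)
touch -t 201601010000 "$LOG_DIR/old.log"   # backdated file: gets purged
touch "$LOG_DIR/new.log"                   # recent file: survives
# Delete any .log file not modified in the last 7 days.
find "$LOG_DIR" -type f -name '*.log' -mtime +7 -delete
ls "$LOG_DIR"                              # only new.log remains
```
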
Amazon has upped the ante on that old question of the bandwidth of a station wagon full of backup tapes hurtling down the highway: For one thing, instead of a station wagon, it’s a semi.
The data storage device hauled by the semi is a 45-foot shipping container called Snowmobile, and it could let people send up to 100 petabytes of data to Amazon Web Services – much, much faster than it would take to upload it. A petabyte is 1 million gigabytes. The data can be stored in either Amazon’s regular S3 service, or its “Glacier” cold storage service, which is less expensive.
The notion of doing the initial upload of data to a cloud service by shipping a physical hard drive isn’t new; the major cloud vendors have all supported this for a while. This is, in fact, considered an upgrade to Amazon’s “Snowball” service, which uses a mere 80TB suitcase.
But a semi? That’s new.
“Amazon plans to drive Snowmobiles to its customers’ offices, extract their data, then cruise to an Amazon facility where the information can be transferred to the cloud-computing network in far less time than it would for so much data to travel over the web,” write Jay Greene and Laura Stevens in the Wall Street Journal. “Ten Snowmobiles would reduce the time it takes to move an exabyte from on-premises storage to Amazon’s cloud to a little less than six months, from about 26 years using a high-speed internet connection, by the company’s calculations.”
Amazon announced the new service at its annual customer conference. It is actually already available and costs half a cent per gigabyte per month of use, or about $500,000 a month to use its full capacity, Greene and Stevens write.
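Those figures check out on the back of an envelope. Decimal units are assumed throughout (1 PB = 10^15 bytes, 1 EB = 10^18 bytes), and the 10 Gb/s link speed is my stand-in for the article’s unspecified “high-speed internet connection”:

```python
SECONDS_PER_YEAR = 365 * 24 * 3600

# 1 exabyte over a 10 Gb/s line -- the "about 26 years" claim.
years = (1e18 * 8) / 10e9 / SECONDS_PER_YEAR
print(f"1 EB over 10 Gb/s: {years:.1f} years")          # ~25.4 years

# "Half a cent per gigabyte per month" for a full 100 PB Snowmobile.
monthly_cost = 0.005 * 100e6                            # 100 PB = 1e8 GB
print(f"Full Snowmobile: ${monthly_cost:,.0f}/month")   # $500,000
```
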
Physically, Snowmobile attaches to your network and appears as a local, NFS-mounted volume, writes Jeff Barr, AWS chief evangelist, in a blog post (which includes all sorts of awesome Legos showing how it works). “Snowmobile is a ruggedized, tamper-resistant shipping container 45 feet long, 9.6 feet high, and 8 feet wide,” he adds. “It is water-resistant, climate-controlled, and can be parked in a covered or uncovered area adjacent to your existing data center. Each Snowmobile consumes about 350 kW of AC power; if you don’t have sufficient capacity on site we can arrange for a generator.”
So how do you attach it to your network? “Each Snowmobile includes a [fiber] network cable connected to a high-speed switch capable of supporting 1 Tb/second of data transfer spread across multiple 40 Gb/second connections,” Barr writes, adding that you can use your company’s existing backup or archiving tools. “Assuming that your existing network can transfer data at that rate, you can fill a Snowmobile in about 10 days.”
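Barr’s fill-time figure is easy to sanity-check (again assuming decimal units, 1 PB = 10^15 bytes):

```python
# 100 PB pushed through a 1 Tb/s switch.
bits = 100e15 * 8            # 100 petabytes in bits
seconds = bits / 1e12        # at 1 terabit per second
days = seconds / 86400
print(f"{days:.1f} days")    # ~9.3 days, i.e. "about 10 days"
```
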
Of course, the question of security comes up. You don’t want your company’s entire data record to be hijacked by some guy with a CDL. “Snowmobile uses multiple layers of security designed to protect your data including dedicated security personnel, GPS tracking, alarm monitoring, 24/7 video surveillance, and an optional escort security vehicle while in transit,” writes Amazon. “All data is encrypted with 256-bit encryption keys managed through the AWS Key Management Service (KMS) and designed to ensure both security and full chain-of-custody of your data.”
Exactly when and where this encryption is done isn’t stated. Can you load encrypted data onto Snowmobile, so that even Amazon doesn’t know what it is? Or does Amazon encrypt the data, meaning the company has access to it at some point?
There’s also the matter of securing the trucks themselves while in transit, writes Daniel Stoller in Bloomberg. “Researchers recently said that trucks, much like the ones Amazon would use, are prone to the same kind of hacking attacks that have disabled some connected cars,” he writes. “The researchers showed that there is a real possibility of ‘safety critical attacks that include the ability to accelerate a truck in motion, disable the driver’s ability to accelerate and disable the vehicle’s engine brake.’”
That also raises the question of what happens to a Snowmobile full of data after the data has been uploaded. How do you wipe a Snowmobile, how long does it take, and what assurance do you have that Amazon actually does this? Assuming it does; Amazon doesn’t talk about it.
The question also arises, why bother transferring the data at all? Why not just truck it to Amazon, plug it in, and leave it? Though having all the data on a physical device would sort of defeat the purpose of having it in the cloud.
Getting the data back out again may be more of an issue (a problem Amazon has also had with Glacier). “The initial launch is aimed at data import (on-premises to AWS),” Barr writes, though he adds, “We do know that some of our customers are interested in data export, with a particular focus on disaster recovery (DR) use cases.”
It will be interesting to see whether other cloud vendors, such as Google and Microsoft, follow suit, or if Amazon will have the long-haul data-trucking field to itself.
As you may recall, in July the Second Circuit Court of Appeals ruled that Microsoft did not have to turn over customer data stored in Ireland in response to a Department of Justice (DoJ) search warrant. At the time, supporters were pleased but noted that the DoJ might appeal the decision, which could send the case to the Supreme Court.
That has indeed happened, as of October. Sort of.
Technically, the appeal is to the Second Circuit Court of Appeals, and asks for an en banc hearing. In other words, the DoJ wants to present the information to everybody, not just a subset of judges, in hopes that they can find enough judges to agree with them. In practice, though, the Second Circuit rarely grants en banc hearings, and the previous ruling was unanimous anyway, so even if they did grant it, the verdict would likely be the same, writes Jeff John Roberts in Fortune.
The upshot, then, is that the case is likely to go to the Supreme Court.
This all gets pretty down in the weeds. “In filing for the appeal to the US Supreme Court, the DoJ has claimed that the Court of Appeals misinterpreted the law as to when companies are obliged to disclose data stored on servers in foreign jurisdictions,” writes William Fry in Lexology. “The Court of Appeals ruled that in order to rebut the presumption against extra-territoriality of legislation, the statute under which the warrant was issued (the Stored Communications Act) would have to contain an ‘affirmative indication’ of an intention to apply outside the US. The court determined that enforcement of the warrant constituted an unlawful extra-territorial application of the Stored Communications Act.”
In other words, because the Stored Communications Act didn’t specifically mention international communications, it shouldn’t apply, the Second Circuit ruled.
What some in the U.S. would like to do is, instead of making law through the court system, make it through legislation that specifically addresses the issue of international data searches, Fry writes. And that is indeed the mechanism the appeals court suggested in July. The Congressional sponsors of the International Communications Privacy Act (ICPA) have also written to the DoJ asking it to work with them on fine-tuning the legislation.
The ICPA is an updated version of the Electronic Communications Privacy Act, which dates from the 1980s, and includes such things as a free pass to search for data as long as it’s more than six months old, writes Eric Peters in InsideSources. “The International Communications Privacy Act has been written to deal with ECPA’s shortcomings — including rescinding the ‘180-day loophole’ for data mining without a warrant — and to tamp down the international kerfuffle over whose laws apply,” he writes, calling on the current lame-duck Congress to pass the bill before the next session.
There hasn’t been much indication of this happening, and it seems unlikely that the incoming Congress and Administration will do much about protecting users from government data searches.
If the case does end up in the Supreme Court, that’s a whole new kettle of fish. Recall that, at least for the time being, there are only eight justices. What that means is, if they have a tied decision, it would apply only to this single case, not as a precedent. On the other hand, now that the election is over and the Senate is staying in Republican hands, it’s conceivable that there could be a ninth Justice by October, when the new Supreme Court year starts. In any event, Fortune’s Roberts believes that the Supreme Court would agree to hear the case because of its importance.
Another nuance is that different Internet companies have different policies for how they store their data, with some of them stashing it wherever’s convenient at the moment, anywhere in the world, and some of them choosing a location in or out of the country where the user resides. The Second Circuit’s decision makes it difficult for the DoJ to follow any procedure to get that data, writes Orin Kerr in the Washington Post. “ I didn’t expect that major domestic providers would respond to a ruling that they can’t be compelled to disclose foreign-stored emails pursuant to a warrant by refusing to disclose foreign-stored contents voluntarily when the target was domestic and the only reason that particular e-mail was foreign-stored at that instant was the fluid nature of the network’s architecture,” he writes.
In case this is all a blur to you, Microsoft reportedly had email messages from one of its customers stored on a server in Ireland. The DoJ wanted access to those email messages while pursuing an unspecified case, claiming that since the email messages were controlled by Microsoft, an American country, the DoJ had jurisdiction over them even though they were stored in Ireland.
This viewpoint was fraught for a number of reasons, as I described in July.
- Because so many computer companies are American, it would mean an awful lot of data worldwide would be subject to access by the U.S. government.
- Computer companies worried that worldwide customers would stop using them because they were afraid they’d get their data accessed.
- Having the data subject to U.S. access could mean that the company – Microsoft in this case, but any company – could be violating data privacy laws in force at the second country. (For that reason, dozens of companies and civil liberties organizations – as well as the government of Ireland itself — filed amicus curiae briefs supporting Microsoft.)
- If this precedent was set with the U.S., all the other countries in the world could declare that, in that case, all their data laws could apply to any company doing business in their countries, which could be an incredibly complicated, contradictory mess.
For its part, the DoJ said that if users could stash their data overseas, it would make it hard for the DoJ to catch bad guys. While there were other methods that would give the U.S. government the ability to request the data stored in the foreign country, the DoJ said they were hard to do,
At this point, it’s up to the Second Circuit – and after that, the Supremes.
The election of Donald Trump as President of the United States could have some really interesting implications for data storage, data sovereignty, and encryption.
If by “interesting” you mean Really Scary and Bad.
The Internet Association, a group of about 40 Internet vendors including Amazon, Google, and Netflix, have already called on Trump to support Internet technologies. “Some of the policy goals stated in the letter may align with Trump’s priorities, including easing regulation on the sharing economy, lowering taxes on profits made from intellectual property and applying pressure on Europe to not erect too many barriers that restrict U.S. internet companies from growing in that market,” writes Reuters’ Dustin Volz. “Other goals are likely to clash with Trump, who offered numerous broadsides against the tech sector during his campaign.”
Here are some possible aspects of the Trump effect on data storage.
Storage manufacturing. Trump has proposed imposing a 35 percent tax on companies that manufacture their products outside the U.S. While he originally said this in the context of Apple making iPhones, he has since extended it to any company that lays people off in the U.S. to outsource manufacturing to other countries. Because the vast majority of storage hardware is manufactured outside the U.S., this could have a significant impact – either by raising prices, reducing availability, or, perhaps, having vendors move their manufacturing to the U.S. This is especially true because he has also said he intends to lower the U.S. corporate tax rate from 35 percent to 15 percent and provide a moratorium for companies like Apple, which have large offshore bank accounts, to bring their money back to the U.S.
Encryption. Remember when Apple refused to help the FBI break into an encrypted iPhone? At the time, Trump reportedly called for a boycott of Apple products, according to Reuters. But as President, and based on his Cabinet picks thus far, he could do much more. Reuters also pointed out that Senator Richard Burr – who spearheaded last year’s effort to mandate encryption “back doors” – had also been re-elected, and as chair of the Senate intelligence committee was likely to reintroduce his legislation next year.
Trump’s pick for Attorney General, Jeff Sessions, also argued that government must have access to the phone, writes Anita Balakrishnan for CNBC. “Rep. Mike Pompeo, nominated for new CIA director, has called the use of strong encryption in personal communications a ‘red flag,” she adds. “Michael Flynn, tapped as national security adviser, supported the Digital Security Commission Act, whose sponsors supported a commission to examine cases like Apple’s.”
In response, vendors of end-to-end encrypted messaging applications, such as Signal, have seen a 400 percent increase in their downloads since the election, write Stephanie Hughes and Ben Johnson for Marketplace. “Signal uses end-to-end encryption, so that no one — not even the people at Signal — can read the messages you send to others,” they write. “It’s not the only encrypted app that’s seen an uptick in use — another messaging app, Wickr, also told us they have also seen a noticeable increase in downloads.”
Data sovereignty. Trump campaigned against the Trans-Pacific Partnership (TPP), a trade deal with 11 Asian countries – in fact, he called it “rape” — and with his election it is assumed to be dead. However, it included a number of components having to do with exchanging data between countries.
For example, the TPP would have banned forced localization – “the discriminatory requirement that certain governments impose on U.S. businesses that they place their data, servers, research facilities, and other necessities overseas in order to access those markets,” according to the White House. Countries that have implemented these laws, such as Russia, have blocked social networking sites like LinkedInb for storing data on citizens outside the country.
It also would have reserved free international movement of data, “ensuring that individuals, small businesses, and families in all TPP countries can take advantage of online shopping, communicate efficiently at low cost, and access, move, and store data freely,” the White House writes. “On the most fundamental level, TPP grants data, for the first time, the same legal protections in international trade law as goods,” writes Carter Dougherty for the International Business Times. “If it’s from a TPP member country, it’s treated as data flowing within the country would be treated.”
Surveillance. Trump has already indicated his interest in setting up a registry or database of Muslims or Muslim immigrants. Some people are concerned that this presages a general increase in government surveillance, Reuters writes. And Trump said during the campaign that he was interested in restoring provisions of the Patriot Act to allow for bulk data collection, writes The Hill.
Trump’s inauguration is scheduled for January 20.
Everybody’s all excited about self-driving cars, but not too many people are talking about self-driving cars’ data. What sort of data will be collected, how much will be stored on the car, where it will be sent from the car, and what sort of security that data will have from hackers, law enforcement, and the government are all open questions.
There isn’t any question that, one way or another, self-driving cars are going to generate and process a lot of data – on the order of 1 GBps, according to Intel, which is investing $250 million in self-driving car technology. That would be more than 4 TB an hour.
As someone who has trouble keeping up with the data collected by the dashboard cam, this seems like a lot of data. Where is the car going to put it?
“Some data will have to be stored online,” writes Paresh Dave in the Los Angeles Times. “That means automakers have to prepare for boosting spending on storage, said Intel’s Doug Davis, senior vice president and general manager of a division dealing with cutting-edge mobile devices.”
But the vast majority of it will be used in real time and then thrown away, automakers reassure us. The vehicle itself might store just a terabyte or so. Otherwise, it’ll be just like our phones — what’s kept will mostly go up in the cloud, according to Richard Barrett, senior product marketing engineer, automotive wireless technology at Cypress Semiconductor, writes Ann Steffora Mutschler in Semiconductor Engineering.
I don’t know about Barrett, but my phone has 54 gigabytes of storage, and I’ve used almost half of it.
But okay, fine. Most of the data will go into the cloud.
Hold on. Which data will go into the cloud? And who will get it, exactly? To what degree will you know, and how much can you limit this?
“Automakers already collect and store location and driving data from millions of cars on the road today,” writes Pete Bigelow in AutoBlog. “In the approaching era of self-driving vehicles, privacy advocates fear the data collection will grow more intrusive. The more sensitive the data, the more lucrative it might be for a company like Google, says John Simpson, an advocate with California-based Consumer Watchdog. ‘Once this technology is widely adopted, they’ll have all sorts of information on where you’re driving, how fast you’re going, and there’s no control over what Google might do with it,’ he said.”
In fact, automakers are reportedly salivating over the amount of data that self-driving cars could generate and how that data could be sold, Dave writes. “The $100-billion app economy built on data from smartphones would look small compared with the $750 billion in revenue produced around cars,” he writes – an estimate from McKinsey for 2030. “The forecast has automakers buzzing. As they accelerate spending on developing self-driving cars, they’re devoting enormous attention on what to do with data that those high-tech devices generate — beyond making the drive automated. Among the possibilities: selling details about driving patterns to real estate developers or using it in personalized insurance calculations.”
Note that some of this data is already being collected now. A number of auto loan companies, insurance companies, and car rental agencies have GPS units on the vehicle that collect information ranging from location to acting as a “black box” about your driving. The Government Accountability Office did a report on this in 2013.
Similarly, the state of Oregon is piloting a program where, instead of paying a gas tax, you’d pay a “mileage tax” based on how many miles in the state you drive. This is important, the state explains, because as cars become more efficient and are more likely to use electricity, they use less gas, and the gas tax no longer adequately funds road maintenance, repair, and construction. Which makes sense, but it still means you have a little box attached to your car keeping track of where you’ve been.
Not to mention, how else might this data be used? Bigelow writes. “How often do you happen to drive your car to a liquor store, and will that information be provided to your insurance company?” he quotes Consumer Watchdog executive director Carmen Balber as saying. “Will information on where you spend your Saturday nights be subpoenaed in a divorce proceeding?”
Keep in mind that in 2011 people had conniptions because they realized their smartphones were saving their location data. Now we’re going to let cars do it?
And those are just the legitimate data users. Can your car get hacked to get a nice list of your past movements for blackmail purposes? Can the real-time data be monitored so the burglars know when to hit your house?
That brings us to the courthouse. Will the government have access to the data – with or without a warrant – to know if you’ve been visiting a house associated with terrorist activities, or a neighborhood associated with drugs or prostitution?
Self-driving cars offer a lot of potential – but a lot of potential problems as well.
When will a disk drive fail? The people at BackBlaze, who use 67,814 disk drives, have been looking at Self-Monitoring, Analysis and Reporting Technology (SMART) disk drive statistics to try to predict this.
They’ve been looking at this for a while, at least since 2014. As I wrote then:
“SMART defines — and measures, for those vendors that support it — more than 70 characteristics of a particular disk drive. But while it’s great to know how many High Fly Writes or Free Fall Events a disk has undergone, these figures aren’t necessarily useful in any real sense of being able to predict a hard drive failure.
“Part of this is because of the typical problem with standards: Just because two vendors implement a standard, it doesn’t mean they’ve implemented it in the same way. So the way Seagate counts something might not be the same way as Hitachi counts something. In addition, vendors might not implement all of the standard. Finally, in some cases, even the standard itself is…unclear, as with Disk Shift, or the distance the disk has shifted relative to the spindle (usually due to shock or temperature), where Wikipedia notes, ‘Unit of measure is unknown.’”
At that point, BackBlaze had determined that out of the 70 statistics SMART tracked, there was really only one that mattered: SMART 187, or Reported_Uncorrectable_Errors. At the time, BackBlaze wrote: “Drives with 0 uncorrectable errors hardly ever fail. Once SMART 187 goes above 0, we schedule the drive for replacement.”
Since then, the company has been looking at the SMART statistics some more, and it’s now added four other statistics that it’s determined have a correlation to failed drives, writes senior marketing manager Andy Klein:
- SMART 5 Reallocated Sectors Count
- SMART 188 Command Timeout
- SMART 197 Current Pending Sector Count
- SMART 198 Uncorrectable Sector Count
The company didn’t say why it started looking at other SMART statistics if it had already determined that there was one statistic that was correlated with failure. (It also makes its raw statistics available in case you want to play with correlations yourself.) He also points out that not all vendors report all the statistics, or in the same way — an issue two years ago as well.
Another factor is the period of time in which the errors occur, Klein writes. “For example, let’s start with a hard drive that jumps from zero to 20 Reported Uncorrectable Errors (SMART 187) in one day,” he writes. “Compare that to a second drive which has a count of 60 SMART 187 errors, with one error occurring on average once a month over a five year period. Which drive is a better candidate for failure?” He doesn’t actually say, though he implies that it’s the first one.
Incidentally, BackBlaze has even started looking at High Fly Writes as a possible indicator of future disk failure. “This stat is the cumulative count of the number of times the recording head ‘flies’ outside its normal operating range,” Klein explains, noting that while 47 percent of failed drives have a SMART 189 value of greater than zero, so do 16.4 percent of drives that work. “The false positive percentage of operational drives having a greater than zero value may at first glance seem to render this stat meaningless. But what if I told you that for most of the operational drives with SMART 189 errors, that those errors were distributed fairly evenly over a long period of time?” he asks. “For example, there was one error a week on average for 52 weeks. In addition, what if I told you that many of the failed drives with this error had a similar number of errors, but they were distributed over a much shorter period of time, for example 52 errors over a one-week period. Suddenly SMART 189 looks very interesting in predicting failure by looking for clusters of High Fly Writes over a small period of time.”
That’s not to say that any of the statistics, or even a combination of them, is a perfect predictor of when a disk drive is going to fail – or when it isn’t. The organization points out, “Operational drives with one or more of our five SMART stats greater than zero – 4.2 percent. Failed drives with one or more of our five SMART stats greater than zero – 76.7 percent,” writes Klein. “That means that 23.3 percent of failed drives showed no warning from the SMART stats we record.” But if nothing else, it sounds like starting to see these particular SMART errors is a way to bet.
Disclaimer: I am a BackBlaze customer.