The recent publication of the original Apollo 11 lunar module code on GitHub is bringing new interest to a truly arcane form of storage: rope memory.
Let’s get the Moore’s Law, kids-these-days aspect of this over with right away: The Apollo Guidance Computer (AGC) had just “36,864 sixteen-bit words of core rope memory (placed within one cubic foot) and 4,096 words of magnetic core memory (within two cubic feet),” or the equivalent of 72K of the sort of memory we have today, according to programmer (and stoner) Don Eyles in the 2008-2009 documentary Moon Machines. (An MP3 player at the time the documentary was made had 50,000 times more storage space, the documentary notes.) All this fit inside a 70-pound box that was the state of the art – in the mid-1960s. Heck, it even used integrated circuits – it was the first computer to do so.
But it’s the type of read-only memory that’s particularly interesting. “Fixed memory consisted of core rope, a high-density read-only memory using cores of similar material composition as the erasable memory but of completely different design,” writes NASA in Computers in Spaceflight: The NASA Experience. “MIT adopted the use of core rope in the original Mars probe computer design and carried it over to the Apollo. Chief advantage of the core rope was that it could put more information in less space, with the attendant disadvantages that it was difficult to manufacture and the data stored in it were unchangeable once it left the factory.”
Yeah, so let’s talk about that manufacturing. It was literally woven by little old ladies on looms (and, consequently, was sometimes known as Little Old Lady, or LOL, memory). “Producing these required skills analogous to textile work, which had long been part of the New England labor force,” explains the Dibner Institute for the History of Science and Technology at the California Institute of Technology, on its History of Recent Science & Technology website.
“Women in the factory would literally weave the software into this core rope memory,” explains historian David Mindell, an MIT professor of the history of engineering and manufacturing, in Moon Machines. “The rope is made up of rings and wires,” notes Margaret Hamilton, director of software engineering for the project, in the documentary. “If the wire goes through the core, it represents a 1, and around the core, it represents a zero.” A single program could take several months to weave, and an error would be a “nightmare to correct,” the documentary points out.
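Hamilton’s description amounts to a fixed lookup table woven in copper: each word’s sense wire either threads each core or bypasses it. Here’s a toy model in Python (a simplification for illustration only – it ignores the AGC’s actual multi-wire-per-core sensing scheme):

```python
# Toy model of core rope: a word is "woven" as the pattern of cores its
# sense wire threads (through = 1, around = 0), and "read" back from it.

def weave(words, bits=16):
    """'Weave' a rope: record, for each word, which cores its wire threads."""
    rope = []
    for word in words:
        # MSB first: True means the wire goes through that core.
        threaded = [bool((word >> i) & 1) for i in reversed(range(bits))]
        rope.append(threaded)
    return rope

def read(rope, address):
    """Read a word back: a threaded core contributes a 1, a bypassed core a 0."""
    value = 0
    for through in rope[address]:
        value = (value << 1) | (1 if through else 0)
    return value

program = [0x1234, 0xBEEF, 0x0000]
rope = weave(program)
assert [read(rope, a) for a in range(len(program))] == program
```

The point of the model is the asymmetry Hamilton describes: the data *is* the weave itself, so changing even a single bit means physically re-threading a wire.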
It could only have been more arcane if the little old ladies in question had woven it using their own hair or something.
That said, rope memory was quite the thing in its day, because it actually stored much more data than competing methods. “In the AGC, up to 64 wires could be passed through a single core,” writes Ralf-Udo Hartmann on his personal website. “A relatively large (by the standards of the time) amount of data could be stored in a small installed volume of core rope memory (72 kilobytes per cubic foot; roughly 2.5 megabytes per cubic meter); about 18-fold the amount of data per volume compared to standard read-write core memory.”
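Those figures are easy to sanity-check: 36,864 sixteen-bit words works out to exactly 72 KB, and converting 72 KB per cubic foot to cubic meters lands right at the quoted 2.5 MB. A quick check, using binary kilobytes/megabytes:

```python
# Sanity-check the AGC storage figures quoted above.

WORDS = 36_864               # fixed (rope) memory words
WORD_BITS = 16
FT3_PER_M3 = 3.28084 ** 3    # cubic feet per cubic meter (~35.31)

rope_bytes = WORDS * WORD_BITS // 8
print(rope_bytes // 1024)            # 72 -- the "72K" figure, exactly

per_m3_mb = rope_bytes * FT3_PER_M3 / 2**20
print(round(per_m3_mb, 2))           # ~2.48 -- i.e. "roughly 2.5 MB" per m^3
```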
Moreover, core rope memory was more robust than other kinds of memory, even some we use today, because it didn’t require power to operate and could survive in rugged environments like space and splashing down in the ocean. “Core memory is non-volatile storage – it can retain its contents indefinitely without power,” Hartmann writes. “It is also relatively unaffected by EMP and radiation. These were important advantages for some applications like first generation industrial programmable controllers, military installations and vehicles like fighter aircraft, as well as spacecraft, and led to core being used for a number of years after availability of semiconductor MOS memory (MOSFET). For example, the Space Shuttle flight computers initially used core memory, which preserved the contents of memory even through the Challenger’s explosion and subsequent plunge into the sea in 1986.”
On the other hand, there are those who claim that the core rope method is so arcane it couldn’t actually work, and use that as proof that either the moon landing itself was faked or the technology was faked to keep it out of the hands of competing governments.
Maybe they just didn’t trust their little old ladies.
If you sell disaster recovery products and solutions, you have a big batch of potential new customers: major U.S. airlines. Four airlines — Delta, Southwest, United, and American, which just happen to control 80 percent of all domestic travel in the U.S. — have suffered major outages in recent months that have been attributed to the lack of proper disaster recovery plans.
In August, Delta customers suffered major delays because of…a power failure? “A power outage in Atlanta, which began at approximately 2:30 a.m. ET, has impacted Delta computer systems and operations worldwide, resulting in flight delays,” the company reportedly wrote on its blog at the time. “Following the power loss, some critical systems and network equipment didn’t switch over to Delta’s backup systems,” reported Kim Nash in the Wall Street Journal.
“How could a company as technologically savvy and mature with its business processes as Delta not have a working disaster recovery plan?” writes M.J. Shoer in Seacoast Online. “It’s a fair question and we are still waiting to learn the details. I can’t for a minute believe Delta does not have a disaster recovery plan to deal with an event like this, but it failed. That begs the question as to when it was last tested and how often this plan is reviewed, revised and retested.”
In July, “Southwest Airlines canceled more than 2,000 flights over several days after an outage that it blamed on a faulty network router,” write Alastair Jamieson, Shamar Walters, Kurt Chirbas and Gabe Gutierrez for NBC News. While the company had redundancies built into its equipment, they didn’t work, according to CEO Gary Kelly. “A back-up system also failed, extending the outage. Ultimately the company had to replace the router and reboot 400 servers,” the company’s chief operating officer told Conor Shine of the Dallas Morning News.
Delta and Southwest aren’t alone. “Computer network outages have affected nearly all the major carriers in recent years,” writes David Koenig for the Associated Press. “After it combined IT systems with merger partner Continental, United suffered shutdowns on several days, most recently in 2015. American also experienced breakdowns in 2015, including technology problems that briefly stopped flights at its big hub airports in Dallas, Chicago and Miami.”
Altogether, thousands of flights were delayed, resulting in costs of millions of dollars. Southwest’s Kelly could even lose his job over the incident, after criticism from a number of employee unions.
“The meltdown highlights the vulnerability in Delta’s computer system, and raises questions about whether a recent wave of four U.S. airline mergers that created four large carriers controlling 85 percent of domestic capacity has built companies too large and too reliant on IT systems that date from the 1990s,” writes Susan Carey in the Wall Street Journal.
Moreover, it points to how vulnerable airlines are to terrorist attacks, writes Hugo Martin in the Los Angeles Times. In fact, at first, some even attributed Delta’s outage to a terrorist attack – maybe because it just seemed so unbelievable that an airline could be brought down by a power failure. But it’s happening.
So What’s Causing It?
“There have been several reservation system outages that have hit worldwide airline ops with distressing regularity over the past few years,” risk management specialist Robert Charette told Willie Jones of IEEE Spectrum. “Southwest Airlines had one just a few weeks ago. (It had another big one June of 2013 and another in October 2015.) What you’ll see in reviewing them is recurring problems with infrastructure (i.e., power, networks, routers, servers, etc.) that seem to keep surprising the airlines. In every case I can recall, there were backup systems in place, but they failed—another recurring theme. The Southwest CEO claimed that the last outage—caused by a router—was equivalent to a 1000-year flood. Not only was that a comical overstatement, but it also shows the thinking that is probably [leading to the airlines] skimping on contingency management preparations.”
Some have attributed the failures to too much consolidation in the airline industry and too much emphasis on efficiency. One study, Delivering the Benefits? Efficiencies and Airline Mergers, found that not only did mergers not save operational money, but often cost much more than expected – particularly in terms of integrating IT. Some have attributed Delta’s failure, in particular, to the retiring of its previous CIO, Theresa Wise – who merged the Delta and Northwest IT teams — in January.
“Is this a sign that airlines aren’t investing enough money in their IT infrastructure?” writes Adam Levine-Weinberg in The Motley Fool.
Airlines Aren’t Alone
On the other hand, airlines aren’t alone in having insufficient disaster recovery protection – just the most conspicuous. A survey – admittedly by a disaster recovery vendor – found that companies in general aren’t testing their disaster recovery systems enough. “When asked about how frequently they tested their DR environment, more than half of the respondents indicated that they tested less than once a year; even worse, a third said that they tested infrequently or never,” the survey found.
Airlines’ IT systems’ complexity makes it worse. “Since they are needed on a 24/7 basis, 365 days a year, it’s hard to fully test every potential scenario that could cause problems,” Levine-Weinberg writes. “As a result, it may be impossible to fully eliminate large-scale IT outages across the airline industry.”
The biggest problem with this failure in disaster recovery? The computer networks and systems can be repaired. Disaster recovery plans can be created for the next time. But repairing customers’ trust may not be so easy.
You could call it Porn Wars. The U.S. conflict with the Islamic State of Iraq and Syria (ISIS) is being fought not only on the battlefield – to the extent there even is a battlefield with the guerrilla-focused organization – but also psychologically, using pornography.
For example, the activist hacker group Anonymous reportedly hacked ISIS Twitter feeds, replacing terrorist-oriented content with rainbows and links to gay porn. “Messages posted to the compromised accounts include ‘I’m gay and proud’ and ‘Out and proud,’” writes Anthony Cuthbertson for Newsweek. “A link to a gay porn site is included in some of the hacked accounts, although no explicit images have been posted in respect to Islam.”
In addition, American military leadership is claiming that – far from holding encrypted data – seized ISIS laptops are not only unencrypted, but 80 percent full of porn, according to Lieutenant General Michael Flynn, a former chief of the Defense Intelligence Agency. “We looked a ruthless enemy in the eye — women and children, girls and boys, raped and exploited, the beheadings stored on a laptop next to pornography,” he reportedly writes in his book, The Field of Fight: How We Can Win the Global War Against Radical Islam and Its Allies, excerpted in the German newspaper Bild. “These sick, psychopathic enemies were not only unimaginably hideous, but also treacherous and torn.”
“Some of it was really bad, and it was all over the map. Some of it was kids, animals,” a former intelligence analyst told ABC News, confirming the general’s claim. “I think it’s indicative of their hypocrisy of what they’ve said they believe in — their perverted version of their religion.”
This claim is particularly damaging in the case of ISIS, a religiously oriented organization that is ostensibly against porn, particularly the pedophiliac- and bestiality-oriented sort the laptops reportedly contained. Assuming it’s true that the laptops are “full of porn,” this could be an attempt to make ISIS look bad to Americans, to other Muslims, or to both.
“The Islamic extremist group preaches a strict moral code and its caliphate operates under Shira[sic] Law,” writes Meg Wagner for the New York Daily News. “Drinking alcohol is a crime, women are forced to stay inside their homes unless they’re chaperoned by a male relative and anyone considered a heretic can be sold as a slave.” Osama bin Laden’s hideout was also said to be stocked with porn videos.
That said, some experts disagree with Flynn’s contention that pornography could be used to incite violence. “Porn has been the scapegoat for cheating on a spouse, the national divorce rate and even the 2014 University of California, Santa Barbara shootings, and now Flynn is correlating porn consumption with rape — these accusations are unsound,” writes Melanie Ehrenkranz in Tech.Mic. “Despite what anti-porn advocates argue, some experts say that watching violent pornography might actually help people fight their violent sexual urges.”
The timing of these revelations could also be related to the rumor – bruited scant days after the publication of his anti-Islam book – that Flynn, who was reportedly forced out of his position in 2014 due to clashes with leadership, was being considered as a possible Vice Presidential nominee by Republican Presidential candidate Donald Trump.
Is it Propaganda?
Moreover, the oodles of porn claim also fits in with the classic “dehumanizing the enemy” propaganda efforts of previous wars. “Tales of atrocities also can dehumanize, as readers of William Randolph Hearst’s newspapers learned when they got whipped up for the Spanish-American war with fake stories and sketches of Spanish atrocities that probably never happened,” wrote Michael James for ABC News in 2003. “Arguably, it set a pattern for phony or embellished American wartime propaganda that would last at least through the Gulf War.”
In what seems prescient now, James Forsher — a film historian and documentary filmmaker who has studied propaganda films, and an assistant professor of mass communications at California State University — predicted, “If things turn nasty [with Iraq], God knows what’s going to happen to them,” James writes. “It worries me. It worries me for the country and for Americans who have Middle Eastern ancestry.”
“I think the demonization of Islam and the Arab world is identical to what happened 100 years ago,” Yale University history professor Jay Winter told James. “The Arab is now a stock figure, a caricature, a symbol of fanaticism, of infinite cruelty and no regard for human rights.”
As you may recall, last fall the European Union grew concerned about the privacy of its citizens’ data when sent to U.S. companies, in light of the surveillance practices revealed by Edward Snowden. The European Court of Justice gave the EU and the U.S. until the end of January to come up with a better way to transfer data between themselves.
The good news is, they did. But the bad news is, they got it settled just in time for Brexit – the UK’s exit from the EU – to screw it all up again.
So here’s the good news. The European Commission, the EU’s executive body, announced on July 12 that it had adopted the EU-U.S. Privacy Shield, which was announced on February 2 and replaces the previous “Safe Harbor” framework, no longer considered adequate. Further modifications were made after one European committee found the initial version also inadequate. Among its provisions:
- The U.S. Department of Commerce will conduct regular updates and reviews of participating companies, to make sure they’re following the rules.
- The U.S. promises that it won’t perform indiscriminate mass surveillance on the data it gets from EU citizens – and citizens have a method of redress if they think this isn’t being followed.
- The European Commission and the U.S. Department of Commerce will review the program every year to make sure it’s doing what it’s supposed to.
Without the agreement, more than 4,000 European and U.S. companies wouldn’t have been able to exchange citizens’ data as easily, which could make commerce more difficult. That trade is currently worth up to $260 billion, writes Mark Scott in the New York Times.
The next step is for companies to sign up for the self-certification program, which starts August 1, write Maria Camila Tobon and Alfred J. Saikali of the law firm Shook Hardy & Bacon LLP on the legal website Lexology. Google and Microsoft said they would be signing up for the new program; Facebook said it was still thinking about it. (The other, more-cumbersome alternatives to Safe Harbor are still around.) If companies decide to join within two months, they will then have nine months to actually comply; companies that decide after the two months will have to comply right away. Violators can be fined up to €20 million.
All that stipulated, the user data privacy legal case that spawned this activity is still going on. Austrian privacy activist Max Schrems sued Facebook to determine whether the personal data it transfers back to the U.S. is properly protected from U.S. government surveillance – the case that started the whole Safe Harbor issue. In fact, as the Privacy Shield was getting settled, an Irish court was ruling that the U.S. could join Facebook’s case, since the precedent was so significant, writes Padraic Halpin for Reuters.
That ruling was important because it means that the U.S., as well as organizations such as the Electronic Privacy Information Centre, the trade group Business Software Alliance, and the European alliance of Internet companies Digital Europe, can present evidence as an amicus curiae in the case because they have a bona fide interest and are not just a “meddlesome busybody,” The Hill’s Joe Uchill quotes the Irish High Court as saying. (In contrast, the American Civil Liberties Union and the Electronic Frontier Foundation were denied this privilege, he adds.) How long that lawsuit is going to take, nobody is speculating.
And What About Brexit?
With its departure from the EU, the UK may face issues similar to the U.S.’s, depending on how the negotiations go that will determine whether it remains part of the European Economic Area, writes Claus Farber in the National Law Review. That process could take two years, he writes. On the other hand, some see the UK’s departure from the EU as an advantage in that area.
In addition, Canada is concerned that the EU will find its data transfer regulations inadequate as well.
Moreover, some people feel that even the new user data privacy standards still don’t go far enough. “Legal challenges are already being prepared, and the European Court of Justice — the same court that overturned the previous trans-Atlantic data transfer deal — is likely to review the Privacy Shield to see if it meets European standards,” Scott writes. So in a year or two, we may be right back where we started from.
Is data owned by a company based in one country, but stored in another country, subject to seizure by warrant by the government of the first country? U.S. courts have been batting the issue back and forth since 2014, and the most recent verdict is that no, it isn’t.
The subject of the case (which has a lot of complex legal nuance) is Microsoft, which reportedly had email messages from one of its customers stored on a server in Ireland. The Department of Justice wanted access to those email messages while pursuing an unspecified case, claiming that since the messages were controlled by Microsoft, an American company, the DoJ had jurisdiction over them even though they were stored in Ireland.
This scared people, for multiple reasons.
- Because so many computer companies are American, it would mean an awful lot of data worldwide would be subject to access by the U.S. government.
- Computer companies worried that worldwide customers would stop using them because they were afraid they’d get their data accessed.
- Having the data subject to U.S. access could mean that the company – Microsoft in this case, but any company – could be violating data privacy laws in force in the second country. (For that reason, dozens of companies and civil liberties organizations – as well as the government of Ireland itself — filed amicus curiae briefs supporting Microsoft.)
- If this precedent was set with the U.S., all the other countries in the world could declare that, in that case, all their data laws could apply to any company doing business in their countries, which could be an incredibly complicated, contradictory mess.
So, needless to say, everybody but the Department of Justice was pretty happy with the 3-0 decision by the 2nd U.S. Circuit Court of Appeals in Manhattan. “Circuit Judge Susan Carney said communications held by U.S. service providers on servers outside the United States are beyond the reach of domestic search warrants issued under the Stored Communications Act, a 1986 federal law,” writes Jonathan Stempel for Reuters. “Congress did not intend the SCA’s warrant provisions to apply extraterritorially,” Carney wrote in her decision. “The focus of those provisions is protection of a user’s privacy interests.”
For its part, the Department of Justice said that if users could stash their data overseas, it would be much harder to catch bad guys. While there are other methods by which the U.S. government could request data stored in a foreign country, the DoJ said they are cumbersome, Stempel writes.
For that matter, the new ruling – which applies only in the region served by the court — means a company could ostensibly choose to store data in a foreign country to keep it out of the hands of law enforcement, notes Peter Henning in the New York Times. “Imagine the possibility that a company might offer an email service — perhaps called ‘Crim Mail’ — guaranteeing users that their electronic files would be stored overseas,” he speculates. “It could even choose to put servers in a location that is notably hostile to the United States and that would welcome the chance to throw a wrench in law enforcement efforts. The company could charge a premium to have files maintained only in specified locations, making it almost impossible for investigators to ever gain access to them.” He called such a possibility far-fetched, though.
But hey! We’re not done! The Department of Justice could still appeal the appeal, as it were, which would send the issue to the U.S. Supreme Court. “We are disappointed with the court’s decision and are considering our options,” said U.S. Department of Justice spokesman Peter Carr in a statement cited by Elizabeth Weise in USA Today; Weise reported that the government was expected to appeal.
What could happen there is anyone’s guess. First, we don’t actually know much about the current court’s views on worldwide data sovereignty. Second, at the moment we have only eight Justices, meaning that a tied decision would apply only to this single case, not as a precedent. If a ninth Justice is appointed by then, who it is and what their views are will depend a lot on who gets elected President in November – not to mention which party ends up in control of the Senate, which has to approve any new Justice before they can be seated. Plus, if the current eight-member Court starts hearing arguments before a new Justice is seated, that Justice still wouldn’t be able to rule on the case.
The appeals court judges suggested that what really needs to happen is for Congress to write new laws that make it less complicated to request data stored in another country, while still giving that country sovereignty over the data stored within its borders.
“The case turned on how broadly a court can apply the Stored Communications Act, a law adopted in 1986 as part of the Electronic Communications Privacy Act to give a measure of protection to the then-nascent technology of email,” Henning writes. “Like almost any 30-year-old law dealing with technology, it is hopelessly out of date because it has not been meaningfully updated by Congress to address how digital information is created and stored.”
Chances are, that too is something that could come out very differently depending on who gets elected President. Another reason to vote in November.
If you’ve ever been a customer of Sports Authority, you may now suddenly be a customer of Dick’s Sporting Goods. And you probably never felt a thing.
Sports Authority, which filed for Chapter 11 bankruptcy protection in March, recently auctioned its intellectual property, including its name, its e-commerce site, and about 114 million customers’ files and 25 million email addresses, writes Alex Schiffer in the Los Angeles Times.
Dick’s Sporting Goods, which also purchased a number of the company’s other assets, won the data with a $15 million bid, a sum that the Jackson, Mississippi Clarion-Ledger‘s Bill Moak thought a steal. “For just $15 million, Sports Authority rival Dick’s Sporting Goods snapped up the files,” he writes. “Many analysts consider that a paltry sum to pay for such potentially-valuable information, and could potentially help Dick’s build its presence in new areas where Sports Authority had previously dominated.”
As you may recall, a year ago the bankruptcy sale of Radio Shack raised the issue of who owns personal data about customers and what rights a company has to sell that data as an asset to another company. While everyone was having kittens about the issue last year, the similar issue this year with the bankruptcy sale of Sports Authority hasn’t drawn nearly the attention. It’s not clear whether everyone’s simply gotten used to the idea, or sporting goods users just aren’t as up on the issue as electronics nerds were.
(Ironically, Fortune even compared Sports Authority’s bankruptcy and potential future business model to that of Radio Shack.)
The uproar over Radio Shack’s proposed sale of customer data – including attention from the Federal Trade Commission (FTC) — eventually led it to cancel the sale of the data as a separate item and instead include it as part of a package deal with its stores: the hedge fund Standard General bought RadioShack’s business and customer data together for $26.2 million, Schiffer writes. Customers also had the right to remove their information, but had only a week to do so, writes Laura Northrop for The Consumerist. Plus, since Sprint bought some of the assets, information having to do with Verizon and AT&T was also withheld from the sale.
But such interventions appear to be coming less and less often as companies realize the value of the data and make provisions for being able to exploit it.
It’s not like this is the first time that companies have bought and sold customer data, Moak writes. Anyone who’s ever used an odd middle initial so they can track what happens to their name after they give it to a company knows that. What’s different is companies seeing that data as a separate asset to sell in a bankruptcy sale.
“The sale of customer information is a huge industry, and many consumers are blissfully unaware their information is being bought and sold every day,” he writes. “Email addresses, physical addresses, phone numbers and other information is tremendously valuable to companies looking to build sales, and they collect it in numerous ways — often voluntarily through loyalty and discount programs.”
A New York Times investigation of the top 100 websites in the U.S. found that of the 99 sites with English-language terms of service or privacy policies, 85 said they might transfer users’ information if a merger, acquisition, bankruptcy, asset sale or other transaction occurred. “The sites with these provisions include prominent consumer technology companies like Amazon, Apple, Facebook, Google and LinkedIn, in addition to Hulu,” write Natasha Singer and Jeremy Merrill. Plus, only a few sites included an opt-out policy. Even the Times itself says it could sell customer data and doesn’t include an opt-out provision, they report.
It seems likely that in the future, more stores are going to follow Sports Authority’s lead. Users for whom this is an issue will need to start reading policy documents and plan their shopping accordingly.
The bankruptcy court still has to approve the sale.
Let’s say you were bad. (Of course, none of you are, but let’s say you were, for the sake of argument.) The government wants to look at a few files on your computer, but makes a copy of all of them, because, you know, it’s just easier.
Then, three years later, the government decides you were bad again. And remember all those files it had picked up the first time you were bad, that it didn’t really need but just picked up because it was easier? It goes back through those files, which it still had, and uses them to convict you on the new charge.
It sounds nuts, but that’s what’s happened to a Connecticut accountant, Stavros Ganias. As he was the accountant for a company suspected of fraud with the Army, the Army wanted to look at the relevant files on his computers. To do this, in 2003 it made a mirror copy of his files – all of them, including his personal financial data – and took it back to look at.
In 2006, the Internal Revenue Service – which had been working with the Army on the case – decided that perhaps Stavros had failed to properly declare his income. So the IRS got a warrant to look at the files it already had of Ganias’ personal financial data, instead of going back to him to get copies. You know, since the government had it around anyway. After he was convicted, he appealed, saying the government shouldn’t have had that data in the first place. But the Second Circuit court has now ruled that it was okay for the government to do that.
Fun With the Fourth
The thing is, the whole point of having a Fourth Amendment and requiring a warrant at all is so the government can’t just seize everything you have and go on a fishing expedition, or what is called a “general” warrant – a point that Judge Denny Chin, who wrote the original opinion, made in a 40-page dissent.
Frustratingly, the Second Circuit court had originally ruled the other way – with a ruling that had been praised by civil liberties advocates. “Courts have long held that the practicality of computer search and seizure allows government agents to seize computers and search them later for responsive files,” wrote Orin Kerr of the Volokh Conspiracy in June 2014, in the Washington Post, about the original ruling. “In Ganias, the Second Circuit makes clear that the government’s right to overseize is temporary, and that it has no right to continue to retain the non-responsive files indefinitely. The court doesn’t say exactly when the government has to destroy, delete, or return its copy of the non-responsive files. But the Second Circuit does make clear that the government has such a duty.”
However, a year later, the Second Circuit decided, on its own, to reconsider the case en banc. In other words, rather than having three judges decide the case, all 13 judges heard it in one big group, which happens only every couple of years or so with major cases. The en banc court overturned the original three-judge panel’s ruling.
Now, all bets are off, writes Andrew Crocker, staff attorney for the Electronic Frontier Foundation (EFF). “Had Ganias’ files been stored on paper, this would have been a simple case,” he writes. “As the Ninth Circuit explained in United States v. Tamura, police may do a cursory examination of files in a filing cabinet to determine which are included in a warrant, but they can only seize items outside that warrant for off-site review in very limited circumstances. And even then, non-responsive items must be promptly returned.”
Storage is Hard
What led the en banc court to make a different ruling? Separating the data, it ruled, would simply have been too hard. In fact, the government’s original request for a rehearing was on the grounds that not using the additional files would be too expensive.
The court also said the additional files couldn’t have been returned or destroyed in the first place because of the need to preserve the chain of evidence, and because hard disks store files in all sorts of little pieces scattered all over the place. “Though to a user a hard drive may seem like a file cabinet, a digital forensics expert reasonably perceives the hard drive simply as a coherent physical storage medium for digital data — data that is interspersed throughout the medium, which itself must be maintained and accessed with care, lest this data be altered or destroyed,” the Court writes.
Besides, since it had gone to the effort of getting the second warrant, obviously the DOJ meant well, writes Scott Greenfield in the Simple Justice legal blog. “It’s not that the en banc majority disputed the idea that computer hard drives contained vast amounts of information beyond that for which the warrant authorized seizure, or the government holding onto the entirety of the mirrored hard drive evidence for two and half years beyond the end of the investigation for which it was seized, just because,” he writes. “It’s that, by obtaining a second warrant, all evils magically disappear, because it was covered by the government’s ‘good faith.’”
Well, He Didn’t *Ask* for It Back
In addition, the court points out that since Ganias hadn’t asked to have his data returned, apparently he didn’t mind that the government kept a copy of it. But that argument was specious, Kerr wrote in 2014, when the question first came up. “Imposing such a prerequisite makes little sense in this context, where Ganias still had the original computer files and did not need the Government’s copies to be returned to him,” he writes. Moreover, if the government had been saying it was too hard to return or delete the files anyway, what would have been the point of asking? he continued.
Ultimately, this ruling might apply to other cases as well, Crocker warns. “The court’s discussion of Ganias’ failure to seek the return of his data before 2006 could set a dangerous norm of allowing broad searches, putting the burden on users to sue the government if they object,” he writes. “By failing to require the deletion of overcollected data, the Ganias court may provide a perverse incentive to retain when the government has no good reason to do so.”
Backup service Backblaze has released the quarterly report on its hard drive statistics. While there wasn’t anything new that particularly leapt out of this report, the company noted that it has now passed one billion hours of disk drive operation. Or, as Backblaze director of product marketing Andy Klein notes in a blog post, 42 million days or 114,155 years’ worth of spinning hard drives.
Which is what’s valuable about the report. People can do their own hard drive testing, but when it comes to sheer volume, there’s hardly anyone who uses disk drives more consistently. And while there may be others of a similar volume – Google and Facebook come to mind – they don’t typically release this sort of data to the public.
That doesn’t necessarily mean, of course, that you might see the same sort of performance yourself – anybody can get a bum drive – but it’s a good way to bet. Backblaze builds “pods” out of commodity hard drives, stripping off everything extraneous and cramming as many drives as possible into the rack space. (It even changed out the power switch this year after finding a cheaper model.) The company then builds a “vault” out of 20 pods, which totals 1,200 hard drives. It fills at least three vaults a month.
The design of the pod – which is open-sourced – gets updated periodically. For example, Backblaze is now using pods of 60 drives – though they poke out the back of the server rack — instead of 45, giving it a capacity of up to 480TB, which costs the company less than a nickel per gigabyte. “That’s a 33 percent increase to the storage density in the same rack space,” writes Klein. “Using 4TB drives in a 60-drive Storage Pod increases the amount of storage in a standard 40U rack from 1.8 to 2.4 Petabytes. Of course, by using 8TB drives you’d get a 480TB data storage server in 4U server and 4.8 Petabytes in a standard rack,” he adds.
Such changes don’t save much on an individual basis, but add up, Klein writes. “Saving $0.008 per GB may not seem very innovative, but think about what happens when that trivial amount is multiplied across the hundreds of Petabytes of data,” he writes.
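The density arithmetic Klein cites is easy to check. Here’s a minimal sketch in plain Python (using decimal terabytes, as drive makers count them, and assuming ten 4U pods per standard 40U rack, per the figures quoted above):

```python
# Storage-density arithmetic from the Backblaze pod figures quoted above.

def pod_capacity_tb(drives_per_pod, drive_size_tb):
    """Raw capacity of one Storage Pod, in terabytes."""
    return drives_per_pod * drive_size_tb

def rack_capacity_pb(drives_per_pod, drive_size_tb, pods_per_rack=10):
    """Raw capacity of a full 40U rack of 4U pods, in petabytes (1 PB = 1000 TB)."""
    return pods_per_rack * pod_capacity_tb(drives_per_pod, drive_size_tb) / 1000

print(pod_capacity_tb(60, 8))    # 60-drive pod with 8TB drives: 480 TB
print(rack_capacity_pb(45, 4))   # old 45-drive pods, 4TB drives: 1.8 PB per rack
print(rack_capacity_pb(60, 4))   # 60-drive pods, 4TB drives: 2.4 PB per rack
print(rack_capacity_pb(60, 8))   # 60-drive pods, 8TB drives: 4.8 PB per rack
```

The savings figure scales the same way: at a million gigabytes per petabyte, that “trivial” $0.008 per GB comes to $8,000 per petabyte stored.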
So here’s the noteworthy information about this quarter.
- The company has 61,590 hard drives, an increase of 9.5 percent over year-end 2015, when they evaluated 56,224.
- There are seven kinds of drives for which the company had no failures: Hitachi 4TB (the 4040B), Hitachi 8TB, Seagate 1.5TB, Seagate 6TB, Toshiba 4TB, Toshiba 5TB, and Western Digital 4TB.
- The three kinds of drives with the worst failure rates are the Seagate 4TB, the Toshiba 3TB, and Western Digital 2TB. Note that these are among the oldest drives the company has. The company also points out that because it only has a few of the Toshiba 3TB drives, that figure is based on a single drive failure.
- Overall, Backblaze’s disk failure rates are getting better. “The overall Annual Failure Rate of 1.84% is the lowest quarterly number we’ve ever seen,” Klein writes.
- Since a year ago at this time, Backblaze has stopped using four kinds of drives, all from Seagate: two 3TB models, a 2TB model, and one of its two 1.5TB models. This was partly due to low capacity and partly to their high failure rates.
- At this point, the majority of the drive hours are on 4TB drives. “The 4TB drives have been spinning for over 580 million hours,” Klein writes. “There are 48,041 4TB drives which means each drive on average had 503 drive days of service, or 1.38 years. The annualized failure rate for all 4TB drives lifetime is 2.12 percent.”
- Of the four primary manufacturers the company uses – Hitachi, Seagate, Toshiba, and Western Digital – Seagate by far has the highest failure rate (though it’s dropped significantly since last year), followed by Western Digital. Hitachi is the lowest.
- That said, Backblaze these days is buying primarily Seagate and Hitachi drives, because they’re easier to find in large quantities (5,000 to 10,000 at a time) for a reasonable price.
- Backblaze doesn’t use many 6-, 8-, and 10TB drives, because they are either too expensive per terabyte or they aren’t available in a large enough quantity.
- The company primarily uses 5400 rpm drives, because it doesn’t need the speed of the 7200 rpm drives and they use less electricity.
- Seagate 8TB SMR drives didn’t work well for their purposes in their environment.
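The annualized failure rate several of the bullets above cite is straightforward to reproduce from the raw counts: failures divided by total drive-years of service. A minimal sketch (the failure and drive-day numbers in the example are hypothetical, for illustration only):

```python
def annualized_failure_rate(failures, drive_days):
    """Backblaze-style AFR: failures per drive-year of service, as a percentage."""
    drive_years = drive_days / 365.0
    return 100.0 * failures / drive_years

# Hypothetical example: 100 failures over 1,825,000 drive-days
# (5,000 drive-years) works out to a 2.0% annualized failure rate.
print(annualized_failure_rate(100, 1_825_000))  # 2.0
```

This is also why the small-sample caveat about the Toshiba 3TB drives matters: with only a handful of drives, a single failure divided by very few drive-years produces an alarming-looking rate.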
Critics point out that in some cases Backblaze is comparing Hitachi enterprise-class drives (which purportedly are designed for heavier loads) with consumer-grade Seagate drives, and raise various other quibbles about comparing drives of different types and ages. That said, it’s still a useful batch of data.
In addition to releasing the report, the company also releases the raw data for people who like to play with such things; one analyst, for example, applied the same techniques used to study the survival of cancer patients to the Backblaze data.
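For the curious, that survival-analysis approach typically means a Kaplan-Meier estimator: drives that fail are “events,” and drives still spinning when the data ends are “censored” observations. Here’s a minimal sketch of the idea (the data is made up; a real analysis would use a library such as lifelines):

```python
def kaplan_meier(durations, failed):
    """Minimal Kaplan-Meier survival curve.

    durations: days each drive was observed
    failed: True if the drive failed on that day, False if still running (censored)
    Returns a list of (day, estimated probability of surviving past that day).
    """
    events = sorted(zip(durations, failed))
    at_risk = len(events)
    survival = 1.0
    curve = []
    i = 0
    while i < len(events):
        day = events[i][0]
        failures = leaving = 0
        # Group all drives whose observation ends on this day.
        while i < len(events) and events[i][0] == day:
            failures += events[i][1]
            leaving += 1
            i += 1
        if failures:
            survival *= (at_risk - failures) / at_risk
            curve.append((day, survival))
        at_risk -= leaving  # censored drives drop out of the risk set too
    return curve

# Made-up data: three drives fail at days 100, 200, and 400; two are censored.
print(kaplan_meier([100, 200, 300, 400, 500],
                   [True, True, False, True, False]))
```

The appeal for drive data is the same as for patient data: most drives haven’t failed yet, and the estimator uses those incomplete lifetimes instead of throwing them away.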
Disclaimer: I am a Backblaze customer.
As you may recall, governments and government agencies such as the Federal Bureau of Investigation (FBI) are building giant databases of faces, drawing on sources such as Department of Motor Vehicles records, and then using facial recognition software with that data to help identify criminals.
As it turns out, the FBI facial database is much bigger than anyone thought, much less reliable, and the FBI doesn’t want to consider it protected information. And that opinion isn’t from some sort of consumer privacy group: It’s from the government itself.
The Government Accountability Office (GAO) released its report, FACE Recognition Technology: FBI Should Better Ensure Privacy and Accuracy in May with this information about the FBI’s Facial Analysis, Comparison, and Evaluation (FACE) program. It was provided to the Senate Subcommittee on Privacy, Technology and the Law in the Committee on the Judiciary. (The whole report is 68 pages long and well worth reading.)
“The GAO found that the FBI has been disregarding some of even the most basic privacy protections and standards,” writes Kia Makarechi in Vanity Fair. “To wit: the driver’s-license photos of the residents of 16 states and some additional 30 million photos from a biometric database are available for the FBI to search at will. Another 18 states are reportedly negotiating with the FBI over the use of driver’s-license images.”
Altogether, the FBI facial database has access to more than 411 million pictures, the GAO reports. “FBI’s Facial Analysis, Comparison, and Evaluation (FACE) Services unit not only has access to FBI’s Next Generation Identification (NGI) face recognition database of nearly 30 million civil and criminal mug shot photos, it also has access to the State Department’s Visa and Passport databases, the Defense Department’s biometric database, and the drivers license databases of at least 16 states,” writes the Electronic Frontier Foundation. “Totaling 411.9 million images, this is an unprecedented number of photographs, most of which are of Americans and foreigners who have committed no crimes.”
In addition, the FBI hadn’t sufficiently notified the public of the technology’s use, the report found. In 2008, the FBI published a PIA [Privacy Impact Assessment] of its plans, the report notes. “However, the FBI did not publish a new PIA or update the 2008 PIA before beginning the NGI-IPS pilot in December 2011 or as significant changes were made to the system through September 2015. In addition, [Department of Justice] DOJ did not approve a PIA for FACE Services until May 2015 — over three years after the unit began supporting FBI agents with face recognition searches.” The DOJ also did not complete a System of Records Notice (SORN) in a timely manner, the report adds.
As far as the facial recognition part goes, the FBI had previously said that as many as 20 percent of the identifications were incorrect. It turns out that the agency was wrong, and it actually has no idea how often the software returns false positives – but the FBI contends it doesn’t matter, the GAO writes. “According to FBI officials, because the results are not intended to serve as positive identifications, the false positive rate requirement is not relevant.” The GAO went on to point out that, according to government specifications, that belief is incorrect, and listed several ways that the FBI could have tested this.
In addition, the FBI doesn’t go to any effort to ensure the accuracy of the photos it gets from other sources, such as state databases, saying that it was the responsibility of the external source, the GAO report continues. “However, states generally use their face recognition systems to prevent a person from fraudulently obtaining a drivers’ license under a false name, while the FBI uses face recognition to help identify, among other people, criminals for active FBI investigations,” the report notes. “Accuracy requirements for criminal investigative purposes may be different.” Moreover, other federal agencies such as the Transportation Security Administration (TSA) perform their own accuracy checks, the report adds.
This is particularly an issue because facial recognition software is not created equal. Systems developed in different countries tend to do best with people of the most prevalent race in that country, and less well with minorities, write Clare Garvie and Jonathan Frankle in The Atlantic. For example, in the U.S., the facial recognition algorithms work better with pictures of white people than those of black people.
False positives can create problems down the line, according to the GAO. “Given that the accuracy of a system can have a significant impact on individual privacy and civil liberties as well as law enforcement workload, it is essential that both the detection rate and the false positive rate for all allowable candidate list sizes are assessed prior to the deployment of the system,” the report notes. “According to a July 2012 Electronic Frontier Foundation hearing statement, false positives can alter the traditional presumption of innocence in criminal cases by placing more of a burden on the defendant to show he is not who the system identifies him to be.”
It turns out, also, that facial recognition searches are more common than wiretaps — but don’t have the same protections, notes Alvaro Bedoya, the executive director of the Center on Privacy and Technology at Georgetown University Law Center. “According to the GAO, the unit fielded approximately 214,920 searches or requests between 2011 and 2015 — 36,420 involving the 16 states’ driver’s-license photos,” Makarechi writes. “Overall, FACE found 8,590 cases in which a ‘likely candidate’ was returned to an FBI agent.”
To add insult to injury, the FBI requested in May that its biometrics database – including fingerprints and facial photographs — be exempted from certain provisions of the Privacy Act, a move that the EFF is also fighting.
“The big concern is that the FBI is proposing to exempt NGI from any requirement that they update or correct data about somebody in the future,” Jennifer Lynch, EFF senior staff attorney, tells Ellen Nakashima of the Washington Post. In response to concerns from the EFF and more than 40 other groups, the Justice Department has extended the comment period to July 6, Nakashima adds.
In the meantime, smile. The FBI may be watching.
The e-discovery acquisition market has been pretty much a snore the past few years, after several years of musical chairs. But things might start heating up again due to a major acquisition from an unexpected source: enterprise information management software vendor OpenText has acquired Recommind for a reported $163 million, scheduled to close in Q1 2017.
Recommind has been reliably, though quietly, in the leaders section of the Gartner Magic Quadrant for e-discovery software for years now (though it did start out in 2011 as a “visionary”). The acquisition might lead other vendors to take a look at the other leaders – something Gartner actually predicted in last year’s MQ – particularly since 2015 featured a number of other peripheral e-discovery acquisitions.
Incidentally, you’ve got to figure that Gartner has to be really POed about the timing of this announcement; it typically releases its annual Magic Quadrant on e-discovery about this time of year, and now it will either have to scramble to rewrite it, or publish it knowing it’s immediately out of date.
Ian Lopez of Legaltech News points out that Recommind had lost several executives recently. “Among them are former VP of business development Dean Gonsowski, who is now with kCura; Bob Clemens, who left his position as vice president of sales in America at Recommind to join CS Disco; and Philip Favro, who left his post as Recommind’s senior discovery counsel to join Driven as a consultant,” he writes.
What’s particularly interesting about this acquisition is that it’s OpenText. While a number of e-discovery vendors have been acquired over the years, generally the acquirers have been fairly major vendors – Symantec, HP, Microsoft, and so on. But OpenText? It’s interesting that a vendor in that market sees e-discovery as a field worth entering. Image and Data Manager suggests it’s because of the increasing threat of litigation that companies face over their data.
And what OpenText will do with Recommind is not yet clear. OpenText could take Recommind’s technology, deploy it within its own proprietary offerings as Microsoft did with Equivio, and withdraw from the e-discovery industry altogether; or Recommind could continue in e-discovery as part of OpenText, as Clearwell did when Symantec bought it, Lopez quoted Favro as saying.
OpenText has made a number of other acquisitions recently, and it’s thought that it’s not quite done. The company recently bought ANXeBusiness, a provider of cloud-based information exchange services for the automotive and healthcare industries, for about $100 million, and paid $170 million for some pieces of HP, including the HP TeamSite multichannel digital experience management platform, the digital asset management solution HP MediaBin, and the intelligent workforce optimization solution HP Qfiniti, reports the Waterloo Region Record.
In fact, content management consultant Laurence Hart thinks that OpenText might even acquire EMC’s red-headed stepchild Documentum. “We knew more acquisitions were coming after they then announced that they were raising $600 million and are planning to spend upwards of $3 billion on acquisitions over the next five years,” he writes. “It is fun to conjecture about the possibilities of Open Text acquiring Documentum from EMC. The problem is that Documentum is a huge acquisition. Documentum will cost a lot and that $600 million Open Text is raising would likely not cover the cost. The price does fit into Open Text’s 5 year plan financially but the acquisition would front-load that $3 billion in 2016.”
In any event, the announcement could set off another round of musical chairs in the e-discovery market, Favro notes.