Yottabytes: Storage and Disaster Recovery


June 30, 2016  8:54 PM

Court Backtracks on Ganias Data Collection

Sharon Fisher
E-discovery, ediscovery, government, law, privacy, Security

Let’s say you were bad. (Of course, none of you are, but let’s say you were, for the sake of argument.) The government wants to look at a few files on your computer, but makes a copy of all of them, because, you know, it’s just easier.

Then, three years later, the government decides you were bad again. And remember all those files it had picked up the first time you were bad, that it didn’t really need but just picked up because it was easier? It goes back through those files, which it still had, and uses them to convict you on the new charge.

It sounds nuts, but that’s what happened to a Connecticut accountant, Stavros Ganias. Because he was the accountant for a company suspected of defrauding the Army, the Army wanted to look at the relevant files on his computers. To do this, in 2003 it made a mirror copy of his files – all of them, including his personal financial data – and took it back to look at.

In 2006, the Internal Revenue Service – which had been working with the Army on the case – decided that perhaps Stavros had failed to properly declare his income. So the IRS got a warrant to look at the files it already had of Ganias’ personal financial data, instead of going back to him to get copies. You know, since the government had it around anyway. After he was convicted, he appealed, saying the government shouldn’t have had that data in the first place. But the Second Circuit court has now ruled that it was okay for the government to do that.

Fun With the Fourth

The thing is, the whole point of having a Fourth Amendment and requiring a warrant at all is so the government can’t just seize everything you have and go on a fishing expedition, or what is called a “general” warrant – a point that Judge Denny Chin, who wrote the original opinion, made in a 40-page dissent.

Frustratingly, the Second Circuit court had originally ruled the other way – with a ruling that had been praised by civil liberties advocates. “Courts have long held that the practicality of computer search and seizure allows government agents to seize computers and search them later for responsive files,” wrote Orin Kerr of the Volokh Conspiracy in June, 2014, in the Washington Post, about the original ruling. “In Ganias, the Second Circuit makes clear that the government’s right to overseize is temporary, and that it has no right to continue to retain the non-responsive files indefinitely. The court doesn’t say exactly when the government has to destroy, delete, or return its copy of the non-responsive files. But the Second Circuit does make clear that the government has such a duty.”

However, a year later, the Second Circuit decided, on its own, to reconsider the case en banc. In other words, rather than having three judges decide the case, all 13 judges heard it in one big group, which happens only every couple of years or so with major cases. The en banc court overturned the Second Circuit’s original ruling.

Now, all bets are off, writes Andrew Crocker, staff attorney for the Electronic Frontier Foundation (EFF).  “Had Ganias’ files been stored on paper, this would have been a simple case,” he writes. “As the Ninth Circuit explained in United States v. Tamura, police may do a cursory examination of files in a filing cabinet to determine which are included in a warrant, but they can only seize items outside that warrant for off-site review in very limited circumstances. And even then, non-responsive items must be promptly returned.”

Storage is Hard

What led the en banc court to make a different ruling? Essentially, that it would be too hard to separate the data. In fact, the government’s original request for a rehearing was on the grounds that not using the additional files would be too expensive.

The court also said the additional files couldn’t have been returned or destroyed in the first place because of the need to preserve the chain of evidence, and because hard disks store files in all sorts of little pieces scattered all over the place. “Though to a user a hard drive may seem like a file cabinet, a digital forensics expert reasonably perceives the hard drive simply as a coherent physical storage medium for digital data — data that is interspersed throughout the medium, which itself must be maintained and accessed with care, lest this data be altered or destroyed,” the Court writes.

Besides, since it had gone to the effort of getting the second warrant, obviously the DOJ meant well, writes Scott Greenfield in the Simple Justice legal blog. “It’s not that the en banc majority disputed the idea that computer hard drives contained vast amounts of information beyond that for which the warrant authorized seizure, or the government holding onto the entirety of the mirrored hard drive evidence for two and half years beyond the end of the investigation for which it was seized, just because,” he writes. “It’s that, by obtaining a second warrant, all evils magically disappear, because it was covered by the government’s ‘good faith.’”

Well, He Didn’t *Ask* for It Back

In addition, the court points out that since Ganias hadn’t asked to have his data returned, apparently he didn’t mind that the government kept a copy of it. But that was specious, Kerr wrote in 2014, when that question came up before. “Imposing such a prerequisite makes little sense in this context, where Ganias still had the original computer files and did not need the Government’s copies to be returned to him,” he writes. Moreover, if the government was saying it was too hard to return or delete the files anyway, what would have been the point of his asking? he continued.

Ultimately, this ruling might apply to other cases as well, Crocker warns. “The court’s discussion of Ganias’ failure to seek the return of his data before 2006 could set a dangerous norm of allowing broad searches, putting the burden on users to sue the government if they object,” he writes. “By failing to require the deletion of overcollected data, the Ganias court may provide a perverse incentive to retain when the government has no good reason to do so.”

June 27, 2016  11:08 PM

Fun With Hard Drive Statistics

Sharon Fisher
Backblaze, Hitachi, Seagate, Storage, toshiba, western digital

Backup service Backblaze has released its quarterly hard drive statistics report. While there wasn’t anything new that particularly leapt out of this report, the company noted that it has now passed one billion hours of disk drive operation. Or, as Backblaze director of product marketing Andy Klein notes in a blog post, 42 million days or 114,155 years’ worth of spinning hard drives.

Which is what’s valuable about the report. People can do their own hard drive testing, but when it comes to sheer volume, there’s hardly anyone who uses disk drives more consistently. And while there may be others of a similar volume – Google and Facebook come to mind – they don’t typically release this sort of data to the public.

That doesn’t necessarily mean, of course, that you might see the same sort of performance yourself – anybody can get a bum drive – but it’s a good way to bet. Backblaze uses commodity hard drives to build “pods” where they strip off everything extraneous and cram as many drives as possible into rack space. (They even changed out the power switch this year after finding a cheaper model.) The company then builds a “vault” out of 20 pods, which totals 1,200 hard drives. It fills at least three vaults a month.

The design of the pod – which is open-sourced – gets updated periodically. For example, Backblaze is now using pods of 60 drives – though they poke out the back of the server rack — instead of 45, giving it a capacity of up to 480TB, which costs the company less than a nickel per gigabyte. “That’s a 33 percent increase to the storage density in the same rack space,” writes Klein. “Using 4TB drives in a 60-drive Storage Pod increases the amount of storage in a standard 40U rack from 1.8 to 2.4 Petabytes. Of course, by using 8TB drives you’d get a 480TB data storage server in 4U server and 4.8 Petabytes in a standard rack,” he adds.

Such changes don’t save much on an individual basis, but add up, Klein writes. “Saving $0.008 per GB may not seem very innovative, but think about what happens when that trivial amount is multiplied across the hundreds of Petabytes of data,” he writes.
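
To put that arithmetic in one place, here’s a minimal sketch of the density and cost math described above; the per-drive prices are illustrative assumptions, not Backblaze’s actual purchase costs.

```python
# Illustrative arithmetic for Backblaze's 60-drive Storage Pod figures.
# Drive prices are hypothetical placeholders, not Backblaze's actual costs.

DRIVES_PER_POD = 60
PODS_PER_RACK = 10          # a standard 40U rack holds ten 4U pods

def pod_capacity_tb(drive_tb):
    return DRIVES_PER_POD * drive_tb

def rack_capacity_pb(drive_tb):
    return pod_capacity_tb(drive_tb) * PODS_PER_RACK / 1000.0

def cost_per_gb(drive_price_usd, drive_tb):
    return drive_price_usd / (drive_tb * 1000.0)

for drive_tb, price in [(4, 150.0), (8, 320.0)]:   # assumed street prices
    print(f"{drive_tb}TB drives: pod = {pod_capacity_tb(drive_tb)}TB, "
          f"rack = {rack_capacity_pb(drive_tb)} PB, "
          f"~${cost_per_gb(price, drive_tb):.3f}/GB")
```

Run as written, this reproduces the figures Klein cites: 240TB pods and 2.4PB racks with 4TB drives, 480TB pods and 4.8PB racks with 8TB drives, at a few cents per gigabyte.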

So here’s the noteworthy information about this quarter.

  • The company has 61,590 hard drives, an increase of 9.5 percent over year-end 2015, when it evaluated 56,224.
  • There are seven kinds of drives where they had no failures: Hitachi 4TB (the 4040B), Hitachi 8TB, Seagate 1.5TB, Seagate 6TB, Toshiba 4TB, Toshiba 5TB, and Western Digital 4TB.
  • The three kinds of drives with the worst failure rates are the Seagate 4TB, the Toshiba 3TB, and Western Digital 2TB. Note that these are among the oldest drives the company has. The company also points out that because it only has a few of the Toshiba 3TB drives, that figure is based on a single drive failure.
  • Overall, Backblaze’s disk failure rates are getting better. “The overall Annual Failure Rate of 1.84% is the lowest quarterly number we’ve ever seen,” Klein writes.
  • Since a year ago at this time, Backblaze has stopped using four kinds of drives, all from Seagate: two 3TB models, a 2TB model, and one of its two 1.5TB models. This was partly due to low capacity and partly to their high failure rates.
  • At this point, the majority of the drive hours are on 4TB drives. “The 4TB drives have been spinning for over 580 million hours,” Klein writes. “There are 48,041 4TB drives which means each drive on average had 503 drive days of service, or 1.38 years. The annualized failure rate for all 4TB drives lifetime is 2.12 percent.” (A sketch of this arithmetic appears after the list.)
  • Of the four primary manufacturers the company uses – Hitachi, Seagate, Toshiba, and Western Digital – Seagate by far has the highest failure rate (though it’s dropped significantly since last year), followed by Western Digital. Hitachi is the lowest.
  • That said, Backblaze these days is buying primarily Seagate and Hitachi drives, because they’re easier to find in large (5,000 to 10,000) quantities at a time for a reasonable price.
  • Backblaze doesn’t use many 6-, 8-, and 10TB drives, because they are either too expensive per terabyte or they aren’t available in a large enough quantity.
  • The company primarily uses 5400 rpm drives, because it doesn’t need the speed of the 7200 rpm drives and they use less electricity.
  • Seagate 8TB SMR drives didn’t work well for their purposes in their environment.
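
Here’s the annualized-failure-rate arithmetic the 4TB bullet refers to, as a minimal sketch; the failure count below is back-solved from the published 2.12 percent figure and is an assumption for illustration, not a number taken from the raw data.

```python
# Minimal sketch of Backblaze's annualized-failure-rate arithmetic.
# The failure count is back-solved from the published 2.12% lifetime figure
# for 4TB drives; it is an illustration, not the raw data.

HOURS_PER_DAY = 24
DAYS_PER_YEAR = 365

def annualized_failure_rate(failures, drive_hours):
    drive_years = drive_hours / HOURS_PER_DAY / DAYS_PER_YEAR
    return failures / drive_years * 100

drive_hours_4tb = 580_000_000      # "over 580 million hours"
drive_count_4tb = 48_041
assumed_failures = 1_404           # hypothetical, chosen to match ~2.12%

avg_drive_days = drive_hours_4tb / HOURS_PER_DAY / drive_count_4tb
print(f"average service per drive: {avg_drive_days:.0f} days "
      f"({avg_drive_days / DAYS_PER_YEAR:.2f} years)")
print(f"annualized failure rate: "
      f"{annualized_failure_rate(assumed_failures, drive_hours_4tb):.2f}%")
```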

Critics point out that in some cases Backblaze is comparing Hitachi enterprise-class drives (which purportedly are intended for a heavier load) with consumer-grade Seagate drives, and they raise various other quibbles about the ways different types and ages of drives can be compared. That said, it’s still a useful batch of data.

In addition to releasing the report, the company also releases the raw data for people who like to play with such things, such as the person who applied the same survival-analysis techniques used to study cancer patients to the Backblaze data.
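
For anyone who wants to try a similar survival analysis, here’s a hedged sketch; it assumes the daily snapshots have already been rolled up to one row per drive, and the column names, file name, and use of the lifelines library are my assumptions, not the original analyst’s setup.

```python
# Hedged sketch: Kaplan-Meier survival analysis over Backblaze's raw data.
# Assumes the daily-snapshot CSVs have already been aggregated into one row
# per drive (serial_number, model, days_in_service, failed); the column
# names and the lifelines library are assumptions for illustration.

import pandas as pd
from lifelines import KaplanMeierFitter

drives = pd.read_csv("drive_lifetimes.csv")   # hypothetical aggregated file

kmf = KaplanMeierFitter()
for model, group in drives.groupby("model"):
    kmf.fit(durations=group["days_in_service"],
            event_observed=group["failed"],
            label=model)
    # Estimated probability a drive of this model survives three years.
    print(model, kmf.predict(3 * 365))
```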

Disclaimer: I am a Backblaze customer.


June 22, 2016  4:35 PM

FBI Facial Database: Big and Broken

Sharon Fisher
FBI, government, privacy, Security

As you may recall, governments and government agencies such as the Federal Bureau of Investigation (FBI) are building giant databases of faces, using their own sources of faces, such as Department of Motor Vehicle records, and then using facial recognition software with that data to help identify criminals.

As it turns out, the FBI facial database is much bigger than anyone thought, much less reliable, and the FBI doesn’t want to consider it protected information. And that opinion isn’t from some sort of consumer privacy group: It’s from the government itself.

GAO Report

The Government Accountability Office (GAO) released its report, FACE Recognition Technology: FBI Should Better Ensure Privacy and Accuracy, in May with this information about the FBI’s Facial Analysis, Comparison, and Evaluation (FACE) program. It was provided to the Senate Subcommittee on Privacy, Technology and the Law in the Committee on the Judiciary. (The whole report is 68 pages long and well worth reading.)

“The GAO found that the FBI has been disregarding some of even the most basic privacy protections and standards,” writes Kia Makarechi in Vanity Fair. “To wit: the driver’s-license photos of the residents of 16 states and some additional 30 million photos from a biometric database are available for the FBI to search at will. Another 18 states are reportedly negotiating with the FBI over the use of driver’s-license images.”

Altogether, the FBI facial database has access to more than 411 million pictures, the GAO reports. “FBI’s Facial Analysis, Comparison, and Evaluation (FACE) Services unit not only has access to FBI’s Next Generation Identification (NGI) face recognition database of nearly 30 million civil and criminal mug shot photos, it also has access to the State Department’s Visa and Passport databases, the Defense Department’s biometric database, and the drivers license databases of at least 16 states,” writes the Electronic Frontier Foundation. “Totaling 411.9 million images, this is an unprecedented number of photographs, most of which are of Americans and foreigners who have committed no crimes.”

In addition, the FBI hadn’t sufficiently notified the public of the technology’s use, the report found. In 2008 the FBI published a PIA [Privacy Impact Assessment] of its plans, the report writes. “However, the FBI did not publish a new PIA or update the 2008 PIA before beginning the NGI-IPS pilot in December 2011 or as significant changes were made to the system through September 2015. In addition, [Department of Justice] DOJ did not approve a PIA for FACE Services until May 2015 — over three years after the unit began supporting FBI agents with face recognition searches.” The DOJ also did not complete a System of Records Notice (SORN) in a timely manner, the report adds.

False Positives

As far as the facial recognition part goes, the FBI had previously said that as many as 20 percent of the identifications were incorrect. It turns out that the agency was wrong, and it actually has no idea how often the software returns false positives – but the FBI contends it doesn’t matter, the GAO writes. “According to FBI officials, because the results are not intended to serve as positive identifications, the false positive rate requirement is not relevant.” The GAO went on to point out how, according to government specifications, that belief is incorrect, and listed several ways that the FBI could have tested this.

In addition, the FBI doesn’t go to any effort to ensure the accuracy of the photos it gets from other sources, such as state databases, saying that it was the responsibility of the external source, the GAO report continues. “However, states generally use their face recognition systems to prevent a person from fraudulently obtaining a drivers’ license under a false name, while the FBI uses face recognition to help identify, among other people, criminals for active FBI investigations,” the report notes. “Accuracy requirements for criminal investigative purposes may be different.” Moreover, other federal agencies such as the Transportation Security Administration (TSA) perform their own accuracy checks, the report adds.

This is particularly an issue because facial recognition software is not created equal. Systems developed in different countries tend to do best with people of the most prevalent race in that country, and less well with minorities, write Clare Garvie and Jonathan Frankle in The Atlantic. For example, in the U.S., the facial recognition algorithms work better with pictures of white people than those of black people.

False positives can create problems down the line, according to the GAO. “Given that the accuracy of a system can have a significant impact on individual privacy and civil liberties as well as law enforcement workload, it is essential that both the detection rate and the false positive rate for all allowable candidate list sizes are assessed prior to the deployment of the system,” the report notes. “According to a July 2012 Electronic Frontier Foundation hearing statement, false positives can alter the traditional presumption of innocence in criminal cases by placing more of a burden on the defendant to show he is not who the system identifies him to be.”
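
To see why scale makes the false positive rate so consequential, here is a back-of-the-envelope illustration; the per-comparison false match rate is purely hypothetical, since the GAO’s point is precisely that the FBI hasn’t measured it.

```python
# Back-of-the-envelope illustration of why the false positive rate matters
# at this scale. The per-comparison rate below is hypothetical -- the GAO's
# point is that the FBI has not actually measured it.

gallery_size = 411_900_000        # photos the FACE unit can search
false_match_rate = 1e-6           # assumed chance a non-match is flagged

expected_false_hits = gallery_size * false_match_rate
print(f"Expected innocent 'hits' per search: {expected_false_hits:.0f}")
# Even at one in a million, a single search would surface roughly 412
# innocent candidates, which is why both the detection rate and the false
# positive rate need to be assessed before such a system is deployed.
```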

It turns out, also, that facial recognition searches are more common than wiretaps — but don’t have the same protections, notes Alvaro Bedoya, the executive director of the Center on Privacy and Technology at Georgetown University Law Center. “According to the GAO, the unit fielded approximately 214,920 searches or requests between 2011 and 2015 — 36,420 involving the 16 states’ driver’s-license photos,” Makarechi writes. “Overall, FACE found 8,590 cases in which a ‘likely candidate’ was returned to an FBI agent.”

No Protections

To add insult to injury, the FBI requested in May that its biometrics database – including fingerprints and facial photographs — be exempted from certain provisions of the Privacy Act, a move that the EFF is also fighting.

“The big concern is that the FBI is proposing to exempt NGI from any requirement that they update or correct data about somebody in the future,” Jennifer Lynch, EFF senior staff attorney, tells Ellen Nakashima of the Washington Post. In response to concerns from the EFF and more than 40 other groups, the Justice Department has extended the comment period to July 6, Nakashima adds.

In the meantime, smile. The FBI may be watching.


June 14, 2016  10:48 AM

What’ll OpenText Do With Recommind?

Sharon Fisher
E-discovery, ediscovery, Opentext

The e-discovery acquisition market has been pretty much a snore the past few years, after several years of musical chairs. But things might start heating up again due to a major acquisition from an unexpected source: enterprise information management software vendor OpenText has acquired Recommind for a reported $163 million, scheduled to close in Q1 2017.

Recommind has been reliably, though quietly, in the leaders section of the Gartner magic quadrant for e-discovery software for years now (though it did start out in 2011 as a “visionary”). The acquisition might lead to other vendors thinking that they might want to take a look at the other leaders – something Gartner actually predicted in last year’s MQ – particularly since 2015 featured a number of other peripheral e-discovery acquisitions.

Incidentally, you’ve got to figure that Gartner has to be really POed about the timing of this announcement; it typically releases its annual Magic Quadrant on e-discovery about this time of year, and now it will either have to scramble to rewrite it or resign itself to the report being out of date the moment it comes out.

Ian Lopez of Legaltech News points out that Recommind had lost several executives recently. “Among them are former VP of business development Dean Gonsowski, who is now with kCura; Bob Clemens, who left his position as vice president of sales in America at Recommind to join CS Disco; and Philip Favro, who left his post as Recommind’s senior discovery counsel to join Driven as a consultant,” he writes.

What’s particularly interesting about this acquisition is that it’s OpenText. While a number of e-discovery vendors have been acquired over the years, generally the acquirers have been fairly major vendors – Symantec, HP, Microsoft, and so on. But OpenText? It’s interesting to see a vendor in this market seeing e-discovery as a field to enter. Image and Data Manager suggests it’s because of the increasing threat of litigation that companies face about their data.

And what OpenText is going to do with Recommind is not yet clear. OpenText could take Recommind’s technology, deploy it within proprietary offerings as Microsoft did with Equivio, and withdraw from the e-discovery industry altogether; or Recommind could continue as an e-discovery offering within OpenText, much as Clearwell did when Symantec bought it, Lopez quoted Favro as saying.

OpenText has made a number of other acquisitions recently, and it’s thought that it’s not quite done. The company recently bought ANXeBusiness, a provider of cloud-based information exchange services for the automotive and healthcare industries, for about $100 million, and paid $170 million for some pieces of HP, including the HP TeamSite multichannel digital experience management platform, digital asset management solution HP MediaBin, and intelligent workforce optimization solution HP Qfiniti, reports the Waterloo Region Record.

In fact, content management consultant Laurence Hart thinks that OpenText might even acquire EMC’s red-headed stepchild Documentum. “We knew more acquisitions were coming after they then announced that they were raising $600 million and are planning to spend upwards of $3 billion on acquisitions over the next five years,” he writes. “It is fun to conjecture about the possibilities of Open Text acquiring Documentum from EMC. The problem is that Documentum is a huge acquisition. Documentum will cost a lot and that $600 million Open Text is raising would likely not cover the cost. The price does fit into Open Text’s 5 year plan financially but the acquisition would front-load that $3 billion in 2016.”

The company also flexed its muscle in 2014 by suing Box for $268 million for patent infringement, right as Box was going public. Eventually, it had to settle for less than $5 million.

In any event, the announcement could set off another round of musical chairs in the e-discovery market, Favro notes.


May 31, 2016  10:49 PM

Fearing Government Warrants, Firms Ditching Client Data

Sharon Fisher
Encryption, government, privacy, Security

In this day and age of “big data,” where many companies are trying to collect more and more data about their customers so they can analyze it to serve them better or sell them more stuff, some Silicon Valley companies are trying the opposite tactic: Jettisoning as much customer data as possible.

While some companies have done this before because they want to protect themselves from revealing incriminating details in the event of electronic discovery in a legal case, these Silicon Valley companies are going even further. They don’t want to possess any of their customers’ data in case the federal government comes looking for it.

Instead, the companies are decoupling encryption from the services they provide to their customers while having the customers encrypt their own data. Though this can reduce the performance of the companies’ products and make it harder for them to support their customers, it also means that they can’t be forced to surrender their customers’ data, because they won’t have it to give.
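
In practice that usually means the customer encrypts its data with a key only it holds before anything reaches the vendor. Here’s a minimal sketch of the idea in Python; the upload function and file names are hypothetical placeholders, not any particular vendor’s API.

```python
# Minimal sketch of client-side encryption: the customer generates and
# keeps the key, so the service provider only ever stores ciphertext.
# upload_to_service() is a hypothetical placeholder, not a real vendor API.

from cryptography.fernet import Fernet

def upload_to_service(name: str, blob: bytes) -> None:
    # Stand-in for whatever storage API the vendor actually exposes.
    print(f"uploading {len(blob)} encrypted bytes as {name}")

key = Fernet.generate_key()        # stays with the customer, never uploaded
cipher = Fernet(key)

with open("customer_records.db", "rb") as f:   # hypothetical local file
    ciphertext = cipher.encrypt(f.read())

upload_to_service("customer_records.db.enc", ciphertext)
# Without the key, the vendor cannot decrypt the data, and so cannot be
# compelled to hand it over. Lose the key, though, and neither can you.
```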

‘Radicalized Engineers,’ Oh My

This is all according to a recent article in the Washington Post that is drawing a lot of attention, with some people calling these companies traitors with “radicalized engineers” who are protecting terrorists. Others say the companies are responding logically to the Edward Snowden revelations about government surveillance and to more recent efforts such as the attempt to force Apple to make it easier for the FBI to break into an encrypted iPhone.

“The trend is a striking reversal of a long-standing article of faith in the data-hungry tech industry, where companies including Google and the latest start-ups have predicated success on the ability to hoover up as much information as possible about consumers,” writes Elizabeth Dwoskin. “Now, some large tech firms are increasingly offering services to consumers that rely far less on collecting data.”

It’s not just the government – the companies also don’t want to become targets for hackers by possessing a lot of valuable data – but it’s the government aspect that’s attracting the most attention, especially as the FBI and Congress continue to press for legislation to force companies to include “back doors” in their encryption products.

While such a back door might be intended for governmental agencies or law enforcement, a door is a door, and as such is a risk, contend security experts such as Bruce Schneier. Consequently, an increasing number of high-profile companies are following the lead of Apple and Google, which in 2014 turned on encryption by default, meaning they couldn’t decrypt their customers’ devices even if they wanted to.

‘So Secure Even We Can’t Read It’

The upside of this decision is that customers of these companies can feel pretty sure their data is secure, since even their own vendor can’t get at it. The downside is that if a customer loses its encryption key or something, it’s hosed, because not even the vendor can help.

But for a number of users, that’s a risk they’re willing to take. While the customers wanting to hold their own encryption keys were originally large, sophisticated financial services companies, such as Goldman Sachs and Blackstone, a wide variety of other companies, including media and automotive firms and small banks, are now making these requests, Dwoskin writes.

Critics claim that such actions help protect terrorists, while the companies say it forces law enforcement agencies to focus their attention where it belongs: On the suspect. “If you have an issue with my customer, go talk to my customer, don’t talk to me,” a representative from one vendor tells Dwoskin. “I’m just a tech guy, and I don’t want to be in the middle of these things.”


May 29, 2016  12:35 AM

Spokeo Decision Causes Ripples

Sharon Fisher
privacy, Security

As expected, the Supreme Court ruled before June on the case of Spokeo Inc. vs. Robins. But instead of ruling in one of the two directions the industry expected – each of which drew all sorts of dire predictions from its opponents – the court found a way to Kobayashi Maru its way out of it by finding a different solution.

As you may recall, the whole thing started last November when a guy, Thomas Robins, sued the data aggregator Spokeo for having inaccurate data about him in its databases. He said the inaccurate data could have made it harder for him to get a job or credit, though he couldn’t point to specific examples of this having happened. He ended up suing Spokeo under the Fair Credit Reporting Act (FCRA), which requires such database companies to use accurate information.

Ironically, the real upshot was not this particular case itself, but what a final ruling could mean in terms of precedents. Spokeo supporters warned that finding in favor of Robins would mean that practically anybody could file a class-action suit on practically any tiny technical detail that some company screwed up, potentially costing millions or billions. Robins’ supporters warned that finding in favor of Spokeo would mean that nobody would ever be able to file a class-action suit ever again, unless each member could point to specific, enumerated injuries.

What added an extra fillip to the whole situation was Justice Antonin Scalia’s sudden death in February. That left open the notion that the Supreme Court could be deadlocked 4-4 on the case – which would have had the effect of upholding the Ninth Circuit court’s verdict, because a majority would have been needed to overturn it, but without setting a precedent for future cases.

As it happens, the Court voted 6-2 on its final decision, which was to send it back to the Ninth Circuit to let them figure out more details on the Robins case. Specifically, the Court directed the Ninth to determine two things about the case:

  • Whether it was particular. “There was no dispute that Robins had satisfied the particularity requirement of injury in fact,” writes James McKenna of Jackson Lewis, in National Law Review. “The injury he alleged was based on the violation of his own statutory rights and was due to the inaccurate information about him.”
  • Whether it was concrete. “A concrete injury must be ‘real’ and not ‘abstract,’” McKenna writes. “There are three aspects of concreteness. First, it is not synonymous with being tangible. A concrete injury need not be tangible. Intangible injuries, such as the abridgment of free speech rights, can be concrete.” Second, while Congress’s judgment in creating a statutory right is important, the fact that a law provides a right to sue doesn’t by itself make an injury concrete, he writes. “Third, the ‘risk of real harm’ can constitute a concrete injury, even if the harm may be difficult to prove or measure,” he adds.

Where the Ninth had erred, the Supreme Court ruled, was by not considering those two points separately in the first place, especially the concreteness one. The Supreme Court didn’t say the Ninth was wrong, simply that, like Srinivasa Ramanujan, it hadn’t proved it enough. What actually “counts” as an injury, and how can the Ninth persuade the Supreme Court about this case?

“FCRA is designed to prevent harm resulting from a lack of ‘fair and accurate credit reporting,’ but inaccurate reporting is not a harm, in and of itself, because the incorrect information may not impact, or create a material risk of impact to, the consumer,” write Rand L. McClellan and Keesha N. Warmsby of Baker Hostetler for Mondaq. “But an agency’s failure to disclose information, which must be disclosed by statute, however, immediately violates the very right (i.e. access) the statute was designed to protect, and therefore constitutes injury-in-fact.”

(Interestingly, some states don’t have these distinctions.)

The two dissenters, who wrote their own opinion starting on page 21 of the decision, were Justice Ruth Bader Ginsburg and Justice Sonia Sotomayor, who said they believed that the Ninth Circuit had made enough of a case for the “concreteness” of the issue.

What’s interesting is that several lower courts have already cited the Supreme Court’s decision to toss out cases on the grounds that the person or class didn’t have standing. For example, several data breach cases, in which people were suing a company for losing their personally identifiable data, have been thrown out because the plaintiffs can’t point to specific harms that occurred due to the loss of their data.

“It will be a long while until the lower courts decide who won Spokeo – but it is already clear that defendants in privacy class actions are going to wield the Supreme Court ruling like a weed wacker,” writes Alison Frankel in Reuters.

Similarly, people who sue companies providing robocalls in violation of the law against them could have problems, write Diana Eisner, Christine Reilly, and Marc Roth of Manatt, Phelps & Phillips, LLP in JD Supra Business Advisor.  “Plaintiffs may need to allege a concrete or real harm,” they write. “This may be difficult, given that so many Americans are on ‘unlimited’ or flat-rate cell phone plans where no charges are incurred for incoming calls or text messages and no other ‘injury’ exists other than an alleged privacy invasion. In cases where the plaintiff did not answer the phone or know about the call absent the use of Caller ID, the plaintiff may be unable to allege a concrete harm stemming from the unanswered call, potentially shuttering the lawsuit.”

So if the Ninth can’t figure out a way to prove the concreteness aspect, the ultimate result may be very much like what the Spokeo supporters feared.

It isn’t clear when the Ninth Circuit is expected to rule, but if it decides to rule in a way that sends the case back to the Supreme Court, it may be next year – after the new President we elect in November chooses a new Supreme Court Justice who can make it through the approval process — before a decision is made.


May 24, 2016  1:42 PM

Gov. Vetoes Arizona Cloud Mandate

Sharon Fisher
cloud, government

Arizona Governor Doug Ducey has vetoed a bill intended to encourage state agencies to move to the cloud – at the risk, some said, of jail time for agencies that didn’t comply. While the Arizona cloud mandate bill has failed, it may still be coming to a state near you.

The bill, Senate Bill 1434, specified that the IT department would adopt a policy to establish a two-year hardware, platform, and software refresh cycle that required each budget unit to “evaluate and progressively migrate” the group’s IT assets “to use a commercial cloud computing model or cloud model,” according to the bill. Budget units were directed to consider purchasing and using cloud computing services before making any new IT or telecommunications investment.

Moreover, each budget unit had to develop a cloud migration plan by next January, and thereafter report twice a year to the Arizona chief information officer (CIO) on how its migration was progressing.

It wasn’t obvious where the Arizona cloud mandate came from, such as whether it was written by legislative staff or what. It had 24 bipartisan cosponsors and was supported overwhelmingly by the legislature at every step of the process.

Imagine, Cloud Fans Supported It

Certainly, cloud supporters were all over it. “The endless hand-wringing over the use of the cloud needs to end,” writes David Linthicum, a consultant at Cloud Technology Partners, in InfoWorld.  “It may take laws such as this to get at least the public sector moving in the right direction. It might even save them — and thus all of us — money.”

There’s no argument that the cloud has advantages over a traditional data center. It is funded through operational money rather than capital money. Its expenses can be more predictable. It can be expanded and contracted more quickly and granularly than a data center. It doesn’t require as many skilled staff members. It offers the possibility of being more reliable. “The State of Arizona has already migrated its DNS solution to AWS, which saves it approximately 75 percent in annual operating costs on its DNS solution compared to its previous on-premise infrastructure,” notes Nicole Henderson in the blog Talkin’ Cloud.

But to the extent of mandating jail time if state CIOs don’t comply? (To be fair, only Linthicum mentioned this aspect, only in passing, and he didn’t say where it came from.)

Clouds? In Arizona?

And why Arizona? Arizona CIO Morgan Reed, appointed in October, 2015, was formerly Expedia’s director of data center services, according to Jake Williams at StateScoop.  Before that, he served in a variety of positions with Web hosting company GoDaddy.com, including leading IT governance, disaster recovery and business continuity, as well as leading global data centers, IT systems and advanced support operations in previous roles, Williams writes. (Arizona lost its top three IT executives at the beginning of 2015 as they all quit to join the private sector; Reed replaced an acting CIO.)

“When Reed speaks of the benefits of moving at the pace of the cloud, he does so with prior experience,” writes Ricky Ribeiro in StateTech. Reed goes on to say that Arizona wants to be “as agile and accelerated as the private sector.”

On the other hand, some commenters have speculated that it’s simply a conservative Republican attempt to streamline government and turn over as much as possible to the private sector.

Not to mention, do we really want legislators mandating the technology that state agencies should use? When we’ve seen the tenuous grasp that government officials have on technology such as encryption, is it a good idea for them to pick winners and losers in technology?

The veto message from Ducey, a Republican, wasn’t very enlightening. It consisted of two sentences. “It’s time for state government to enter the 21st century, and major advances in technology are needed to get there,” he writes. “This bill appears to add extra layers of bureaucracy that are unnecessary and will stall needed advancements in technology.” So perhaps he was convinced by the argument that legislators shouldn’t be deciding on technology.

Next Steps

The Arizona Legislature has already adjourned, so it won’t be able to attempt to override the Governor’s veto. It remains to be seen whether someone will attempt to bring the bill back next year.

All that said, some predict that other states, Ducey’s veto notwithstanding, will follow suit, noting that the federal government started it all in 2010 with its cloud mandates. “If the proposal becomes law and is successful, count on other states to follow the same path,” Linthicum writes. “That would be a good development because state agencies have built a whole lot of planet-heating data centers over the last 20 years and are planning to build many more.”

Even if you’re not the sort of person who typically follows your state legislature, it might be a good time to start.


May 13, 2016  11:10 PM

Man Jailed for Forgetting an Encryption Passcode

Sharon Fisher
Encryption, government, privacy, Security

It’s like one of those nightmares where you keep trying to type in a password and it doesn’t work. Except in this case, it’s real life, and a man has been in jail for seven months – without being charged – for forgetting an encryption passcode.

Francis Rawls, a former sergeant in the 16th district of the Philadelphia Police Department, has been accused of having child pornography on two encrypted Macintosh hard drives, which had been seized in March, 2015. He was ordered by a judge in August, 2015, to provide the passcode to decrypt the drives, but he claims to not remember it.

I don’t know about you, but I’ve been known to forget a password over a long weekend, let alone five months.

Consequently, for the last seven months, Rawls has been in jail for contempt of court for refusing the judge’s order, though he hasn’t been charged with a crime. His attorney is claiming that having to provide the passcode violates his right to self-incrimination.

Back story

Courts have gone back and forth on the issue for several years now and, most recently, have decided that a phone password is more like the combination to a safe than a physical object such as a key. The distinction matters because something that is an expression of one’s mind, like the combination to a safe, is protected under the Fifth Amendment right not to incriminate yourself. A physical key, something you possess, is something you can be forced to produce.

The Fifth Amendment angle is, in fact, why police wanted Rawls to type in the passcodes himself rather than tell someone else what they were – because it was thought that the latter would amount to testifying against himself, writes Christine Hauser in the New York Times.

“Quite sensibly, Rawls refused,” writes William Grigg on the website Personal Liberty. “He is a veteran cop and knows – better than the public he supposedly served in that capacity – what happens when a targeted citizen offers the police unrestricted access to his home and personal effects. If he had acceded to the demand for his encryption codes, Rawls would have done the equivalent of allowing the police to rummage through every room, closet, and drawer in his home, while letting them inspect all of his correspondence, medical records, and personal finances. Diligent and motivated investigators would eventually find something that an ambitious prosecutor could use to manufacture a felony charge.”

The government is claiming it wouldn’t be self-incrimination because it knows there’s child pornography on the hard drives. “If the government already knows there’s child pornography on Rawls’s hard drives, then it’s not self-incrimination for Rawls to give his passwords,” explains Travis Andrews in the Washington Post. “Think of it like a search warrant: if an officer of the law is granted a search warrant to someone’s house, then that suspected party has no recourse but to allow the officer enter his house. Much like that hypothetical person wouldn’t be able to lock the door, Judge Rueter ruled that Rawls can’t refuse to provide the passwords.”

On the other hand…

Needless to say, Rawls’ defense attorney takes issue with this belief. “The government has not, as required by the relevant decryption precedents, demonstrated that the storage of any particular file or files on the hard drives is a foregone conclusion,” notes the brief. “Instead, it put forward only a suspect witness who gave attenuated testimony and a forensic examiner who was unable to offer any authoritative opinion regarding the contents of the hard drives.”

Nor does his attorney agree with the dodge of having Rawls type in the passcodes rather than provide them to someone else. “The fact that the government seeks to have the codes entered into a forensic interface rather than spoken aloud does not change the analysis,” notes the brief. “It still demands that Mr. Doe divulge the contents of his mind, not demonstrate his typing skill.”

In an interesting coincidence, Rawls is being compelled to enter his passcode using the All Writs Act, the same act under which the Federal Bureau of Investigation recently tried to force Apple to develop an operating system to make it easier for it to break into a suspected terrorist’s iPhone.

After Rawls was accused of having child pornography on his iPhone 6, he did enter the passcode to decrypt that, and no child pornography was found. Investigators were able to decrypt his Mac Pro using a passcode found on the iPhone 6, without finding child pornography on it, according to a brief filed in the case. And he did try, reportedly for several hours, to come up with the passcode to decrypt his hard drives.

Was it a ruse?

The fact that Rawls did willingly decrypt the phone, and did attempt to enter a passcode for the hard drives, is being called a “ruse” by prosecutors. The government “contended that Mr. Doe’s unlocking of the iPhone 6 and entry of passcodes for the hard drives ‘was a deliberate façade designed to feign compliance with the Court,’” writes Rawls’ defense attorney. How it would have looked different if he honestly did want to cooperate and honestly did forget the passcode, nobody seems to have said.

This isn’t the first time that someone has been jailed for contempt for refusing to provide a passcode for an encrypted drive – several people in the U.K. have been jailed for refusing to provide an encryption passcode — but it may be the first time in the U.S.

It’s also not the first time someone has refused to provide the passcode to decrypt their hard drive; Jeffrey Feldman, who was also accused of having child pornography, refused to provide a passcode, and several judges ruled several different ways on whether he had to provide it. In that case, however, investigators learned enough details about what was on the drives on their own that they dropped the attempt to have him decrypt them.

Solitary

To make matters worse for Rawls, because he is a police officer – that is, former police officer; he was fired, after 17 years on the force, after he was jailed – he was put into solitary confinement to protect him from the other inmates. So not only has he been in jail seven months; he’s been in solitary confinement for seven months.

“He spends 22 and a half hours of every day completely alone,” writes Andrews. “If someone visits him, they have to remain behind a barrier. Each month, he’s granted one fifteen-minute phone call.”

Meanwhile, government prosecutors have until May 16 to file their brief with the Third Circuit Court of Appeals in response to Rawls’ attorney’s motion to have him released.

On the other hand, Rawls has a long way to go to set any sort of record for longest jail time, without a charge, for contempt. H. Beatty Chadwick was released in 2009 after serving 14 years for contempt after saying he had lost $2.75 million and could not split it with his former wife. He pointed out that he would have served less time if he had been convicted of third-degree murder.


April 30, 2016  10:57 PM

‘License and Registration. And, Cell Phone.’

Sharon Fisher
government, privacy, Security, smartphone

We’re all accustomed to handing over our driver’s license and registration when we get pulled over. But if New York lawmakers have their way, you’ll also have to hand over your cell phone after an accident, so police can check to see whether you were using it illegally at the time.

While the Supreme Court ruled in June, 2014, that police need a search warrant, or your consent, to gain access to your phone, this bill — New York Senate Bill S6325A — sidesteps that requirement by declaring that, by driving in New York, you are automatically giving consent.

Motorists involved in crashes who refuse to surrender their phones to police would have their licenses suspended, and could face a one-year revocation of their license and a $500 fine, according to Mike McAndrew in Syracuse.com.

The bill prohibits police from reviewing the content of a person’s cell phone, such as texts, social media postings, and email messages, McAndrew writes. “It would only allow officers to check if the phone was illegally used while driving.”

New York would be the first state to pass such a law, McAndrew adds, but the New York Times notes that such a law could spread to other states. For example, the Vermont legislature earlier this year considered a bill that would also require drivers to hand over their phones so police could check whether they’d been in use, without an accident being required.

The bill was announced in collaboration with an activist group founded by parents of children involved in crashes caused by drivers distracted by their devices, called Distracted Operators Risk Casualties, or Dorcs. (And no, we’re not making that up.)

But some are skeptical about the idea. “They would get into your phone only for the purpose of finding out whether you were using it at the time of the collision. They would not, of course, look at anything else while they were in there. They totally promise,” snarks Kevin Underhill in Lowering the Bar. “If there’s anything history has taught us, it’s that law enforcement and government can be trusted not to adapt a new technology for questionable uses despite promises to the contrary.”

Underhill points out, too, that the proposed law is pretty broad. “It applies not just to a driver suspected of causing injury or death but to all drivers in any ‘accident or collision involving damage … to property,'” he notes. “Somebody backed into your car at the gas station? Hand over your phone. Oh, you didn’t have it with you right then? Did you have it ‘near the time of such accident or collision’? Hand it over. And I’m not sure how they could determine whether you were using it at any particular time without learning about what you were using it for. For that matter, if the fact of ‘use’ is really all they would learn, how would they know you were using it illegally? I ‘use’ my phone all the time for voice navigation. Is that a problem now? What about hands-free calling?”

“There are so many ways in which somebody could be using the phone in a car that is not a violation of any laws,” Mariko Hirose, a lawyer with the New York Civil Liberties Union, tells Joel Rose of NPR. “This bill is simply providing a way for law enforcement to get around the privacy protections that apply to a cellphone.”

Other civil liberties organizations also have concerns with the bill, such as the Electronic Frontier Foundation (EFF), which believes it’s unconstitutional. “A law that essentially requires you to hand over your phone to a cop in a roadside situation without a warrant is a non-starter,” Lee Tien, a senior staff attorney with the EFF, told Jose Pagliery of CNN. Similarly, Marc Rotenberg, the president of the Electronic Privacy Information Center, called the bill “excessive, unnecessary, and invasive,” Pagliery writes.

Actually, the real target of the law appears to be the police, who apparently do not always attempt to retrieve records from cell phones after accidents. “Law enforcement can subpoena records from the phone company or ask a judge for a warrant to search the phone itself,” Rose writes. “But they don’t always do it because it takes a lot of money and time for cases that can be hard to prove.”

“All you really need to know about New York Senate Bill S6325A is that it would create a law named after a person (this one would be ‘Evan’s Law’), since any law named after a person is almost always a terrible idea,” Underhill writes. “If the law were a good idea, they wouldn’t need to try to generate support by manipulating people’s emotions.”


April 28, 2016  11:13 PM

Yet Another Set of Terrorists Fails to Use Encryption

Sharon Fisher
Encryption, government, privacy, Security

Government officials have been using recent terrorist attacks to try to justify limiting the use of encryption. You may recall, for example, that the Federal Bureau of Investigation (FBI) recently attempted to force Apple to develop a different version of the iPhone operating system to make it easier for the agency to break into encrypted phones thought to be owned by the perpetrators in last December’s San Bernardino attack.

Similarly, states such as California and New York have attempted to put forth bills that would outlaw the sale of cell phones with unbreakable encryption, while agencies such as the FBI have been recommending a mandated “back door” for law enforcement into encrypted phones.

These efforts have been continuing even though there’s been very little indication that terrorists are actually using encryption. For example, the FBI used last fall’s terrorist attacks in Paris to justify their long-held position that governments should mandate a “back door” into encryption, even though there’s no evidence the attackers used encryption — and, in fact, quite a lot of evidence that they didn’t.

One of the most recent incidents was the March bombings in Brussels. Rep. Adam Schiff of California, the top-ranking Democrat on the House Intelligence Committee, suggested the same day the bombings occurred that encryption might have been involved, writes Cory Bennett in The Hill.

Since then, law enforcement has been studying the laptop of one of the suicide bombers, Brahim El-Bakraoui, who blew himself up at Brussels airport, writes Lucy Clarke-Billings in Newsweek. “The bomber referred to striking Britain, the La Défense business district in Paris, and the ultra-conservative Catholic organization, Civitas, in a folder titled ‘Target,’ written in English, according to the source,” Clarke-Billings writes. “The laptop was found in the trash by police in Brussels shortly after the suicide bombings on March 22 that killed 32 people at the city’s airport and on a Metro train.”

So let’s get this straight. The data was not only unencrypted, but in English. And on top of that, it was located in a file folder. Labeled TARGET.

That’s right up there with Jurassic Park’s “It’s a Unix system! I know this!”

Security experts who are following the incidents believe there’s no indication that terrorist organizations have some sort of overarching encryption plan. “The clear takeaway from this list is that: 1) ISIS doesn’t use very much encryption, 2) ISIS is inconsistent in their tradecraft,” writes an information security researcher known as “the grugq” in Medium. “There is no sign of evolutionary progress, rather it seems more slapdash and haphazard. People use what they feel like using and whatever is convenient.”

The laptop discovery fits in with what appears to have been the strategy used thus far, writes Quartz. “ISIL’s strategy in last year’s Paris attacks and others was simple: avoid trackable electronic communications like email and messaging apps in favor of in-person meetings and disposable devices, or ‘burner phones,’ that are quickly activated, used briefly, and then dumped,” the organization writes. “Communications from the Paris attacks were reportedly (paywall) largely unencrypted, and investigators have found much of their intelligence through informants, wiretaps, and device-tracking rather than by trying to decipher secret messages. That’s not to say that terrorists won’t use encryption to carry out heinous acts. They will. But encryption is by now a fact of life: your apps, credit cards, web browsers and smartphones run encryption algorithms every day.”

Of course, to some people, the TARGET folder discovery was almost too good to be true. Skeptics on social media have been suggesting that the folder was planted by a group such as the CIA, that the folder was a decoy, and so on.

On the other hand, there doesn’t seem to have been much of a question about confirming who performed the Brussels attacks, especially since they were suicide attacks. If the folder was really planted, wouldn’t it have made more sense for the government agency involved to have used some sort of encrypted – though easily breakable – code? That way, the agency could have used it to justify its attempts to outlaw encryption. If the FBI planted the TARGET folder, it missed an opportunity.

