Yottabytes: Storage and Disaster Recovery


June 24, 2019  9:12 AM

Australian Archivists Worry About ‘Digital Dark Ages’

Sharon Fisher

What with “but her emails” and government officials using applications like Signal to avoid having records of communications, it’s easy to forget that, actually, data doesn’t last forever.

That’s what people are finding out in Australia, where the National Archives of Australia made headlines recently by saying it expected to lose access to some of its electronic records by 2025 because it couldn’t read them anymore.

“Australia’s memory institutions are racing to digitize their magnetic tape collections before the year 2025, when archivists around the world expect it will become almost impossible to find working tape playback machines,” writes James Elton for ABC News in Australia. “The National Archives of Australia holds some 130,000 hours of audio and video tape that still need to be rescued.”

Consequently, the National Archives people are spending their days looking for old machines to play back the tapes so they can be converted to a more modern format, Elton writes. “The Archives is using its limited budget to pick up tape machines wherever it can find them,” he writes. “Archivists scour online marketplaces like eBay and Gumtree looking for machines for sale, even broken machines that can be harvested for working parts.”

People who know how to work the machines are also in short supply – and if you know how to work them, Australia may have a job for you. “As the technology has changed, people are no longer learning how to use the older machines,” Elton writes. “It’s mostly ex-industry people, working for the preservation service.”

This isn’t a new problem, and Australia isn’t alone. The issue of the “digital dark ages” has come up a number of times over the past couple of decades, as people lose access to digital information. Whether it’s due to links that no longer work, magnetic media that suffers bit rot, software formats we can no longer use, or magnetic media we can no longer read, people are increasingly concerned about what this will mean for future generations.

Other examples include the game Prince of Persia, which was laboriously restored from Apple ][ disks a few years ago; the state of Rhode Island, which lost access to some of its government email records due to incompatibilities between the different proprietary email systems state government was using; and the fact that some government agencies still use Zip drives and floppy disks, even with nuclear missiles.

The topic is a common one among archivists, and comes up a lot around Electronic Records Day (even as, at the same time, archivists are trying to persuade organizations to digitize their paper archives). This problem has also led to a burgeoning business in audio and video tape restoration, especially as people become more interested in genealogy, as one commenter noted.

One possibility is to develop software to read the data off magnetic tapes in a different way and reconstruct the images, Elton writes.

Archivists might even work together to get companies to start manufacturing the machines again, Elton reports one Australian archivist as saying. “If that’s what it takes, well, then we will be pursuing those strategies.”

June 20, 2019  9:09 AM

Counsel, Judge Beat up on HP Execs in HP-Autonomy Fraud Trial

Sharon Fisher
Autonomy, HP

It’s the gift that keeps on giving. Two former major HP executives – coincidentally, both women – testified last week in the Autonomy fraud case, and both got pushback from opposition counsel, with one even getting criticized by the judge.

In case you’ve forgotten, in the Autonomy-HP merger – officially the sixth-worst merger and acquisition of all time – HP CEO Leo Apotheker (who was fired later that year) paid $11.1 billion to acquire Autonomy, a European e-discovery company. By the following year, HP claimed that Autonomy had cooked its books to overvalue itself, wrote down the purchase at a nearly $9 billion loss, and sold off the company’s remaining assets in 2016.

In March, a $5 billion civil lawsuit against Autonomy CEO Mike Lynch started. Testimony from former HP CFO Cathie Lesjak and former HP CEO Meg Whitman was part of this proceeding.

Interestingly, Lesjak testified that she had tried to protest the deal at the time – “she had felt compelled to speak out against the acquisition at a meeting of the Silicon Valley group’s board in August that year” – and that the company’s 64 percent premium was too high, writes Simon Duke in The Times. Feeling “blindsided,” Apotheker said at the time that she would be fired, but before that could happen, he was fired himself. Lesjak continued working for HP until this February.

Just goes to show how dangerous groupthink can be at a company, where nobody speaks up because they don’t want to be the odd one out or are afraid of the consequences. Though, how long does it take someone to get fired at HP, anyway?

Lesjak went on to be criticized because there was no written record of the calculation of the writedown valuation.

“’I don’t know if it was ever in writing,’ Ms Lesjak said of a crucial part of HP’s calculations which accounted for $2.5bn of the writedown,” writes James Cook in the Telegraph. “’It was a verbal conversation that I had with [HP executive Andy Johnson] when we put it up on the whiteboard and we walked through it together.’”

Testimony then went to Whitman, who took over as CEO after Apotheker’s firing and who presided over the Autonomy writedown.

Much of the testimony – as well as the news coverage – centered around Whitman referring to throwing Apotheker under the bus after he blamed the board, on which she sat at the time, for agreeing to the original Autonomy purchase.

“She said in an email dated Dec 14 2012 to HP’s chief communications officer Henry Gomez: ‘Happy to throw Leo under the bus in a tit for tat’ after Apotheker had said the HP board should share the blame for the failed deal,” writes Paul Sandle for Reuters. “Asked by Lynch’s counsel Robert Miles whether she was just protecting herself, Whitman said that was ‘absolutely not the case’. ‘It was a moment of disappointment and anger,’ she said. ‘I shouldn’t have said it.’”

Whitman was also criticized for accusing Lynch of fraud without sufficient proof. “The former boss of Hewlett Packard ‘shot first and asked questions later,’ a court was told,” writes Duke in a different The Times article, adding that she was accused of “trashing their reputation.”

Even Judge Robert Hildyard got into the act, throwing shade at Whitman by alluding to her failed campaign for California governor when he asked her to stop making speeches in response to questions. “Can you please stop making speeches. It’s just not what you’re here for,” he said. “You may have done that at other times in your career but it’s not what you’re here to do today.”

Hildyard also intervened in another exchange, as described by Jonathan Browning in Bloomberg News:

“I don’t know why we would ask a fraudster why he had committed fraud,” Whitman said. “We had been a victim of significant fraud.”

“No, you had an allegation of fraud,” replied Miles, who accused her of “trashing” the reputations of Autonomy managers. “And it’s nothing more than that and you know it.”

“Well, I don’t believe that’s the case,” Whitman said. “We knew exactly what had gone on here.”

At this, Judge Robert Hildyard intervened.

“Then I wouldn’t have anything to do, would I?” the judge said. “Things have to be proven.”


June 13, 2019  9:38 AM

Hofeller Storage Case Fascinating Study in How Not to Protect Data

Sharon Fisher
privacy, Security, Storage

If you saw it on TV, you’d probably chide it as a cliché: a guy who spends his whole life passing information along in phone calls, so that nothing is written down in email or on paper, dies – and leaves all sorts of incriminating stuff on his un-password-protected, unencrypted computer storage.

Yet that’s apparently the situation with Thomas Hofeller, who died last August, and whose daughter discovered all sorts of information about manipulating the census and redistricting process on his storage devices.

We’re not going to get into the politics of it all. There’s plenty of that elsewhere. But it’s a fascinating study in storage.

We’ve written before about the issues around the storage of people who’ve died. There are two categories of data. One is the stuff the dead person has written to which the heirs would like access: financial records, medical records, photographs and videos, family history, and so on.

The other is the stuff that the dead person really doesn’t want to get around, like browser histories, chatroom transcripts, and so on.

Finding a way to reveal the former while protecting the latter has always been a challenge, particularly if the person dies suddenly. Encrypt everything, and the heirs can’t get access to the material they need. Encrypt nothing, and all sorts of embarrassing stuff can come out.

It’s the latter case that’s coming up with Hofeller.

Ironically, he’d spent his whole life advising people to avoid putting things in writing. “Make sure your security is real.” “Make sure your computer is in a PRIVATE location.” “‘Emails are the tool of the devil.’ Use personal contact or a safe phone!” “Don’t reveal more than necessary.” “BEWARE of non-partisan, or bi-partisan, staff bearing gifts. They probably are not your friends.”

Yet when Hofeller died, he left four hard drives and 18 thumb drives of laptop backups, all in a convenient plastic bag. And apparently they weren’t password protected, or encrypted; none of the articles about this whole situation have indicated that the data was protected in any way.

His daughter, Stephanie Hofeller, had been estranged from him since 2014, writes Michael Wines in the New York Times. And that’s *really* estranged. Nobody in the family even told her that he’d died; she found out by accident by searching for him on the internet.

It’s not just the data itself that’s interesting. There’s also what the daughter did to get the data to the right people (as well as some luck).

The daughter happened to contact an organization called Common Cause to help find an attorney unrelated to her father to help settle the estate, Wines writes. As it happens, Common Cause was involved in a lawsuit regarding gerrymandering in North Carolina.

So that was Piece of Dumb Luck #1.

Moreover, the same law firm representing Common Cause in the gerrymandering lawsuit was also involved in a lawsuit regarding the citizenship question on the census.

So that was Piece of Dumb Luck #2.

Plus, the attorneys in question were smart. “They have been exceedingly careful to play by the rules,” writes Mark Joseph Stern in Slate. “Lizon offered her father’s drives to Common Cause directly, but its attorneys decided to issue a subpoena in February to obtain them formally and provide notice to third parties.”

Then came the potentially problematic part. “In February, attorneys challenging North Carolina’s legislative gerrymander notified the defendants, a group of Republican leaders in the legislature, that they’d issued a subpoena,” Stern writes. “The lawyers had asked Stephanie Hofeller Lizon to provide ‘any storage device’ containing redistricting-related documents left by her estranged father.”

And, for some reason, the attorneys defending the North Carolina gerrymandering case were asleep at the switch and didn’t think there was anything untoward about the request, and allowed it. By the time they realized something might be up, and tried to stop the process, opposing attorneys already had access to the data.

“At a hearing in April weeks after declining to challenge the subpoena, however, [attorney Phil] Strach attempted to block Common Cause’s attorneys from viewing the records they already had in their lawful possession,” Stern writes. “As Melissa Boughton reported at the time, Strach told Wake County Superior Court that he wanted the documents returned to Hofeller’s estate and implied that Lizon procured them improperly. Hofeller’s widow, Kathleen, expressly permitted Lizon to take the materials—but Strach claimed that Kathleen has been institutionalized and may not have been sufficiently competent to provide consent. (Stanton Jones, an Arnold & Porter attorney representing Common Cause, told the court that Kathleen has not been declared incompetent.) The court ignored Strach’s pleas, instead simply directing Common Cause’s attorneys to let the Republican defendants copy the Hofeller drives, pursuant to state law.”

That was Piece of Dumb Luck #3.

(Attorneys for the two sides have also been arguing over whether all the data is to be brought forth, or information that appears to be personal should be withheld.)

The result of finding the files is that people opposed to adding a citizenship question on the census were able to find out that there were other motives behind adding the question – just in time to file a motion about it before the Supreme Court, which is about to address the citizenship question, and just before the North Carolina gerrymandering case comes to trial in July.

Perry Mason would be proud.


May 31, 2019  10:28 PM

Another Judge Rules Against Compelled Biometric Cellphone Unlocks

Sharon Fisher
privacy, Security

A second judge has ruled that compelling someone accused of a crime to unlock their cellphone with biometrics, such as a fingerprint or their face, violates the person’s Fifth Amendment right against self-incrimination.

That’s after a first one in January.

As with a number of the cases around law enforcement trying to get information out of a person’s cellphone or laptop, the crime in question was child pornography. Exactly which law enforcement agency was involved, in which city, and the suspect’s name were all sealed.

The judge in question is Idaho Chief U.S. Magistrate Judge Ronald E. Bush. “Using the individual’s fingerprints for this purpose would constitute a search and seizure under the Fourth Amendment,” he writes in his ruling. “For a search and seizure to be lawful under the Fourth Amendment it must be ‘reasonable.’ A search or seizure is unlawful, and therefore unreasonable, when it violates a person’s constitutional rights. Here, compelling the use of the individual’s fingerprints violates the Fifth Amendment right against self-incrimination because the compelled unlocking of the phone with fingerprints would communicate ownership or control over the phone. Because the compelled use of the individual’s fingerprints violates the Fifth Amendment, the search and seizure would not be reasonable under the Fourth Amendment. Thus, the Fourth Amendment and the Fifth Amendment prohibit the result sought by the Government.”

In contrast, “Furnishing a blood sample, for instance, or providing a handwriting or voice exemplar, standing in a lineup, or submitting to fingerprinting for identification purposes are not testimonial communications because such actions do not require the suspect ‘to disclose any knowledge he might have’ or to ‘speak his guilt,’” Bush continues.

Bush also notes that there were at least four other cellphones in the house with the suspect, so it isn’t at all clear that this particular cellphone was known to belong to the suspect. “The applicant avers that, when questioned at the residence at the time the earlier search warrant was executed, the individual told law enforcement his/her phone was in the bathroom. A phone was found in a bathroom, and the application implies that the individual was not in the bathroom when that statement was made,” Bush writes. “But three other phones were also located during the search. There is no specific information about how many bathrooms were in the residence. There is no information about whether the individual lives alone or whether anyone else lives or was in the residence at the time of the search. To be clear, none of these facts are determinative of the Court’s conclusion in this case. But they do illustrate that any connection between the individual and the phone at issue here is more tenuous than it might be under other circumstances.”

As in the January case, the judge is a magistrate, meaning his ruling could be overturned on appeal, as was a 2017 case in Illinois. In fact, law enforcement agencies are already trying to overturn the January case, using the Illinois case as a precedent, because it “held that no Fifth Amendment testimonial act occurs when agents press a subject’s fingers against a Touch ID sensor on an iPhone, because ‘the government agents will pick the fingers to be pressed on the Touch ID sensor, so there is no need to engage the thought process of any of the residents at all in effectuating the seizure,’ and applying the fingerprint to the sensor ‘is simply the seizure of a physical characteristic, and the fingerprint by itself does not communicate anything,’” the U.S. Attorney in California writes.

The reason this is an issue is that for some time now, it’s been true that, while people may or may not be required to give their cellphone passwords to law enforcement, they were required to provide fingerprints and other biometric identifiers. That’s because a fingerprint is a physical characteristic – something you are rather than something you know – similar to the way that you can be compelled to give up a blood sample to test for alcohol. And just last August, law enforcement forced a suspect to unlock their iPhone with their face. These were all cited in the request to overturn the January ruling.

It’s also important to point out that, in both the January and May cases, it wasn’t altogether clear that the cellphone in question belonged to the suspect, and the case could indeed be made that using biometrics to unlock it would prove ownership. It’s not clear, for example, that the judges would have made the same ruling if there was a single person and a single cellphone in the house, making it much easier to demonstrate the cellphone in question belonged to the suspect.

In any event, with this ruling, and the one in January (as well as the similar one in Illinois in 2017 that was overturned), it’s getting more likely that this will eventually wind up in front of the Supreme Court.


May 27, 2019  12:18 AM

Supreme Court Rules Against E-Discovery Costs

Sharon Fisher
E-discovery, government

After years of courts encouraging litigants to use electronic discovery, the Supreme Court has now ruled that, well, you can, but you can’t count on getting reimbursed for it by the losing litigant – and now people are freaking out about how to get it paid for.

It all started in 2010 when Oracle sued Rimini Street for copyright infringement. The details don’t really matter. (If you’re really dying to know, here they are.)

Oracle won the case, and in the process asked for reimbursement of what it said was more than $12 million in costs related to e-discovery. That particular aspect of the case has been bouncing around the courts ever since. Originally the court said yes, then an appellate court said yes, and then the Supreme Court said no – meaning Oracle has to give back the $12.8 million that Rimini Street paid it.

Keep in mind that the various expenses claimed by Oracle actually added up to more than the damages they were awarded.

The actual legal argument is one of those “it depends on what your definition of ‘is’ is” cases: It all depends on your definition of “full costs.”

See, there’s a statute that spells out six specific kinds of legal costs litigants can ask for, but then the Copyright Act says “full costs.” So does that mean you can only ask for those six kinds of legal costs, or can you actually ask for all the costs, because the law says “full costs”?

“Title 28 U.S.C. §1821 authorizes witness and mileage fees,” writes Mike Quartararo in Above the Law. “Section 1920 provides for six specific items which a prevailing party may seek to recover as costs. And §505 of the Copyright Act provides that in a copyright infringement action ‘the court in its discretion may allow the recovery of full costs by or against any party.’”

(Want more of the specific arguments?)

The problem is that courts haven’t given a definitive answer about whether e-discovery costs are recoverable, Quartararo writes. “Unfortunately, for more than 10 years, the courts have been indecisive on the issue of recovering eDiscovery costs,” he writes. “Some courts have found the costs recoverable; others have found the opposite; and yet others have split the baby and found partial costs recoverable.”

Courts first started ruling against collecting e-discovery costs in 2012, writes Robert Hilson in In-House.

Fortunately, that’s exactly what we have the Supreme Court for. The newest Supreme Court justice, Brett Kavanaugh, got to write the opinion on what was a unanimous decision, and he said no, in the most delightful way possible.

“The adjective ‘full’ in §505 therefore does not alter the meaning of the word ‘costs,’” Kavanaugh writes. “Rather, ‘full costs’ are all the ‘costs’ otherwise available under law. The word ‘full’ operates in the phrase ‘full costs’ just as it operates in other common phrases: A ‘full moon’ means the moon, not Mars. A ‘full breakfast’ means breakfast, not lunch. A ‘full season ticket plan’ means tickets, not hot dogs. So too, the term ‘full costs’ means costs, not other expenses.”

Meanwhile, attorneys are having kittens, because if you can’t expect the other party to pay the costs associated with e-discovery, then what? So like lawyers do, they’re looking for loopholes.

Quartararo, for example, makes the case that since “exemplification” and “making copies” are allowed costs, certainly at least some e-discovery costs should be recoverable, because isn’t that exactly what e-discovery is doing?

“More litigants may now seek to recover e-discovery costs in the form of attorney’s fees. There already is some support for this approach outside the intellectual property context,” write Cory Barnes and Christina Moser in Lexology. “Moreover, courts and litigants could treat some e-discovery costs (e.g., reviewing and producing documents) as if they were paralegal expenses or research services, which are ordinarily recoverable as attorney’s fees, even though such expenses are not allowed under § 1920. Or courts could follow the Federal Circuit path of allowing such fees where there is fraud or abuse of the judicial system.”

What could happen at this point is the law could be rewritten to be more specific. In the meantime, courts, and litigants, need to deal. Who knows, people might start using e-discovery less – which judges have been trying to get them to do, with proportionality – and using automated tools such as technology assisted review more.


May 22, 2019  9:15 AM

Vibrator Data is Protected Content, Judge Rules

Sharon Fisher
privacy, Security

In case you’re curious, stored data about the intensity with which you like to use a vibrator has now been ruled to be communications “content.”

It’s true. It was ruled on by a judge and everything.

Most of the vibrator data ruling deals with jurisdictional issues, but an important part is related to stored data.

So here’s the deal. There’s this Chinese company, Hytto Ltd., that developed a vibrator, Lush, that could be controlled through a phone app over Bluetooth. Some of the communication involved settings such as intensity. Depending on how the app was set up, partners – or whoever had access to the app – could also control the intensity, even from a long distance.

“Long distance couples can connect their Lush devices to their cellphones via Bluetooth using Hytto’s Body Chat app,” writes Helen Christophi in Courthouse News. “When two people use the app together, either partner can select and transmit the vibration intensity for the paired device.”

Amuse your friends! Have fun at parties!

The thing is, the company that developed the vibrator and app was storing this data, including frequency, date, time, and intensity of use, on its own servers. Moreover, it was associating this vibrator data with the email addresses of the people involved. So it wasn’t anonymized or aggregated, but personally identifiable. Finally, it didn’t tell users it was doing this, let alone ask their permission.

(One wonders, how could the company have monetized this vibrator data? Hmm.)

One of the product’s 34,000 users, “S.D.,” found out about this, and filed a class-action lawsuit in January 2018, saying that collecting this data violated the federal Wiretap Act, because it was intercepting the contents of an electronic communication.

The company claimed that it wasn’t, because the transmissions didn’t count as “content.”

Not so much, a judge ruled.

“The Wiretap Act defines ‘content’ as ‘any information concerning the substance, purport, or meaning of that communication,’” Christophi writes. “The law excludes ‘record’ information – data automatically generated when a communication is sent, such as the origin or length of a phone call.”

Hytto did win some concessions. U.S. District Judge Jeffrey White agreed that some of the data – date and time – were “record” data, because they were automatically generated. But because users enter “desired strength of touch” into the app to set vibration intensity, that data should be considered content, the judge ruled.

“Protected ‘content’ under the Act is a person’s ‘intended message to another’ and the ‘essential part’ of a communication,” White writes. “Unlike record information, content is generated not automatically, but through the intent of the user.”
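To put that distinction in data terms: of the fields Hytto was reportedly logging for each transmission, only the value the user deliberately enters would count as content under this reasoning. Here’s a minimal sketch in Python – the record layout and field names are purely illustrative, not Hytto’s actual schema:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class VibrationEvent:
    """One logged transmission; field names are hypothetical, not Hytto's schema."""
    user_email: str     # account identifier Hytto reportedly tied the data to
    sent_at: datetime   # generated automatically when the message is sent
    intensity: int      # "desired strength of touch" the user enters in the app

def record_fields(event: VibrationEvent) -> dict:
    """Metadata generated automatically -- the judge treated date and time as
    'record' information, like the origin or length of a phone call."""
    return {"sent_at": event.sent_at.isoformat()}

def content_fields(event: VibrationEvent) -> dict:
    """The user's intended message -- the judge ruled the intensity setting is
    'content' because the user chooses it rather than it being auto-generated."""
    return {"intensity": event.intensity}

if __name__ == "__main__":
    event = VibrationEvent("s.d@example.com", datetime.now(), intensity=7)
    print("record: ", record_fields(event))
    print("content:", content_fields(event))
```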

Some of the ruling is rather sweet – poetic, even. “Individuals, of course, communicate by touch all the time. A pet owner can communicate to its dog, by tugging (gently) on the leash, the owner’s desire that the dog stop walking or slow down. A person can communicate his happiness to see a friend by a hug or a handshake,” White writes. “It is only with the evolution of certain technologies that the conveyance of such unspoken communications is now apparently not limited to situations where both the sender and recipient of touch-based communication are in the same location. The involvement of technology in the transmission of data does not change the character of the data. That the internet is used to effect a touch-based communication does not change the essential character of that communication.”

Hytto also tried to claim that it needed to store this information to provide the service that the customer was paying for, but White wasn’t buying that, either. “Hytto does not explain how the collection of the communication is necessary to enabling users to use an app to control the vibration intensity of a paired sex toy,” he writes. “Put another way, Hytto has failed to explain why it would be difficult or impossible to provide its service without the objected-to interception, particularly where the FAC alleges that Hytto markets the app as functioning peer-to-peer. Hytto’s argument, therefore, is not persuasive.” He did, however, say that the company could feel free to elaborate on this during the actual trial.

(“And for this I went to law school,” Judge White might have been heard to mutter.)

It didn’t help that the company had apparently promised that it wouldn’t do exactly what it ended up doing. “The FAC includes a snapshot from Lovense.com that reads: ‘We take your privacy very seriously. We have designed our system to record as little information about our users as possible. Absolutely no sensitive data (pictures, video, chat logs) pass through (or are held) on our servers. All data transfers are peer-to-peer. Furthermore, we encrypt the data before passing it along to your partner,’” White writes. “After reviewing this policy, a reasonable person could conclude that Hytto would not harvest data about how its Body Chat app or paired devices were used.”

S.D. can amend her complaint to remove the time and date component by June 14, White ruled. And then, it’s back to court.


May 14, 2019  9:20 AM

The Strange Case of Seattle University’s ‘Lost’ Laptop

Sharon Fisher
privacy, Security

There’s been another case of Companies Behaving Badly with customer data: In this particular case, Seattle University’s “lost” laptop.

“On March 28, 2019, Seattle University was informed by an employee that an unencrypted university-issued laptop was lost while the employee was commuting on a bus on March 26, 2019,” noted the university in its report, entitled Data Security Incident.

There are several things to unpack in that sentence.

How do you “lose” a laptop on a bus? I can understand “I *left* a laptop on a bus” or “My laptop was stolen on a bus” but how do you lose one?

(In general, that whole sentence is a lovely example of the passive voice being used to remove agency from someone. What’s wrong with “An employee told Seattle University that they had left a laptop on a bus”?)

If the employee “lost” the laptop on March 26, why did it take until March 28 before the employee reported it? It’s not like it was over a weekend; we’re talking a Tuesday and a Thursday.

Let’s move on.

“After learning of the situation, the university immediately began an investigation led by Information Technology Services and has been able to confirm there were files on the laptop that contained the names and Social Security numbers of 2,102 current and former faculty, staff, and their dependents. Although no files with sensitive data were saved directly to the local hard drive, an offline email cache file on the laptop contained attachments with personal information.  The main file of concern was the result of an isolated incident in which an outside vendor emailed the file in error.”

How do they know this? How can they tell what’s on the laptop?

What is an “offline email cache file” and how do they know what’s in it?

Why is an outside vendor emailing unencrypted personally identifiable information (PII) to an employee in the first place, accidentally or not?

How does a vendor accidentally email a file with more than 2000 records of PII?

What was special about these more than 2000 people that they were on a list?

Is Seattle University still using that vendor?

How long ago did the vendor email that file? In other words, how long has this unencrypted PII been sitting in the employee’s laptop?

And more.

“The university recently hired a Director of Cybersecurity and Risk who has been actively involved in leading the efforts to investigate this incident.  In addition, we are redoubling our efforts to encrypt data on all university-managed laptops.”

To what degree was that hiring in response to this incident? Or was this incident simply a great example of why that person needed to be hired? What else has happened that led to that person being hired?

What efforts had the university already made to encrypt data on all university-managed laptops? What was keeping those efforts from working? What does “redoubling” consist of in this case?
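Not that it answers any of those questions, but “encrypt data on all university-managed laptops” is at least a checkable claim. Here’s a minimal sketch of the kind of audit an IT shop might script, assuming Windows laptops with BitLocker – the university’s notice doesn’t say what it actually manages:

```python
import subprocess

def bitlocker_protection_on(drive: str = "C:") -> bool:
    """Return True if BitLocker reports that protection is on for the drive.

    Assumes a Windows machine with the built-in manage-bde tool; in a real
    fleet this would more likely be queried centrally through management
    software rather than per machine.
    """
    result = subprocess.run(
        ["manage-bde", "-status", drive],
        capture_output=True, text=True, check=True,
    )
    # manage-bde -status prints a line such as:
    #   Protection Status:    Protection On
    for line in result.stdout.splitlines():
        if "Protection Status" in line:
            return "Protection On" in line
    return False

if __name__ == "__main__":
    print("System drive encrypted:", bitlocker_protection_on())
```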

Both the Seattle Times and the Associated Press have done articles on the incident, but the articles are simply rewrites of the security notice and don’t provide any additional information.

This isn’t the first time such incidents have happened with Washington universities; the Seattle Times noted that both Washington State University and the University of Washington have had similar incidents.


April 30, 2019  10:56 PM

Mass Media Discovers Google Location Database

Sharon Fisher
Database, privacy, Security

Privacy advocates are getting up in arms about law enforcement asking Google for location data, and something might actually get done about it – but perhaps not the best thing.

Basically, law enforcement people investigating a crime ask Google for location data of cellphones near the crime, and Google provides them anonymized data. If some of the data looks particularly interesting, such as matching up very closely with the crime, law enforcement then gets a warrant and Google provides the identity of the phone in question.
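To make the mechanics a little more concrete, that first step boils down to a radius-and-time-window filter over anonymized location pings. Here’s a minimal illustrative sketch – this is not Google’s actual Sensorvault interface, and the field names are assumptions:

```python
import math
from datetime import datetime

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # Earth's radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def devices_near_scene(pings, scene_lat, scene_lon, start, end, radius_m=150):
    """Return anonymized device IDs seen inside the geofence during the window.

    `pings` is an iterable of dicts like:
    {"device_id": "anon-123", "lat": 40.7, "lon": -74.0, "time": datetime(...)}
    """
    return {
        p["device_id"]
        for p in pings
        if start <= p["time"] <= end
        and haversine_m(p["lat"], p["lon"], scene_lat, scene_lon) <= radius_m
    }

if __name__ == "__main__":
    pings = [
        {"device_id": "anon-1", "lat": 40.7128, "lon": -74.0060,
         "time": datetime(2019, 4, 1, 22, 5)},
        {"device_id": "anon-2", "lat": 40.7300, "lon": -74.0000,
         "time": datetime(2019, 4, 1, 22, 10)},
    ]
    print(devices_near_scene(pings, 40.7128, -74.0060,
                             datetime(2019, 4, 1, 22, 0),
                             datetime(2019, 4, 1, 23, 0)))
```

Only the device IDs that survive that narrowing would then be the subject of the follow-up warrant for identifying information.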

If this sounds familiar, it’s because people have actually been talking about it for a while. The difference? Now the New York Times is reporting on it.

That’s getting people’s attention.

“The leaders of the House Energy and Commerce Committee sent Google a list of questions Tuesday concerning a database maintained by the internet giant that tracks hundreds of millions of users’ locations and that is reportedly shared often with local, state and federal law enforcement agencies,” writes Benjamin Freed in StateScoop. “‘The potential ramifications for consumer privacy are far reaching and concerning when examining the purposes for the Sensorvault database and how precise location information could be shared with third parties,’ reads the letter to Google CEO Sundar Pichai,” he writes.

Wait. Congress is blaming Google for this?

One could say, sure, of course it’s Google’s fault, because if the data weren’t there, law enforcement wouldn’t ask for it.

On the other hand, having that data can be handy for all sorts of legitimate reasons. It can be useful to see where one has gone over time. There are mileage trackers for expense reports and fitness, for example, that take advantage of location data.

Oddly, Congress doesn’t seem to be doing anything about, say, restricting law enforcement’s ability to gain access to this data. Maybe I’m weird, but it seems to me that that’s the path they should be going down.

Cellphone location data has been a big deal lately, with cases such as Carpenter limiting the sort of data that law enforcement can get from cell towers, which store location information from phones on a regular basis even if the phone isn’t being used (and even, apparently, if you turn location data off). The courts haven’t quite caught up to this particular issue yet, and to the extent that this has been litigated, thus far courts have ruled that “location dumps” are acceptable.

Consequently, Congress actually has the right – indeed, the duty – to write legislation to control the use of such data, once it’s collected.

But that doesn’t seem to be where Congress is going with this. It’s as if Congress discovered that people carrying large amounts of money could get robbed, but instead of making robbery the crime, they make carrying large amounts of money the crime.

And, of course, there are advantages to this line of thinking, if you can get people to go along with it. If you’re worried about people having too much control over their own destiny, limiting the amount of money they can carry — so they can’t easily leave the country without detection, for example — is an easy way to do it. All you have to do is convince the general public that only bad people — like crooks or drug dealers — would carry that much money, and people who don’t see reasons to carry that much money themselves will go along with it.

Tried to deposit or withdraw more than $10,000 in cash lately?

Similarly, people who don’t see why such location data has a legitimate purpose may easily be convinced that there’s something nefarious about the very collection and possession of that data. And if, for example, such data could be declared a public record when it’s about a public official, and public officials don’t want such data about themselves made public, an easy way to prevent that is to stop the collection of that data in the first place.

If Congress is actually concerned about the nefarious use of such data, then let’s see them fix that problem instead. Update computer privacy laws that date from the 1980s. Ensure that people know that location data is being collected, and give them better tools to stop collecting it, check it, encrypt it, and delete it. Limit the ability of third parties — including law enforcement — to gain access to that data without the person’s permission. But don’t blame the data for the use that people are making of it.


April 26, 2019  8:36 AM

Remember the USB That Sets Computers on Fire? Somebody Used One. 66 Times.

Sharon Fisher
Security

Longtime readers may remember a piece from 2015 about a USB drive that could hypothetically set a computer, or whatever else it was plugged into, on fire. At the very least, it could zap it dead. It turns out that a guy has now done just that to 66 devices at his alma mater, which was not someplace in Silicon Valley but an Albany-based former Catholic school for girls.

The alleged culprit is Vishwanath Akuthota, who graduated from the school in 2017 with a master’s in business administration, as well as a certificate in Computer Information Systems.

What’s kind of funny about the whole situation is that the guy was known. The College of St. Rose, where the incident allegedly occurred – on Valentine’s Day – still has an interview with the guy on its Facebook page, dating from 2016, when he was a graduate assistant in the music department.

Needless to say, the comments are interesting.

Akuthota was also active on GitHub starting in September 2017; that activity – culminating in 34 commits in January and 16 in February – abruptly stopped, which isn’t terribly surprising, as he was arrested on February 22.

Akuthota’s LinkedIn profile, however, has been taken down, as has his own Facebook page.

His future goal, Akuthota said on Facebook at the time, was to be an entrepreneur, and one can imagine he’s rather knocked that plan into a cocked hat. But he did have a listing on AngelList, which is sort of a Monster.com for startups, where he said he was an application developer for the New York State Office of Information Technology Services, working on IBM’s Watson artificial intelligence system. “I want to contribute for the 4th industrial revolution with artificial intelligence. let’s make it great,” read his bio. As his achievements, he listed “I’ve lunched world’s first talking interface for the chatbot. I’ve lunched New York states first chatbot with in 4 days.”

Okay. So let’s hear how he lunched 66 devices.

“Akuthota admitted that on February 14, 2019, he inserted a ‘USB Killer’ device into 66 computers, as well as numerous computer monitors and computer-enhanced podiums, owned by the college in Albany,” reports the U.S. Attorney’s Office from the Northern District of New York. “The ‘USB Killer’ device, when inserted into a computer’s USB port, sends a command causing the computer’s on-board capacitors to rapidly charge and then discharge repeatedly, thereby overloading and physically destroying the computer’s USB port and electrical system.

“Akuthota admitted that he intentionally destroyed the computers, and recorded himself doing so using his iPhone, including making statements such as ‘I’m going to kill this guy’ before inserting the USB Killer into a computer’s USB port.  Akuthota also admitted that his actions caused $58,471 in damage, and has agreed to pay restitution in that amount to the College.”

How Akuthota’s going to earn that money isn’t clear. One might imagine that a computer career is not in the cards.

Akuthota pled guilty to a single count of causing damage to computers and is scheduled to be sentenced on August 12. He faces up to 10 years in prison, a fine of up to $250,000, and a term of post-imprisonment supervised release of up to 3 years.

There are still a number of remaining questions. The office noted that Akuthota is a citizen of India, residing in the United States on a student visa.  He has been in custody since he was arrested in North Carolina on February 22. Will he be deported? If so, before or after he serves his time and pays restitution? If he graduated in 2017, why was he still in the U.S. on a student visa in the first place? How did he come to be in North Carolina? What led him to do it?

Most notably, how could he do it? “The defendant did not have, and knew he did not have, permission from the College to insert the ‘USB Killer’ device into any of the College’s computer hardware or otherwise ‘kill’ the College’s computer hardware,” notes the plea agreement, just in case that was in question.

Does St. Rose typically let people who haven’t been in the school for two years come in and mess around with the computers unsupervised?

Hopefully they don’t now.


April 22, 2019  9:00 AM

Virginia Court Throws Out License Plate Reader Data Collection

Sharon Fisher
government, privacy, Security

As you may recall, police departments and other organizations have been loading up on automated license plate readers that help them track automobile locations, and even selling the data. Now, at least one judge, in Fairfax County, Virginia, has told them they can’t do that.

“The ruling by Fairfax Circuit Court Judge Robert J. Smith is a victory for privacy rights advocates who argued that the police could track a person’s movements by compiling the times and exact locations of a car anytime its plate was captured by a license plate reader,” writes Tom Jackman in the Washington Post. “Police say they can, and have, used license plate location data to find dangerous criminals and missing persons. Privacy advocates don’t oppose the use of the technology during an active investigation, but they say that maintaining a database of license plate locations for months or years provides too much opportunity for abuse by the police.”

This has actually been an ongoing legal case. Originally, Smith had thrown the case out of court, saying that the data didn’t meet the statutory definition of “personal information” under Virginia’s “Data Act.” However, that ruling was overturned last year by the Virginia Supreme Court, which then sent the case back to him because it wasn’t sure whether the database met the statutory definition of “information system” under that same data act. Smith’s ruling is that it does.

Consequently, though the ruling technically applies only to Fairfax County, it’s likely that other Virginia counties using license plate readers could also be stopped. The Fairfax County police chief has said he will appeal the ruling, and state legislators – who tried to pass a law limiting the collection of such data, which was vetoed by the governor – said bring it on. Va. Sen. Chap Petersen (D-Fairfax), one of the founders of the privacy caucus and a sponsor of the failed legislation, told Jackman he was “very glad to see this ruling. I hope that Fairfax County appeals it to the Supreme Court so it can become a statewide ruling.”

Fairfax County Police, which had been storing data for up to a year, will be required to purge its database daily of license-plate reader data that isn’t linked to a criminal investigation, and to stop using license plate readers to passively collect data on people who aren’t suspected of criminal activity, writes the Electronic Frontier Foundation. The EFF, along with the Brennan Center for Justice, wrote an amicus brief supporting the case, which was brought by the American Civil Liberties Union of Virginia.
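In database terms, the daily purge the court is requiring is straightforward. Here’s a minimal sketch, assuming a hypothetical SQLite table of plate reads with a column tying a read to an active investigation – not the actual Fairfax County system:

```python
import sqlite3
from datetime import datetime, timedelta

RETENTION = timedelta(hours=24)  # assumed retention window for unlinked reads

def purge_unlinked_reads(db_path: str = "alpr.db") -> int:
    """Delete plate reads older than the retention window that aren't tied
    to a criminal investigation; return the number of rows removed."""
    cutoff = (datetime.utcnow() - RETENTION).isoformat()
    with sqlite3.connect(db_path) as conn:
        conn.execute("""
            CREATE TABLE IF NOT EXISTS plate_reads (
                plate TEXT,
                lat REAL,
                lon REAL,
                read_at TEXT,           -- ISO 8601 timestamp
                investigation_id TEXT   -- NULL if not linked to a case
            )""")
        cur = conn.execute(
            "DELETE FROM plate_reads "
            "WHERE read_at < ? AND investigation_id IS NULL",
            (cutoff,),
        )
        return cur.rowcount

if __name__ == "__main__":
    # Run from a daily scheduled job.
    print(f"Purged {purge_unlinked_reads()} unlinked plate reads")
```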

“Often mounted on police vehicles or attached to fixed structures like street lights and bridges, ALPR systems comprise high-speed cameras connected to computers that photograph every license plate that passes,” the EFF writes. “The systems then log, associate, and store the time, date, and location a particular car was encountered. This allows police to identify and record the locations of vehicles in real-time and correlate where those vehicles have been in the past. Using this information, police are able to establish driving patterns for individual cars. Some ALPR systems are capable of scanning up to 1,600 plates per minute, capturing the plate numbers of millions of innocent, law-abiding drivers who aren’t under any kind of investigation and just living their daily lives.”

In fact, the Fairfax County system could scan up to 3,600 plates per minute, according to the Fairfax County Times.

The result was a gigantic database of people’s locations, including information that could be used for political purposes. “The state of Virginia knows the plate number of every vehicle that crossed a Potomac River bridge from Virginia into the District of Columbia on the day of the first Obama inauguration,” writes Clifford Atiyeh in Car and Driver, which isn’t where one often expects to find privacy information. “It also has the plate of every vehicle that showed up at the site of a Sarah Palin rally in a D.C. suburb. In fact, as of 2013, it had eight million license plates scanned and saved in a database. At our last count, there were three billion license-plate data points across the country, in states with little or no data privacy protections for drivers who’ve done no crime but drive.”

Similarly, the ACLU has disclosed that the federal Immigration and Customs Enforcement agency was tapping into a national database of police and private license plate readers, Jackman adds.

While 16 states have some sort of regulation on license plate reader data, the remaining 34 do not, Atiyeh writes.

Fox 5 DC pointed out a variety of cases where license plate readers had helped catch criminals, but that’s not exactly the point. There are any number of techniques that could be used to help catch criminals, if you don’t care about protecting people’s privacy.

