Yottabytes: Storage and Disaster Recovery


June 14, 2016  10:48 AM

What’ll OpenText Do With Recommind?

Sharon Fisher
E-discovery, ediscovery, Opentext

The e-discovery acquisition market has been pretty much a snore the past few years, after several years of musical chairs. But things might start heating up again due to a major acquisition from an unexpected source: enterprise information management software vendor OpenText has acquired Recommind for a reported $163 million, scheduled to close in Q1 2017.

Recommind has been reliably, though quietly, in the leaders section of the Gartner magic quadrant for e-discovery software for years now (though it did start out in 2011 as a “visionary”). The acquisition might lead other vendors to take a look at the other leaders – something Gartner actually predicted in last year’s MQ – particularly since 2015 featured a number of other peripheral e-discovery acquisitions.

Incidentally, you’ve got to figure that Gartner has to be really POed about the timing of this announcement; it typically releases its annual Magic Quadrant on e-discovery about this time of year, and now it’s either going to have to scramble to rewrite it or release it knowing it will immediately be out of date.

Ian Lopez of Legaltech News points out that Recommind had lost several executives recently. “Among them are former VP of business development Dean Gonsowski, who is now with kCura; Bob Clemens, who left his position as vice president of sales in America at Recommind to join CS Disco; and Philip Favro, who left his post as Recommind’s senior discovery counsel to join Driven as a consultant,” he writes.

What’s particularly interesting about this acquisition is that it’s OpenText. While a number of e-discovery vendors have been acquired over the years, generally the acquirers have been fairly major vendors – Symantec, HP, Microsoft, and so on. But OpenText? It’s interesting to see an enterprise information management vendor decide that e-discovery is a field worth entering. Image and Data Manager suggests it’s because of the increasing threat of litigation that companies face about their data.

And what OpenText is going to do with Recommind is not yet clear. OpenText could take Recommind’s technology, deploy it within its own proprietary offerings as Microsoft did with Equivio, and withdraw from the e-discovery industry altogether; or Recommind could remain a distinct part of OpenText, as Clearwell did when Symantec bought it, Lopez quoted Favro as saying.

OpenText has made a number of other acquisitions recently, and it’s thought that it’s not quite done. The company recently bought ANXeBusiness, a provider of cloud-based information exchange services for the automotive and healthcare industries, for about $100 million, and paid $170 million for some pieces of HP, including the HP TeamSite multichannel digital experience management platform, the digital asset management solution HP MediaBin, and the intelligent workforce optimization solution HP Qfiniti, reports the Waterloo Region Record.

In fact, content management consultant Laurence Hart thinks that OpenText might even acquire EMC’s red-headed stepchild Documentum. “We knew more acquisitions were coming after they then announced that they were raising $600 million and are planning to spend upwards of $3 billion on acquisitions over the next five years,” he writes. “It is fun to conjecture about the possibilities of Open Text acquiring Documentum from EMC. The problem is that Documentum is a huge acquisition. Documentum will cost a lot and that $600 million Open Text is raising would likely not cover the cost. The price does fit into Open Text’s 5 year plan financially but the acquisition would front-load that $3 billion in 2016.”

The company also flexed its muscle in 2014 by suing Box for $268 million for patent infringement, right as Box was going public. Eventually, it had to settle for less than $5 million.

In any event, the announcement could set off another round of musical chairs in the e-discovery market, Favro notes.

May 31, 2016  10:49 PM

Fearing Government Warrants, Firms Ditching Client Data

Sharon Fisher
Encryption, government, privacy, Security

In this day and age of “big data,” where many companies are trying to collect more and more data about their customers so they can analyze it to serve them better or sell them more stuff, some Silicon Valley companies are trying the opposite tactic: Jettisoning as much customer data as possible.

While some companies have done this before because they want to protect themselves from revealing incriminating details in the event of electronic discovery in a legal case, these Silicon Valley companies are going even further. They don’t want to possess any of their customers’ data in case the federal government comes looking for it.

Instead, the companies are decoupling encryption from the services they provide, leaving customers to encrypt their own data. Though this can reduce the performance of the companies’ products and make it harder for them to support their customers, it also means that they can’t be forced to surrender their customers’ data, because they won’t have it to give.

‘Radicalized Engineers,’ Oh My

This is all according to a recent article in the Washington Post that is drawing a lot of attention, with some people calling these companies traitors with “radicalized engineers” who are protecting terrorists. Others say the companies are responding logically to government actions such as the surveillance programs revealed by Edward Snowden and the more recent attempts to force Apple to make it easier for the FBI to break into an encrypted iPhone.

“The trend is a striking reversal of a long-standing article of faith in the data-hungry tech industry, where companies including Google and the latest start-ups have predicated success on the ability to hoover up as much information as possible about consumers,” writes Elizabeth Dwoskin. “Now, some large tech firms are increasingly offering services to consumers that rely far less on collecting data.”

It’s not just the government – the companies also don’t want to become targets for hackers by possessing a lot of valuable data – but it’s the government aspect that’s attracting the most attention, especially as the FBI and Congress continue to press for legislation to force companies to include “back doors” in their encryption products.

While such a back door might be intended for governmental agencies or law enforcement, a door is a door, and as such is a risk, contend security experts such as Bruce Schneier. Consequently, an increasing number of high-profile companies are following the lead of Apple and Google, which in 2014 turned on encryption by default, meaning they couldn’t decrypt their customers’ devices even if they wanted to.

‘So Secure Even We Can’t Read It’

The upside of this decision is that customers of these companies can feel pretty sure their data is secure, since even their own vendor can’t get at it. The downside is that if a customer loses their encryption key, they’re hosed, because not even the vendor can help them out.
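
To make the tradeoff concrete, here is a minimal sketch of the customer-held-key model, assuming a Python client and the third-party cryptography package; the VendorStore class and its method names are hypothetical illustrations, not any particular vendor’s API. The vendor only ever stores ciphertext, so it can neither answer a warrant with plaintext nor rescue a customer who loses the key.

    # Minimal sketch of the customer-held-key model described above.
    # Assumes the third-party "cryptography" package (pip install cryptography);
    # the VendorStore class and its method names are hypothetical, not any
    # particular vendor's API.
    from cryptography.fernet import Fernet


    class VendorStore:
        """Stands in for the vendor's service: it only ever sees ciphertext."""

        def __init__(self):
            self._blobs = {}

        def upload(self, customer_id: str, blob: bytes) -> None:
            self._blobs[customer_id] = blob  # opaque bytes: no key, no plaintext

        def download(self, customer_id: str) -> bytes:
            return self._blobs[customer_id]


    # Customer side: the key is generated and kept locally, never sent to the vendor.
    key = Fernet.generate_key()
    f = Fernet(key)

    vendor = VendorStore()
    vendor.upload("acme-corp", f.encrypt(b"quarterly forecasts"))

    # Only the key holder can turn the stored blob back into plaintext.
    print(f.decrypt(vendor.download("acme-corp")))  # b'quarterly forecasts'

    # If the customer loses `key`, neither they nor the vendor can recover the
    # data; that is exactly the downside described above.

The point of the design is visible in VendorStore: anything served from the vendor’s side is just an encrypted blob, which is why the vendor has nothing useful to hand over.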

But for a number of users, that’s a risk they’re willing to take. While the customers asking to hold their own encryption keys were originally large, sophisticated financial services companies such as Goldman Sachs and Blackstone, a wide variety of other companies, including media and automotive firms and small banks, are now making the same request, Dwoskin writes.

Critics claim that such actions help protect terrorists, while the companies say it forces law enforcement agencies to focus their attention where it belongs: On the suspect. “If you have an issue with my customer, go talk to my customer, don’t talk to me,” a representative from one vendor tells Dwoskin. “I’m just a tech guy, and I don’t want to be in the middle of these things.”


May 29, 2016  12:35 AM

Spokeo Decision Causes Ripples

Sharon Fisher
privacy, Security

As expected, the Supreme Court ruled before June on the case of Spokeo, Inc. v. Robins. But instead of ruling in one of the two directions the industry expected – each of which drew all sorts of dire predictions from its opponents – the court Kobayashi Maru’d its way out of it by finding a different solution.

As you may recall, the whole thing started last November when a guy, Thomas Robins, sued the data aggregator Spokeo for having inaccurate data about him in its databases. He said the inaccurate data could have made it harder for him to get a job or credit, though he couldn’t point to specific examples of this having happened. He sued Spokeo under the Fair Credit Reporting Act (FCRA), which requires such database companies to use accurate information.

Ironically, the real upshot was not this particular case itself, but what a final ruling could mean in terms of precedents. Spokeo supporters warned that finding in favor of Robins would mean that practically anybody could file a class-action suit on practically any tiny technical detail that some company screwed up, potentially costing millions or billions. Robins’ supporters warned that finding in favor of Spokeo would mean that nobody would ever be able to file a class-action suit ever again, unless each member could point to specific, enumerated injuries.

What added an extra fillip to the whole situation was Justice Antonin Scalia’s sudden death in February. That left open the possibility that the Supreme Court could be deadlocked 4-4 on the case – which would have had the effect of upholding the Ninth Circuit’s verdict, because a majority would have been needed to overturn it, but without setting a precedent for future cases.

As it happens, the Court voted 6-2 on its final decision, which was to send the case back to the Ninth Circuit to work out more details. Specifically, the Court directed the Ninth to consider two aspects of the injury Robins alleged:

  • Whether it was particular. “There was no dispute that Robins had satisfied the particularity requirement of injury in fact,” writes James McKenna of Jackson Lewis, in National Law Review. “The injury he alleged was based on the violation of his own statutory rights and was due to the inaccurate information about him.”
  • Whether it was concrete. “A concrete injury must be ‘real’ and not ‘abstract,’” McKenna writes. “There are three aspects of concreteness. First, it is not synonymous with being tangible. A concrete injury need not be tangible. Intangible injuries, such as the abridgment of free speech rights, can be concrete.” Second, while Congress’s judgment in creating a statutory right is important, the fact that a law gives someone the right to sue doesn’t by itself make the injury concrete, he writes. “Third, the ‘risk of real harm’ can constitute a concrete injury, even if the harm may be difficult to prove or measure,” he adds.

Where the Ninth had erred, the Supreme Court ruled, was by not considering those two points separately in the first place, especially the concreteness one. The Supreme Court didn’t say the Ninth was wrong, simply that, like Srinivasa Ramanujan, it hadn’t proved it enough. What actually “counts” as an injury, and how can the Ninth persuade the Supreme Court about this case?

“FCRA is designed to prevent harm resulting from a lack of ‘fair and accurate credit reporting,’ but inaccurate reporting is not a harm, in and of itself, because the incorrect information may not impact, or create a material risk of impact to, the consumer,” write Rand L. McClellan and Keesha N. Warmsby of Baker Hostetler for Mondaq. “But an agency’s failure to disclose information, which must be disclosed by statute, however, immediately violates the very right (i.e. access) the statute was designed to protect, and therefore constitutes injury-in-fact.”

(Interestingly, some states don’t have these distinctions.)

The two dissenters, who wrote their own opinion starting on page 21 of the decision, were Justice Ruth Bader Ginsburg and Justice Sonia Sotomayor, who said they believed that the Ninth Circuit had made enough of a case for the “concreteness” of the issue.

What’s interesting is that, citing this most recent decision, several lower courts have already tossed out cases by saying the person or class didn’t have standing. For example, several data breach cases where people were suing a company for losing their personally identifiable data have been thrown out because the people can’t point to specific harms that have occurred due to the loss of their data.

“It will be a long while until the lower courts decide who won Spokeo – but it is already clear that defendants in privacy class actions are going to wield the Supreme Court ruling like a weed wacker,” writes Alison Frankel in Reuters.

Similarly, people who sue companies providing robocalls in violation of the law against them could have problems, write Diana Eisner, Christine Reilly, and Marc Roth of Manatt, Phelps & Phillips, LLP in JD Supra Business Advisor.  “Plaintiffs may need to allege a concrete or real harm,” they write. “This may be difficult, given that so many Americans are on ‘unlimited’ or flat-rate cell phone plans where no charges are incurred for incoming calls or text messages and no other ‘injury’ exists other than an alleged privacy invasion. In cases where the plaintiff did not answer the phone or know about the call absent the use of Caller ID, the plaintiff may be unable to allege a concrete harm stemming from the unanswered call, potentially shuttering the lawsuit.”

So if the Ninth can’t figure out a way to prove the concreteness aspect, the ultimate result may be very much like what the Spokeo supporters feared.

It isn’t clear when the Ninth Circuit is expected to rule, but if it decides to rule in a way that sends the case back to the Supreme Court, it may be next year – after the new President we elect in November chooses a new Supreme Court Justice who can make it through the approval process — before a decision is made.


May 24, 2016  1:42 PM

Gov. Vetoes Arizona Cloud Mandate

Sharon Fisher
cloud, government

Arizona Governor Doug Ducey has vetoed a bill intended to encourage state agencies to move to the cloud – at the risk, some said, of jail time for agencies that didn’t comply. While the Arizona cloud mandate bill has failed, it may still be coming to a state near you.

The bill, Senate Bill 1434, specified that the IT department would adopt a policy to establish a two-year hardware, platform, and software refresh cycle that required each budget unit to “evaluate and progressively migrate” the group’s IT assets “to use a commercial cloud computing model or cloud model,” according to the bill. Budget units were directed to consider purchasing and using cloud computing services before making any new IT or telecommunications investment.

Moreover, each budget unit had to develop a cloud migration plan by next January, and thereafter report twice a year to the Arizona chief information officer (CIO) on how its migration was progressing.

It wasn’t obvious where the Arizona cloud mandate came from, such as whether it was written by legislative staff or what. It had 24 bipartisan cosponsors and was supported overwhelmingly by the legislature at every step of the process.

Imagine, Cloud Fans Supported It

Certainly, cloud supporters were all over it. “The endless hand-wringing over the use of the cloud needs to end,” writes David Linthicum, a consultant at Cloud Technology Partners, in InfoWorld.  “It may take laws such as this to get at least the public sector moving in the right direction. It might even save them — and thus all of us — money.”

There’s no argument that the cloud has advantages over a traditional data center. It is funded through operational money rather than capital money. Its expenses can be more predictable. It can be expanded and contracted more quickly and granularly than a data center. It doesn’t require as many skilled staff members. It offers the possibility of being more reliable. “The State of Arizona has already migrated its DNS solution to AWS, which saves it approximately 75 percent in annual operating costs on its DNS solution compared to its previous on-premise infrastructure,” notes Nicole Henderson in the blog Talkin’ Cloud.

But to the extent of mandating jail time if state CIOs don’t comply? (To be fair, only Linthicum mentioned this aspect, only in passing, and he didn’t say where it came from.)

Clouds? In Arizona?

And why Arizona? Arizona CIO Morgan Reed, appointed in October, 2015, was formerly Expedia’s director of data center services, according to Jake Williams at StateScoop.  Before that, he served in a variety of positions with Web hosting company GoDaddy.com, including leading IT governance, disaster recovery and business continuity, as well as leading global data centers, IT systems and advanced support operations in previous roles, Williams writes. (Arizona lost its top three IT executives at the beginning of 2015 as they all quit to join the private sector; Reed replaced an acting CIO.)

“When Reed speaks of the benefits of moving at the pace of the cloud, he does so with prior experience,” writes Ricky Ribeiro in StateTech. Reed goes on to say that Arizona wants to be “as agile and accelerated as the private sector.”

On the other hand, some commenters have speculated that it’s simply a conservative Republican attempt to streamline government and turn over as much as possible to the private sector.

Not to mention, do we really want legislators mandating the technology that state agencies should use? When we’ve seen the tenuous grasp that government officials have on technology such as encryption, is it a good idea for them to pick winners and losers in technology?

The veto message from Ducey, a Republican, wasn’t very enlightening. It consisted of two sentences. “It’s time for state government to enter the 21st century, and major advances in technology are needed to get there,” he writes. “This bill appears to add extra layers of bureaucracy that are unnecessary and will stall needed advancements in technology.” So perhaps he was convinced by the argument that legislators shouldn’t be deciding on technology.

Next Steps

The Arizona Legislature has already adjourned, so it won’t be able to attempt to override the Governor’s veto. It remains to be seen whether someone will attempt to bring the bill back next year.

All that said, some predict that other states, Ducey’s veto notwithstanding, will follow suit, noting that the federal government started it all in 2010 with its cloud mandates. “If the proposal becomes law and is successful, count on other states to follow the same path,” Linthicum writes. “That would be a good development because state agencies have built a whole lot of planet-heating data centers over the last 20 years and are planning to build many more.”

Even if you’re not the sort of person who typically follows your state legislature, it might be a good time to start.


May 13, 2016  11:10 PM

Man Jailed for Forgetting an Encryption Passcode

Sharon Fisher
Encryption, government, privacy, Security

It’s like one of those nightmares where you keep trying to type in a password and it doesn’t work. Except in this case, it’s real life, and a man has been in jail for seven months – without being charged – for forgetting an encryption passcode.

Francis Rawls, a former sergeant in the 16th district of the Philadelphia Police Department, has been accused of having child pornography on two encrypted Macintosh hard drives, which had been seized in March, 2015. He was ordered by a judge in August, 2015, to provide the passcode to decrypt the drives, but he claims to not remember it.

I don’t know about you, but I’ve been known to forget a password over a long weekend, let alone five months.

Consequently, for the last seven months, Rawls has been in jail for contempt of court for refusing the judge’s order, though he hasn’t been charged with a crime. His attorney is claiming that having to provide the passcode violates his Fifth Amendment right against self-incrimination.

Back story

Courts have gone back and forth on the issue for several years now and, most recently, have decided that a phone password is more like the combination to a safe than a physical object such as a key. It matters because something that is the expression of one’s mind, like the combination to a safe, is protected under the Fifth Amendment right against self-incrimination. A physical key, something you possess, is something you can be forced to produce.

The Fifth Amendment angle is, in fact, why police wanted Rawls to type in the passcodes himself rather than tell someone else what they were – because it was thought that the latter would amount to testifying against himself, writes Christine Hauser in the New York Times.

“Quite sensibly, Rawls refused,” writes William Grigg on the website Personal Liberty. “He is a veteran cop and knows – better than the public he supposedly served in that capacity – what happens when a targeted citizen offers the police unrestricted access to his home and personal effects. If he had acceded to the demand for his encryption codes, Rawls would have done the equivalent of allowing the police to rummage through every room, closet, and drawer in his home, while letting them inspect all of his correspondence, medical records, and personal finances. Diligent and motivated investigators would eventually find something that an ambitious prosecutor could use to manufacture a felony charge.”

The government is claiming it wouldn’t be self-incrimination because it knows there’s child pornography on the hard drives. “If the government already knows there’s child pornography on Rawls’s hard drives, then it’s not self-incrimination for Rawls to give his passwords,” explains Travis Andrews in the Washington Post. “Think of it like a search warrant: if an officer of the law is granted a search warrant to someone’s house, then that suspected party has no recourse but to allow the officer enter his house. Much like that hypothetical person wouldn’t be able to lock the door, Judge Rueter ruled that Rawls can’t refuse to provide the passwords.”

On the other hand…

Needless to say, Rawls’ defense attorney takes issue with this belief. “The government has not, as required by the relevant decryption precedents, demonstrated that the storage of any particular file or files on the hard drives is a foregone conclusion,” notes the brief. “Instead, it put forward only a suspect witness who gave attenuated testimony and a forensic examiner who was unable to offer any authoritative opinion regarding the contents of the hard drives.”

Nor does his attorney agree with the dodge of having Rawls type in the passcodes rather than provide them to someone else. “The fact that the government seeks to have the codes entered into a forensic interface rather than spoken aloud does not change the analysis,” notes the brief. “It still demands that Mr. Doe divulge the contents of his mind, not demonstrate his typing skill.”

In an interesting coincidence, Rawls is being compelled to enter his passcode using the All Writs Act, the same act under which the Federal Bureau of Investigation recently tried to force Apple to develop an operating system to make it easier for it to break into a suspected terrorist’s iPhone.

After Rawls was accused of having child pornography on his iPhone 6, he did enter the passcode to decrypt that, and no child pornography was found. Investigators were able to decrypt his Mac Pro using a passcode found on the iPhone 6, without finding child pornography on it, according to a brief filed in the case. And he did try, reportedly for several hours, to come up with the passcode to decrypt his hard drives.

Was it a ruse?

The fact that Rawls did willingly decrypt the phone, and did attempt to enter a passcode for the hard drives, is being called a “ruse” by prosecutors. The government “contended that Mr. Doe’s unlocking of the iPhone 6 and entry of passcodes for the hard drives ‘was a deliberate façade designed to feign compliance with the Court,’” writes Rawls’ defense attorney. How it would have looked different if he honestly did want to cooperate and honestly did forget the passcode, nobody seems to have said.

This isn’t the first time that someone has been jailed for contempt for refusing to provide a passcode for an encrypted drive – several people in the U.K. have been jailed for refusing to provide an encryption passcode — but it may be the first time in the U.S.

It’s also not the first time someone has refused to provide the passcode to decrypt their hard drive; Jeffrey Feldman, who was also accused of having child pornography, refused to provide a passcode, and judges ruled different ways on whether he had to provide it. In that case, however, investigators learned enough details about what was on the drives on their own that they dropped the attempt to have him decrypt them.

Solitary

To make matters worse for Rawls, because he is a police officer – that is, former police officer; he was fired, after 17 years on the force, after he was jailed – he was put into solitary confinement to protect him from the other inmates. So not only has he been in jail seven months; he’s been in solitary confinement for seven months.

“He spends 22 and a half hours of every day completely alone,” writes Andrews. “If someone visits him, they have to remain behind a barrier. Each month, he’s granted one fifteen-minute phone call.”

Meanwhile, government prosecutors have until May 16 to file their brief with the U.S. Court of Appeals for the Third Circuit in response to Rawls’ attorney’s motion to have him released.

On the other hand, Rawls has a long way to go to set any sort of record for longest jail time, without a charge, for contempt. H. Beatty Chadwick was released in 2009 after serving 14 years for contempt; he had said that he lost $2.75 million and could not split it with his former wife. He pointed out that he would have served less time if he had been convicted of third-degree murder.


April 30, 2016  10:57 PM

‘License and Registration. And, Cell Phone.’

Sharon Fisher
government, privacy, Security, smartphone

We’re all accustomed to handing over our driver’s license and registration when we get pulled over. But if New York lawmakers have their way, you’ll also have to hand over your cell phone after an accident, so police can check to see whether you were using it illegally at the time.

While the Supreme Court ruled in June, 2014, that police need a search warrant, or your consent, to gain access to your phone, this bill — New York Senate Bill S6325A — sidesteps this requirement by declaring that by driving in New York, you are automatically giving consent.

Motorists involved in crashes who refuse to surrender their phones to police would have their licenses suspended, and could face a one-year revocation of their license and a $500 fine, according to Mike McAndrew in Syracuse.com.

The bill prohibits police from reviewing the content of a person’s cell phone, such as texts, social media postings, and email messages, McAndrew writes. “It would only allow officers to check if the phone was illegally used while driving.”

New York would be the first state to pass such a law, McAndrew adds, but the New York Times notes that such a law could spread to other states. For example, the Vermont legislature earlier this year considered a bill that would also require drivers to hand over their phones so police could check whether they’d been in use, without an accident being required.

The bill was announced in collaboration with an activist group founded by parents of children involved in crashes caused by drivers distracted by their devices, called Distracted Operators Risk Casualties, or Dorcs. (And no, we’re not making that up.)

But some are skeptical about the idea. “They would get into your phone only for the purpose of finding out whether you were using it at the time of the collision. They would not, of course, look at anything else while they were in there. They totally promise,” snarks Kevin Underhill in Lowering the Bar. “If there’s anything history has taught us, it’s that law enforcement and government can be trusted not to adapt a new technology for questionable uses despite promises to the contrary.”

Underhill points out, too, that the proposed law is pretty broad. “It applies not just to a driver suspected of causing injury or death but to all drivers in any ‘accident or collision involving damage … to property,'” he notes. “Somebody backed into your car at the gas station? Hand over your phone. Oh, you didn’t have it with you right then? Did you have it ‘near the time of such accident or collision’? Hand it over. And I’m not sure how they could determine whether you were using it at any particular time without learning about what you were using it for. For that matter, if the fact of ‘use’ is really all they would learn, how would they know you were using it illegally? I ‘use’ my phone all the time for voice navigation. Is that a problem now? What about hands-free calling?”

“There are so many ways in which somebody could be using the phone in a car that is not a violation of any laws,” Mariko Hirose, a lawyer with the New York Civil Liberties Union, tells Joel Rose of NPR. “This bill is simply providing a way for law enforcement to get around the privacy protections that apply to a cellphone.”

Other civil liberties organizations also have concerns with the bill, such as the Electronic Frontier Foundation (EFF), which believes it’s unconstitutional. “A law that essentially requires you to hand over your phone to a cop in a roadside situation without a warrant is a non-starter,” Lee Tien, a senior staff attorney with the EFF, told Jose Pagliery of CNN. Similarly, Marc Rotenberg, the president of the Electronic Privacy Information Center, called the bill “excessive, unnecessary, and invasive,” Pagliery writes.

Actually, the real target of the law appears to be the police, who apparently do not always attempt to retrieve records from cell phones after accidents. “Law enforcement can subpoena records from the phone company or ask a judge for a warrant to search the phone itself,” Rose writes. “But they don’t always do it because it takes a lot of money and time for cases that can be hard to prove.”

“All you really need to know about New York Senate Bill S6325A is that it would create a law named after a person (this one would be ‘Evan’s Law’), since any law named after a person is almost always a terrible idea,” Underhill writes. “If the law were a good idea, they wouldn’t need to try to generate support by manipulating people’s emotions.”


April 28, 2016  11:13 PM

Yet Another Set of Terrorists Fails to Use Encryption

Sharon Fisher
Encryption, government, privacy, Security

Government officials have been using recent terrorist attacks to try to justify limiting the use of encryption. You may recall, for example, that the Federal Bureau of Investigation (FBI) recently attempted to force Apple to develop a different version of the iPhone operating system to make it easier for the agency to break into encrypted phones thought to be owned by the perpetrators in last December’s San Bernardino attack.

Similarly, states such as California and New York have attempted to put forth bills that would outlaw the sale of cell phones with unbreakable encryption, while agencies such as the FBI have been recommending a mandated “back door” for law enforcement into encrypted phones.

These efforts have been continuing even though there’s been very little indication that terrorists are actually using encryption. For example, the FBI used last fall’s terrorist attacks in Paris to justify their long-held position that governments should mandate a “back door” into encryption, even though there’s no evidence the attackers used encryption — and, in fact, quite a lot of evidence that they didn’t.

One of the most recent incidents was the March bombings in Brussels. Rep. Adam Schiff, the California representative who’s the top-ranking Democrat on the House Intelligence Committee, suggested the same day they occurred that encryption might have been involved, writes Cory Bennett in The Hill.

Since then, law enforcement has been studying the laptop of one of the suicide bombers, Brahim El-Bakraoui, who blew himself up at Brussels airport, writes Lucy Clarke-Billings in Newsweek. “The bomber referred to striking Britain, the La Défense business district in Paris, and the ultra-conservative Catholic organization, Civitas, in a folder titled ‘Target,’ written in English, according to the source,” Clarke-Billings writes. “The laptop was found in the trash by police in Brussels shortly after the suicide bombings on March 22 that killed 32 people at the city’s airport and on a Metro train.”

So let’s get this straight. The data was not only unencrypted, but in English. And on top of that, it was located in a file folder. Labeled TARGET.

That’s right up there with Jurassic Park’s “It’s a Unix system! I know this!”

Security experts who are following the incidents believe there’s no indication that terrorist organizations have some sort of overarching encryption plan. “The clear takeaway from this list is that: 1) ISIS doesn’t use very much encryption, 2) ISIS is inconsistent in their tradecraft,” writes an information security researcher known as “the grugq” in Medium. “There is no sign of evolutionary progress, rather it seems more slapdash and haphazard. People use what they feel like using and whatever is convenient.”

The laptop discovery fits in with what appears to have been the strategy used thus far, writes Quartz. “ISIL’s strategy in last year’s Paris attacks and others was simple: avoid trackable electronic communications like email and messaging apps in favor of in-person meetings and disposable devices, or ‘burner phones,’ that are quickly activated, used briefly, and then dumped,” the organization writes. “Communications from the Paris attacks were reportedly (paywall) largely unencrypted, and investigators have found much of their intelligence through informants, wiretaps, and device-tracking rather than by trying to decipher secret messages. That’s not to say that terrorists won’t use encryption to carry out heinous acts. They will. But encryption is by now a fact of life: your apps, credit cards, web browsers and smartphones run encryption algorithms every day.”

Of course, to some people, the TARGET folder discovery was almost too good to be true. Skeptics on social media have been suggesting that the folder was planted by a group such as the CIA, that the folder was a decoy, and so on.

On the other hand, there doesn’t seem to have been much of a question about confirming who performed the Brussels attacks, especially since they were suicide attacks. If the folder was really planted, wouldn’t it have made more sense for the government agency involved to have used some sort of encrypted – though easily breakable – code? That way, the agency could have used it to justify its attempts to outlaw encryption. If the FBI planted the TARGET folder, it missed an opportunity.


April 24, 2016  10:57 PM

People Still Poke USB Sticks In Things, Researchers Find

Sharon Fisher
Security

It turns out that the reason people keep poking USB sticks into things isn’t necessarily because they’re stupid. It’s because they’re nice.

A recent study by researchers at the University of Illinois discovered that almost half of the people who found USB sticks scattered by the researchers ended up sticking them into their computers. This isn’t new. What was new is the reason – ostensibly, so the people who found them could return them to their owners.

“We dropped nearly 300 USB sticks on the University of Illinois Urbana-Champaign campus and measured who plugged in the drives,” writes Elie Bursztein, one of the researchers, who heads Google’s anti-abuse research team. “We find that users picked up, plugged in, and clicked on files in 48 percent of the drives we dropped,” he writes. “They did so quickly: the first drive was connected in under six minutes.” The full study will be published in May 2016 at the 37th IEEE Security and Privacy Symposium, he adds.

“We dropped five types of drives on the University of Illinois campus: drives labeled ‘exams’ or ‘confidential,’ drives with attached keys, drives with keys & return address label, and generic unlabeled drives,” Bursztein writes. “On each drive, we added files consistent with the drive’s label: private files for the sticks with no label, keys or a return label; business files for the confidential one; and exam files for the exam drives.”

In fact, researchers found that they could make people even more likely to perform this altruistic behavior by personalizing the stick, Bursztein adds. “Attaching physical keys to elicit altruistic behavior was most effective,” he writes. “Keys with an attached return label were the least opened, likely because people had another means to find the owner.”

So what makes all this a problem?

For that matter, the researchers’ USB sticks essentially had malware on them. “All the files were actually HTML files with an embedded image on our server,” Bursztein writes. “This allowed us to detect when a drive was connected and a file opened without executing any unexpected code on the user’s computer. When a user opened the HTML file, we asked them if they wanted to opt out or to answer a survey about why they plugged in the drive in exchange of a gift card. 62 users (~20 percent) agreed to respond.”
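
To make the mechanics concrete, here is a rough sketch of how such phone-home decoy files could be generated, assuming Python; the beacon server URL, drive IDs, and file names are hypothetical, and this illustrates the general technique Bursztein describes rather than the researchers’ actual tooling.

    # Rough sketch of the phone-home decoy files described above. The beacon
    # URL is hypothetical; each drive gets a unique ID so whoever runs the
    # study can tell which dropped stick was opened, and when.
    from pathlib import Path

    BEACON_SERVER = "https://research-beacon.example.edu/ping"  # hypothetical

    def make_decoy(path: Path, drive_id: str, title: str) -> None:
        html = (
            "<html><head><title>{t}</title></head><body>"
            "<h1>{t}</h1>"
            # Fetching this remote image is what reports the open back to the server.
            '<img src="{server}?drive={d}&file={t}" alt="">'
            "</body></html>"
        ).format(t=title, server=BEACON_SERVER, d=drive_id)
        path.write_text(html)

    # One drive's worth of decoy files, e.g. for a stick labeled "exams."
    drive = Path("usb_stick_042")
    drive.mkdir(exist_ok=True)
    for name in ("final_exam.html", "winter_break_photos.html", "resume.html"):
        make_decoy(drive / name, drive_id="042", title=name)

Because no code executes on the victim’s machine, the request for the remote image is the whole “attack,” which is why the researchers could count opens without actually infecting anyone.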

And that’s where the being nice part comes in. “When asked why did they plugged the drive most survey respondents claimed it was for the altruistic purpose of returning the drive to its owner (68 percent),” Bursztein writes. “Only 18 percent said they were motivated by curiosity.”

Of course, that’s what they said. “The self-reported motivation is not consistent with which files were accessed,” Bursztein notes. “For example for the drives with physical keys attached, users clicked on winter break pictures more often than on the resume file, which would have contact information of the owner. Interestingly the same behavior is observed for the drives with a return label, but not for the drives with no marking.”

The other interesting aspect is how quickly this all happened. “Not only do many people plug in USB devices, they connect them quickly,” writes Bursztein. “20 percent of the connected drives were connected within the first hour and 50 percent were connected within 7 hours.” What makes this a problem is that if such sticks did contain a virus, it could spread before anyone could deal with it. “The windows of time available to detect that this attack is occurring is very short,” he warns. In fact, the first report of the presence of “weird USB keys” on the campus only started to surface on Reddit roughly 24 hours after the first wave – which still didn’t keep people from continuing to plug them in, he writes.

What’s particularly interesting is that this behavior is universal. “We found no difference between the demography, security knowledge and education of the users who plugged USB drives and the general population,” Bursztein notes.

Between this and all the other security flaws inherent in USBs – even Captain America, ATMs, and the International Space Station are vulnerable – Bursztein is actually suggesting getting rid of USB sticks altogether. “You can enforce a policy to forbid the use of USB drives,” he writes. “On Windows this can be done by denying users access to the Usbstor.inf file. With the advent of cloud storage and fast internet connections, this policy is not as unreasonable as it was a few years back.”
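
For a sense of what such a blanket block could look like on a single Windows machine, here is a minimal sketch, assuming Python running with administrator rights. It uses a related, widely documented approach – disabling the USBSTOR driver service in the registry – rather than the Usbstor.inf permission change Bursztein mentions; in a managed environment this would normally be pushed through Group Policy or endpoint management rather than a script.

    # Minimal sketch: block USB mass storage on one Windows machine by setting
    # the USBSTOR driver service's Start value to 4 (disabled). Requires
    # administrator rights; setting the value back to 3 (load on demand)
    # re-enables USB storage. Test before rolling anything like this out.
    import winreg

    KEY_PATH = r"SYSTEM\CurrentControlSet\Services\USBSTOR"

    def set_usb_storage(enabled: bool) -> None:
        start_value = 3 if enabled else 4
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                            winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, "Start", 0, winreg.REG_DWORD, start_value)

    if __name__ == "__main__":
        set_usb_storage(False)  # forbid USB sticks on this machine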

What’s going to be interesting is what sort of ramifications this research will have. For example, it sounds like telling people not to be stupid isn’t a very successful strategy, because they don’t think they’re being stupid. “Why being more security savvy is not negatively correlated with being less vulnerable is everyone’s guess,” Bursztein writes. “It raises the question of the effectiveness of security education at preventing breaches.”

Not to mention, what other kinds of things might people be convinced to do because they think they’re being nice?


April 7, 2016  9:55 AM

After Outcry, CIA Drops Plan to Erase Its Email Messages

Sharon Fisher
Email, privacy, Security, Transparency

Despite all the attention paid to Hillary Clinton’s email server, it’s easy to overlook the fact that a number of federal agencies have been looking for ways to delete email messages.

The Central Intelligence Agency (CIA) recently decided that it was dropping a plan it had made in 2014 that would have deleted the email messages of everyone in the agency – other than the top 22 people – within three years of their departure from the agency, “or when no longer needed, whichever is sooner.”

“A representative for the National Archives confirmed to The Hill on Monday that the agency backtracked on its proposal last month, following efforts to reorganize its structure,” writes Julian Hattem.

According to The Hill, the restructuring, announced in October, involved creating a fifth “directorate” at the CIA, Digital Innovation, tasked specifically with cybersecurity issues.

The CIA wasn’t the only agency to come up with a plan to scrub old email messages. The Department of Homeland Security announced a similar plan in November, 2014. That proposal covered more than just email messages; it also included surveillance data, so there was some security rationale for deleting it.

The theory behind the deletion was that any important CIA email messages would have been retained in some other way, such as by being sent to or from one of the 22 senior officials, wrote Hattem in The Hill at the time. In addition, the original request had noted that email messages from other staff who were implementing programs on behalf of senior officials were also intended to be retained.

The CIA plan was swiftly criticized by a number of transparency organizations and Congressional representatives, including the heads of the Senate Intelligence and Judiciary committees. This led to the National Archives temporarily changing its mind. The CIA has now permanently withdrawn the program. Opponents of the CIA’s proposal also pointed out that the organization had already destroyed records in 2005, such as waterboarding videotapes.

So what’s leading to all these agencies wanting to delete information? In the case of DHS, it was supposedly due to storage costs, a justification that some found disingenuous due to the low cost of storage these days. Overall, though, the impetus was a request from the Obama administration for agencies to keep track of just the important stuff.

“The National Archives has been pushing all federal agencies for better management of the avalanche of email they generate daily,” writes David Welna for NPR. “The Obama administration has issued a directive giving those government entities until the end of 2016 to propose policies to winnow out important email, store it electronically, and discard the rest.”

That proposal is called Capstone, a program that helps retain email messages without requiring user input. Based on a 2011 White House directive, Capstone is intended to make it easier to find federal government email messages.

For example, the CIA had previously been preserving email messages by printing them out and filing them, wrote Ali Watkins for HuffPost Politics. “The CIA’s current system involves printing and filing away emails that are deemed important, a determination that is left largely to the discretion of individual agency employees,” she writes. “It is not clear what the timeframe is for how long those printed emails and any remaining electronic archives are supposed to be retained, though it appears there is currently no official requirement.”

What’s surprising is that the National Archives approved the plan in the first place. One would think that the Archives, of all places, would feel strongly about preserving records such as email messages, knowing that, in many cases, the value of a particular message might not be recognized until years later.


March 31, 2016  5:11 PM

Computer Techs Required to Report Child Pornography

Sharon Fisher
law, privacy, Security

You may not know it, but if you’re a computer technician, you may have the same obligation to report child pornography as a doctor, day care facility, or film processor.

Utah just passed the Reporting of Child Pornography law, HB155, which puts this into place. It joins what is said to be 12 other states – including Arkansas, Illinois, Missouri, New Jersey, North Carolina (where there has been at least one such case), Oklahoma, South Carolina, and South Dakota – as well as the province of Manitoba in Canada, in passing such legislation. Oregon and Florida have also at least considered such laws.

Under the Utah law, if computer technicians encounter what they consider to be child pornography in the course of their jobs, they are required to report it to a state or local law enforcement agency, the Cyber Tip Line at the National Center for Missing and Exploited Children, or an employee designated by the employer to receive and forward any such reports to one of the aforementioned agencies, writes the law firm of Fabian Van Cott.

It isn’t clear whether the legislation is also intended to apply to, say, sysadmins or technicians who work on computer storage, but it’s a safe bet. The law also allows technicians to disregard confidentiality agreements with clients, protecting them from being sued by the company or clients as a result of the report, writes the Daily Universe.

The legislation is derived from similar legislation that had been enacted for film processors in a number of states. Back when there was such a thing as Fotomat, film technicians were also obligated to report it when they encountered child pornography in film left for them to develop.

As with film technicians, computer technicians are specifically told that “baby in the bathtub” pictures shouldn’t count. And technicians are told they aren’t obligated to go hunting for child porn on computers under their purview.

On the other hand, under the Utah law, they could themselves have charges filed against them for failing to report. “Computer technicians could face a $1,000 fine or six months in jail if they don’t report child pornography to police,” warns Fox 13. “Proving that the technician saw something illegal, but didn’t report it would fall on the shoulders of the prosecutor.”

Plus, the bill provides “immunity against civil lawsuits for technicians who report in good faith but make a mistake,” writes the Salt Lake Tribune. The combination certainly seems intended to encourage people to err on the side of reporting.

What’s most surprising is how many Utah legislators – 13 out of 75 – actually debated and voted against the bill. Critics called its language “fuzzy” and were concerned about the implications.

“The idea of somebody from Geek Squad or somebody who is helping someone else at home with their computer, if you find something that you find offensive or that you think is pornographic, you must report that? That concerns me,” Rep. Johnny Anderson, R-Taylorsville, who owns several child care facilities, told City Weekly. “I don’t want us going down the road of telling someone that if you think your neighbor is doing something wrong, you are required by law to go to the police about it. I don’t want to immediately compare it to Nazi Germany, but it feels that way.”

Some computer technicians were also worried about the ramifications of the bill. Though several of them pointed out that they already report such findings, they were concerned about the legal penalties should prosecutors decide that they had seen the images.

Lawmakers also expressed concerns about the possibility of false accusations, writes the Deseret News. “Rep. Ed Redd, R-Logan, said he could see people being blackmailed and their reputations sabotaged. ‘Anyone of us in this room could be accused of this,’ he said.”

And indeed, at least one computer technician in Missouri has already attempted to blackmail one such client, instead of reporting the files to the authorities.

This wasn’t the only pornography legislation passed by the Utah state government during its most recent session. The state became the first in the country to name pornography a public health crisis through a bill that was passed unanimously by both houses of the Utah Legislature.

The organization behind the bill, the National Center on Sexual Exploitation – which considers the American Library Association a “pro-pornography organization” — is reportedly writing similar legislation for eight states so far. The intention behind the bill is said to be to encourage the federal government to more decisively enforce its anti-obscenity laws.

“This ought to be seen like a public health crisis, like a war, like an infectious fatal epidemic, like a moral plague on the body politic that is maiming the lives of our citizens,” Elder Jeffrey R. Holland, member of the Quorum of the Twelve Apostles of the Church of Jesus Christ of Latter-day Saints, reportedly told some 2700 attendees at the 14th annual conference of the Utah Coalition Against Pornography. “We do need to see this (pornography) like avian flu, cholera, diphtheria or polio.”

Ironically, despite its reputation, Utah has been considered to be a hotbed of porn use, ranking #1 in the country in 2009 for subscriptions to porn sites, according to one study. While some have criticized the methodology of that statistic, pointing instead to a different one where Utah ranks 40th, that study, too, has its flaws, as it’s limited to only a single site, PornHub.com, and measures “pageviews per capita.” Not to mention, “Mormon porn” is apparently a thing, as are other ways of getting around the porn restriction.

And despite Rep. Steve Eliason, R-Sandy, telling the Salt Lake Tribune that the bill “makes clear that in Utah, we don’t stand for child abuse,” Utah is also ranked highly on another statistic: Actual sexual abuse of minors, according to the U.S. Department of Health and Human Services’ 2013 report. One study last November found Utah first in the country for child sexual abuse, though one organization claimed this was because Utah had “tougher laws than other states and may pursue child abuse more vigorously.”

With child pornography being one of those sure-fire issues that few people would be caught dead opposing, the computer technician legislation is likely to spread to other states. So be aware.

