Yottabytes: Storage and Disaster Recovery


September 29, 2016  12:39 PM

Archivists Dealing With Born-Digital Data

Sharon Fisher
government, history, museum, Storage

When former Texas Governor Rick Perry left office in 2014, the records he left to the State Library and Archives Commission included more than 7 terabytes of data, writes Isabelle Taft in the Texas Tribune. But there was no digital archive to put it in.

“While the governor’s 4,000 cubic feet of paper could be sorted, itemized, boxed and shelved alongside other state records dating back centuries, [state archivist Jelain] Chubb and her staff had no system in place to store thousands of gigabytes of photos, emails and audio and video recordings, much less make them available to the public,” Taft writes. “The Perry collection presented a Texas-sized challenge for a commission that had no capacity to manage the ‘born-digital’ records — those with no paper or analog footprint.”

Consequently, the commission needed to set up the Texas Digital Archive for the material, a project that so far has cost $700,000. The data is stored in the Amazon cloud.

In many ways electronic records are safer and more durable than paper ones—they can easily be backed up, replicated, and sent elsewhere to prevent a Library of Alexandria-type conflagration. Moreover, electronic records are typically easier to gain access to than paper records, because people don’t have to travel to where the records are. That also helps make government more transparent. But that’s all predicated on the data still being readable electronically.

“Ancient hieroglyphics and scrolls have survived centuries, but digital storage is fragile, the files easily swept away or locked up in encryption,” writes Arielle Pardes in Vice.com. “The technology we use to store things today might not be around tomorrow, and many of the platforms we use to store information are owned by private companies, which makes it harder for archival institutions to save them. And how much of what we upload online is worth saving at all?”

There are three main problems with maintaining archives of born-digital material, Pardes writes:

  1. It requires the hardware to read it.
  2. It requires power for the hardware.
  3. It requires software—often proprietary, and sometimes copyrighted—to read it.

This is particularly true as data storage goes to private companies in the cloud—such as Facebook—rather than on software that we own, Pardes warns. “Many of the sites we use that are free, or that you rent space on, like a wedding site, they’re private companies,” she quotes historian Abby Smith Rumsey as saying. “You don’t have ownership of it.”

That was the problem with some of Perry’s data, which dated back to 2000 and in some cases used formats that are no longer around, such as WordPerfect, Adobe PageMaker, VHS tapes, CDs and raw camera files, Taft writes. “Many files had to be reformatted so the public could view them with contemporary software.”

To help in this effort, the Library of Congress has created a list of recommended formats for archiving digital data, and has an ongoing discussion about the sustainability of digital formats.
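One practical starting point such a list suggests is a simple inventory triage. Below is a minimal sketch, in Python, of flagging files whose formats fall outside a preferred list; the format lists here are invented for illustration, not the Library of Congress’s actual recommendations.

```python
# Illustrative triage of a file inventory against a preferred-format list,
# in the spirit of the Library of Congress recommendations mentioned above.
# The extension lists are invented examples, not the LoC's actual lists.
from pathlib import Path

PREFERRED = {".pdf", ".tif", ".csv", ".wav"}   # assumed "safe" archival formats
AT_RISK   = {".wpd", ".pmd"}                   # e.g. WordPerfect, PageMaker

def triage(folder):
    """Split files into preferred, at-risk, and unknown buckets by extension."""
    buckets = {"preferred": [], "at_risk": [], "unknown": []}
    for f in Path(folder).rglob("*"):
        if not f.is_file():
            continue
        ext = f.suffix.lower()
        if ext in PREFERRED:
            buckets["preferred"].append(f.name)
        elif ext in AT_RISK:
            buckets["at_risk"].append(f.name)
        else:
            buckets["unknown"].append(f.name)
    return buckets

# print(triage("perry_collection/"))  # hypothetical archive folder
```

A sweep like this only surfaces candidates for migration; deciding what to convert, and to what, is still the archivist’s call.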

After the Perry project, the Texas data archive staff is now working on a similar project for Texas state agencies. “Arguably the most important function of the digital archive, however, is still under development: the ability to ingest and display the born-digital archives of state agencies,” Taft writes. “Archivists are currently working with three pilot agencies — the Texas Historical Commission, the Office of the Attorney General and the Railroad Commission — to get their electronic records from the late 1990s and early 2000s on the digital archive.”

Unfortunately, the group is running into the same problem as with Perry’s data. “Texas requires state agencies to preserve records if the state archives can’t yet take them,” Taft writes. “But floppy disks loaded with files can decay until they’re unreadable. Emails are often deleted to free up expensive storage space. And some formats are already obsolete.”

September 25, 2016  2:02 PM

6 Dell-EMC Shoes Yet to Drop

Sharon Fisher
Dell, EMC, Storage

EMC is dead. Long live Dell Technologies. With Dell’s acquisition of EMC finalized earlier this month, it may seem like the end to a very long, long, long story. But although the merger is complete, there is still a good deal of potential Dell-EMC fallout to come.

Layoffs. Reports are that 2,000 to 3,000 people are slated to be laid off from the combined companies. A lot of them are going to be administrative people – HR, finance, all the various business functions that are more or less duplicated by the two companies. If developers start getting laid off as well, however, that will be a good indication of what users can expect from Dell Technologies products going forward. And what methodology is Dell going to use? If there are two competing products, will the Dell product naturally survive and the EMC product get the ax? Vice versa? Will they just pick the bigger product? A look at who gets laid off will reveal a lot about the merged company’s philosophy.

Executives. Yes, all of this was an elaborate way for EMC CEO Joe Tucci to get out of formally naming a successor. But what will happen to all those former EMC executives? EMC was known for having a deep, if unexciting, bench of businesslike potential managers. Are they going to stick around? Former VMware and Pivotal CEO Paul Maritz has already retired. Will people like VMware CEO Pat Gelsinger join him? Are they going to get buyouts? Will they start spinning off companies? It will be interesting to see.

Ancillary pieces. Individually, EMC and Dell were so big that there were pieces of them that people completely forgot about. (For instance, did you remember EMC bought RSA Security in 2006? I had it in my head that they’d been owned by Symantec. Nope.) As the two companies merge, there are going to be all sorts of things they find in the back of closets as they set up housekeeping together. Dell has already offloaded its software group. (Dell had a software group?) EMC has already offloaded poor, sad Documentum to OpenText; it remains to be seen to what degree OpenText continues to support it, or simply expects all the Documentum users to migrate to OpenText’s own software. And what will happen to RSA? Data Domain? Virtustream? Pivotal Labs?

VMware. Of course, the biggest ancillary piece – and the source of a lot of EMC’s profit – is VMware. (There was even talk, though it was just a rumor, that VMware might acquire EMC; presumably the merged company would have been called Ouroboros.) The companies insist they don’t plan to sell it, but some financial analysts say the acquisition might not pencil out unless they do. What sort of support will a software company get from its parent hardware company? And how will VMware’s reputation for independence among users change when it is owned by a server company?

Culture. Austin and Boston may rhyme, and may both be tech hubs, but their cultures are pretty different. How well barbecue is going to mix with baked beans, only time will tell. One area they do have in common? Both companies have been outspoken in their support of equal rights for gay people, such as opposing the so-called religious freedom bills in Indiana and Georgia.

And finally. What will Elliott Management Co., having now withdrawn its dripping scimitar from the still-writhing corpse of EMC, be looking at as its next activist investment target? As it turns out, speaking of Symantec, it looks like that company is on the radar: Elliott more than doubled its holding in the most recent quarter and now owns on the order of 3 percent of the company.


September 19, 2016  11:18 PM

What About Drone Storage?

Sharon Fisher
drone, Storage

People don’t talk about it much yet, but the data that the increasing number of drones are collecting has the potential to cause a lot of problems.

“In 2013, only 128,000 units in this product category were estimated to have sold in the United States. Many of these were in the low-cost ‘toy’ category, with poor flight control characteristics, limited range and relatively poor image quality,” writes Wes Brewer, VP and GM of Lexar, a global brand of Micron Technology, for the SD Association. “Now, the CTA has forecast that more than 2.8 million units will be sold in the United States in 2016, while Business Intelligence pegged the worldwide market at 7.1 million units.”

While drones started out having built-in storage, increasingly they are being built to include microSD cards, Brewer writes, noting that a GoPro Omni 4K can capture as much as 162 gigabytes of data per hour. Aside from the drones themselves, the software and hardware they need to run could add up to a $127 billion annual market, according to PricewaterhouseCoopers.
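Some back-of-envelope arithmetic puts that capture rate in perspective. The sketch below uses the 162 GB/hour figure quoted above; the card capacities are assumptions for illustration, not from Brewer’s article.

```python
# Illustrative arithmetic based on the 162 GB/hour capture rate quoted above.
# The card capacities are assumed for the example, not taken from the article.
GB_PER_HOUR = 162

for card_gb in (64, 128, 256, 512):
    minutes = card_gb / GB_PER_HOUR * 60
    print(f"{card_gb:3d} GB card: about {minutes:.0f} minutes of capture")
# Even a 512 GB card holds only ~190 minutes at that rate.
```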
All that data has the potential for two big problems.

Size. Granted, not all drones are created equal. The quadcopter you got for Christmas that uses a microSD card isn’t the same as a military drone, such as the Wide Area Reconnaissance Hyperspectral Overhead Real-Time Surveillance Experiment (WARHORSE), where a single frame could amount to 20 gigabytes, according to William Arkin’s book Unmanned. And a single government drone flight can amount to 400 terabytes of data, writes Yasmin Tadjeh in National Defense.

Nonetheless, even if we’re just talking about non-governmental drones, we’re still talking about the potential for an awful lot of data. Where’s it all going to go? And how? Are people going to upload it all to the cloud? How’s all that storage going to be paid for?

It’s enough to make all the data that the National Security Agency is reportedly storing sound like a drop in the bucket. This is especially true now that the Federal Aviation Administration has granted permission to small businesses ranging from farmers to real estate agents to send up drones weighing less than 55 pounds.

“Under the new rules, operators would register their drones online, pass an aviation knowledge exam for drone pilots at an FAA-approved testing center and then they’re good to go,” writes Joan Lowy for the Associated Press. “That’s a big change since operators currently have to have a manned aircraft pilot’s license.” Moreover, the FAA really doesn’t have much of an enforcement mechanism, she adds.

Interestingly, the late, great EMC, now a piece of Dell, has also been looking at this market, including sponsoring drone-racing competitions.

Security. Once you’ve collected all that data, and put it somewhere, you need to protect it. “How are you going to handle that data, which can be massive, especially in the case of video, LiDAR surveys, or other sensor files?” writes drone software specialist VirtualAirBoss. “And, since the data may reveal sensitive information about key property, progress, or infrastructure, how will you ensure data security and integrity?”

Keep in mind that, in the mid-2000s, there was a guy in England named Tom Berge. And Tom got really good with Google Earth. So good, in fact, that he used it to look for lead roofs on historic buildings in southern London, and then he’d scale the building, peel off the lead, and sell it. Altogether, he was thought to have hit 30 buildings, earning more than 100,000 pounds.

So if a guy could do that with Google Earth, how long is it going to take burglars to learn how to do this with drones? And as it happens, not long at all. As long ago as 2014, stories started being reported about burglars who were using drones to pinpoint targets. Even if you’re collecting the data only for good, the repository is going to be pretty tempting for some people – assuming they don’t just nab your drone out of the sky to grab the data on it, or replace it with something else.

If all of this is making you think, “Didn’t we just go through all this with police body cams?” it should. The situations are very similar – except worse. “In many ways, the data challenge presented by UAVs is no different than that created by body cameras or smartphones,” writes Hank Hogan for Gov Tech Works. “All can generate digital data that must be managed. However, one key difference is that as a law enforcement aircraft flies into place, it will pass over – and image – other structures and possibly people. It is tempting for UAV managers to say that unintended imaging is no problem; can’t they simply discard everything that is not evidentiary? But that raises critical legal questions: What if the accused wants to see every inch of the video? Should law enforcement then retain everything? Given that the legal system can take years to run its course, is the agency in question responsible for taking, compressing, housing and managing all of the data for that entire time?”


August 31, 2016  10:58 PM

FBI Deliberately Uses Old FOIA Software

Sharon Fisher
E-discovery, FOIA, government

We’ve all experienced the sort of person who performs something badly on purpose in order to get out of doing it. Whether it’s a secretary who makes coffee badly because she doesn’t think it should be her job to make coffee, or a husband who does household chores or child-raising badly so he doesn’t get asked again, we all know of examples.

Now a Massachusetts Institute of Technology (MIT) graduate student is claiming that the Federal Bureau of Investigation (FBI) is deliberately using old, crappy software as an excuse for getting out of responding thoroughly to Freedom of Information Act (FOIA) requests.

“Freedom of Information Act (Foia) researcher Ryan Shapiro alleges ‘failure by design’ in protocols at the Department of Justice (DoJ) for responding to public requests,” writes Sam Thielman in The Guardian. “The Foia law states that agencies must ‘make reasonable efforts to search for the records in electronic form or format.’”

Instead of using a $425 million web-based package called Sentinel that came out in 2012, which can search the full text of FBI records, the FBI uses the Automated Case System (ACS), a 21-year-old mainframe-based software package that can search only keywords. In other words, if an agent didn’t happen to tag a record with the right keywords, it wouldn’t come up in a search, even if the words appeared in the text of the report itself, Thielman writes.
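The gap is easy to see in miniature. The sketch below is a hypothetical illustration of keyword-tag search versus full-text search, not a representation of the FBI’s actual systems: a record never tagged with a term is invisible to the keyword search, no matter what its body says.

```python
# Hypothetical illustration of the gap described above: ACS-style keyword
# search vs. Sentinel-style full-text search. Records and tags are invented.
records = [
    {"id": 1, "keywords": {"surveillance"}, "text": "Surveillance of subject near the airport..."},
    {"id": 2, "keywords": {"informant"},    "text": "Report mentions airport surveillance logs."},
]

def keyword_search(term):
    """Matches only if an agent tagged the record with the term."""
    return [r["id"] for r in records if term in r["keywords"]]

def full_text_search(term):
    """Matches the term anywhere in the record body."""
    return [r["id"] for r in records if term in r["text"].lower()]

print(keyword_search("surveillance"))    # [1]    -- record 2 is invisible
print(full_text_search("surveillance"))  # [1, 2] -- both records found
```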

Even when nothing comes up using ACS, the FBI reportedly refuses to use Sentinel, claiming that it would be duplicative and wasteful, Shapiro told The Guardian.

‘FOIA Superhero’

Shapiro himself is quite a piece of work. He is a self-styled FOIA expert, called by some a FOIA Superhero, filing an average of two FOIA requests a day. These requests range from records about the Central Intelligence Agency’s role in the arrest of Nelson Mandela in 1962 to whether the American Egg Board improperly used its powers against a particular kind of mayonnaise because it didn’t use eggs in it.

Interestingly, Shapiro’s most recent FOIA efforts are not just about getting various documents themselves, but to help him understand how the FOIA process works at the FBI. “I design each request not only to hopefully produce the requested records, but also to further illuminate the agency’s FOIA operations,” he explains to George LeVines in Muckrock, a nonprofit organization devoted to the FOIA. “Though it of course should not need to be this way, developing as intimate a familiarity as possible with an agency’s internal filing systems, databases, and FOIA practices is frequently the key to success.”

For example, Shapiro has been including waivers from individuals with his FOIA requests in order to circumvent FBI restrictions about including those individuals’ personal information in its responses.

‘Mosaic’

In response, the FBI is resorting to some interesting tactics, such as claiming that while Shapiro’s individual requests are all legitimate and unclassified, added up together they result in a threat to national security because they make it too easy to see how the FBI operates.

“Invoking a legal strategy that had its heyday during the Bush administration, the FBI claims that Shapiro’s multitudinous requests, taken together, constitute a ‘mosaic’ of information whose release could ‘significantly and irreparably damage national security’ and would have ‘significant deleterious effects’ on the bureau’s ‘ongoing efforts to investigate and combat domestic terrorism,’” writes Will Potter in Mother Jones. “So-called mosaic theory has been used in the past to stop the release of specific documents, but it has never been applied so broadly.”

Moreover, Shapiro can’t find out why, because the information was provided to the judge in the form of an ex parte, in camera letter which was secret. “They argued the threat to national security is so severe that they can’t even tell us why,” he told Amy Goodman at Democracy Now!

FBI Won’t Tell Him Why

Similarly, when he filed a FOIA request about the handling of his FOIA requests, the FBI refused, until a judge ruled in February that it couldn’t do that.

“Shapiro and his co-plaintiffs asked for more information about the process by which they had been so often refused,” writes Sam Thielman in The Guardian. “And those requests for clarifying information were categorically denied on the grounds that any information about the FBI’s reasons for denying previous Foia requests were by their very nature secret.”

“The FBI has long been notorious for the frequent poverty of the document searches it performs in response to FOIA requests,” writes The Sparrow Project, an organization devoted to social change. “The consistent inadequacy of the FBI’s FOIA searches has even led to the Bureau receiving an ‘award’ for ‘worst Freedom of Information Act performance by a federal agency’ from a leading open government group, as well as a declaration from the Associated Press that, ‘If information were a river, the FBI would be a dam.’”

Perhaps someone could file a FOIA request on the FBI’s use of its new software.


August 30, 2016  10:29 PM

FBI Discovers Email on Clinton Server

Sharon Fisher
Clinton, ediscovery, Email

Many of us probably wish our IT departments were as diligent about finding lost email messages as the Federal Bureau of Investigation (FBI) has been about former Secretary of State Hillary Clinton’s.

To make things clear, this blog post isn’t going to discuss the topic of the email messages. This isn’t going to discuss whether it would be perjury. This isn’t going to discuss the possible effect on her presidential campaign. This is a blog about storage and e-discovery, and, like other blog posts on the subject, is sticking to those nerdy topics.

Reportedly, the FBI has searched through 14,900 email messages that Clinton or her staff were said to have deleted on the grounds that they were personal. In the process, the FBI believes it has found 30 email messages that might have something to do with the death of four Americans at a U.S. outpost in Benghazi, Libya.

The FBI has not yet released the email messages, because the State Department needs to look at them to determine whether there are actually any new messages, as well as whether any information in the messages needs to be redacted for national security reasons or because it reveals personally identifiable information. In addition, it isn’t clear how specific they are to the Benghazi incident, nor to what extent they involve Secretary Clinton.

The really interesting part, of course, is “Just how did the FBI find all the deleted email messages?” Unfortunately, the FBI doesn’t seem to be revealing that information, and the primary emphasis in court has been on just how and when the messages will be released, rather than on the nerdy details we all love.

And that could take a while. According to a transcript of a late August hearing, the 14,900 email messages were just what was on one disk. There are six more. Disk 5 alone is said to contain more than 55,000 documents. Emphasis was placed on disk one because it contained “e-mails and attachments that were sent directly to or from Former Secretary Clinton, or e-mails that were sent to or from the former secretary at a point in time in an e-mail chain, and were not included in the materials provided to the State Department by Former Secretary Clinton in December 2014.”

That section of the transcript is actually kind of funny to read, as the lawyers and the judge argue over how much time it should take to review the messages, and whether releasing messages from just the first disk would actually be faster than reviewing all the disks. The upshot of the discussion was that the State Department will review just the first disk, and everyone will come back to court on September 22 to find out how that’s going.

But in a separate legal case, a different judge ruled that the State Department actually had to come back with a report on the email messages by September 13.

The people filing the lawsuit to get the email messages released are understandably concerned that the process will take longer than Election Day, and there are some who believe that the State Department is delaying the process on purpose for just that reason.

One thing is clear: This should be a lesson to all of us on the proper procedures regarding setting up corporate email servers, the problem of mixing personal and business email correspondence, the value of setting up retention policies for discarding email messages on a regular basis, and how much of a pain electronic discovery is when these procedures aren’t followed. Reportedly, this investigation has cost more than $20 million thus far, and it’s not done yet.
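For what a retention policy amounts to in practice, here is a minimal sketch of a scheduled retention sweep; the mailbox contents and seven-year window are invented for illustration, since real policies vary by record type and jurisdiction.

```python
# Minimal sketch of an email retention sweep, as the paragraph above suggests.
# The mailbox structure and 7-year cutoff are invented for illustration;
# real policies vary by record type and jurisdiction.
from datetime import datetime, timedelta

RETENTION = timedelta(days=7 * 365)  # assumed 7-year retention window

mailbox = [
    {"subject": "Q3 budget", "received": datetime(2008, 5, 1)},
    {"subject": "Team lunch", "received": datetime(2016, 6, 12)},
]

cutoff = datetime(2016, 8, 30) - RETENTION
kept = [m for m in mailbox if m["received"] >= cutoff]
expired = [m for m in mailbox if m["received"] < cutoff]
print(f"keep {len(kept)}, expire {len(expired)}")  # keep 1, expire 1
```

The point isn’t the code; it’s that a sweep like this runs on a schedule, with the policy written down, instead of leaving deletion decisions to individual users.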


August 25, 2016  11:25 PM

Apollo Software Renews Interest in Rope Memory

Sharon Fisher
government, history, Storage

The recent publication of the original Apollo 11 lunar module code on GitHub is bringing new interest to a really arcane form of storage: rope memory.

Let’s get the Moore’s Law, kids-these-days aspect of this over with right away: The Apollo Guidance Computer (AGC) had just “36,864 sixteen-bit words of core rope memory (placed within one cubic foot) and 4,096 words of magnetic core memory (within two cubic feet),” or the equivalent of 72K of the sort of memory we have today, according to programmer (and stoner) Don Eyles in the 2008-2009 documentary Moon Machines. (An MP3 player at the time the documentary was made had 50,000 times more storage space, the documentary notes.) This is inside a 70-pound box that was the state of the art – in the mid-1960s. Heck, it even used integrated circuits, the first device to do so.

But it’s the type of read-only memory that’s particularly interesting. “Fixed memory consisted of core rope, a high-density read-only memory using cores of similar material composition as the erasable memory but of completely different design,” writes NASA in Computers in Spaceflight: The NASA Experience. “MIT adopted the use of core rope in the original Mars probe computer design and carried it over to the Apollo. Chief advantage of the core rope was that it could put more information in less space, with the attendant disadvantages that it was difficult to manufacture and the data stored in it were unchangeable once it left the factory.”

LOL Memory

Yeah, so let’s talk about that manufacturing. It was literally woven by little old ladies on looms (and, consequently, was sometimes known as Little Old Lady, or LOL, memory). “Producing these required skills analogous to textile work, which had long been part of the New England labor force,” explains the Dibner Institute for the History of Science and Technology at the California Institute of Technology, on its History of Recent Science & Technology website.

“Women in the factory would literally weave the software into this core rope memory,” explains historian David Mindell, an MIT professor of the history of engineering and manufacturing, in Moon Machines. “The rope is made up of rings and wires,” notes Margaret Hamilton, director of software engineering for the project, in the documentary. “If the wire goes through the core, it represents a 1, and around the core, it represents a zero.” A single program could take several months to weave, and an error would be a “nightmare to correct,” the documentary points out.
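Hamilton’s description translates directly into a lookup table: the weave itself is the program. Here is a toy model in Python, with an invented word, of how a woven word reads back.

```python
# Toy model of core rope readout, following Hamilton's description above:
# a wire threaded THROUGH a core reads as 1, a wire routed AROUND it as 0.
# The weave is fixed at manufacture, so the "memory" is just this table.
THROUGH, AROUND = 1, 0

# One 16-bit word woven onto 16 cores (invented example data).
woven_word = [THROUGH, AROUND, THROUGH, THROUGH, AROUND, AROUND, THROUGH, AROUND,
              THROUGH, THROUGH, AROUND, THROUGH, AROUND, AROUND, THROUGH, THROUGH]

def read_word(weave):
    """Assemble the bits picked up by the sense wire into an integer."""
    value = 0
    for bit in weave:
        value = (value << 1) | bit
    return value

print(f"{read_word(woven_word):016b}")  # the word exactly as it was woven
```

Which also makes the “nightmare to correct” point concrete: changing a single bit means physically rerouting a wire.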

It could only have been more arcane if the little old ladies in question had woven it using their own hair or something.

Cutting-Edge Rope

That said, rope memory was quite the thing in its day, because it actually stored much more data than competing methods. “In the AGC, up to 64 wires could be passed through a single core,” writes Ralf-Udo Hartmann on his personal website. “A relatively large (by the standards of the time) amount of data could be stored in a small installed volume of core rope memory (72 kilobytes per cubic foot; roughly 2.5 megabytes per cubic meter); about 18-fold the amount of data per volume compared to standard read-write core memory.”
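Those density figures check out with a quick unit conversion; the arithmetic below simply restates Hartmann’s numbers.

```python
# Back-of-envelope check of the density figures quoted above
# (illustrative arithmetic only; both figures come from Hartmann).
KB_PER_CUBIC_FOOT = 72
CUBIC_FEET_PER_CUBIC_METER = 3.28084 ** 3   # ~35.31 ft^3 per m^3

kb_per_m3 = KB_PER_CUBIC_FOOT * CUBIC_FEET_PER_CUBIC_METER
print(f"{kb_per_m3:.0f} KB/m^3 = {kb_per_m3 / 1024:.1f} MB/m^3")
# ~2543 KB/m^3, i.e. roughly 2.5 MB per cubic meter, as quoted.
```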

Moreover, core rope memory was more robust than other kinds of memory, even some we use today, because it didn’t require power to operate and could survive in rugged environments like space and splashing down in the ocean. “Core memory is non-volatile storage – it can retain its contents indefinitely without power,” Hartmann writes. “It is also relatively unaffected by EMP and radiation. These were important advantages for some applications like first generation industrial programmable controllers, military installations and vehicles like fighter aircraft, as well as spacecraft, and led to core being used for a number of years after availability of semiconductor MOS memory (MOSFET). For example, the Space Shuttle flight computers initially used core memory, which preserved the contents of memory even through the Challenger’s explosion and subsequent plunge into the sea in 1986.”

Core rope memory ended up being used commercially in products from companies such as Univac and Burroughs.

On the other hand, there are those who claim that the core rope method is so arcane it couldn’t actually work, and use that as proof that either the moon landing itself was faked or the technology was faked to keep it out of the hands of competing governments.

Maybe they just didn’t trust their little old ladies.


August 15, 2016  5:43 PM

Airlines’ Disaster Recovery Criticized After Outages

Sharon Fisher
Disaster Recovery

If you sell disaster recovery products and solutions, you have a big batch of potential new customers: Major U.S. airlines. Four airlines — Delta, Southwest, United, and American, which just happen to control 80 percent of all domestic travel in the U.S. — have suffered major outages in recent months that have been attributed to a lack of a proper disaster recovery plan.

In August, Delta customers suffered major delays because of…a power failure? “A power outage in Atlanta, which began at approximately 2:30 a.m. ET, has impacted Delta computer systems and operations worldwide, resulting in flight delays,” the company reportedly wrote on its blog at the time. “Following the power loss, some critical systems and network equipment didn’t switch over to Delta’s backup systems,” reported Kim Nash in the Wall Street Journal.

“How could a company as technologically savvy and mature with its business processes as Delta not have a working disaster recovery plan?” writes M.J. Shoer in Seacoast Online. “It’s a fair question and we are still waiting to learn the details. I can’t for a minute believe Delta does not have a disaster recovery plan to deal with an event like this, but it failed. That begs the question as to when it was last tested and how often this plan is reviewed, revised and retested.”

In July, “Southwest Airlines canceled more than 2,000 flights over several days after an outage that it blamed on a faulty network router,” write Alastair Jamieson, Shamar Walters, Kurt Chirbas and Gabe Gutierrez for NBC News. While the company had redundancies built into its equipment, they didn’t work, according to CEO Gary Kelly.  “A back-up system also failed, extending the outage. Ultimately the company had to replace the router and reboot 400 servers,” the company’s chief operating officer told Conor Shine of the Dallas Morning News.

Delta and Southwest aren’t alone. “Computer network outages have affected nearly all the major carriers in recent years,” writes David Koenig for the Associated Press. “After it combined IT systems with merger partner Continental, United suffered shutdowns on several days, most recently in 2015. American also experienced breakdowns in 2015, including technology problems that briefly stopped flights at its big hub airports in Dallas, Chicago and Miami.”

Repercussions

Altogether, thousands of flights were delayed, resulting in costs of millions of dollars. Southwest’s Kelly could even lose his job over the incident, after criticism from a number of employee unions.

“The meltdown highlights the vulnerability in Delta’s computer system, and raises questions about whether a recent wave of four U.S. airline mergers that created four large carriers controlling 85 percent of domestic capacity has built companies too large and too reliant on IT systems that date from the 1990s,” writes Susan Carey in the Wall Street Journal.

Moreover, it points to how vulnerable airlines are to terrorist attacks, writes Hugo Martin in the Los Angeles Times. In fact, at first, some even attributed Delta’s outage to a terrorist attack – maybe because it just seemed so unbelievable that an airline could be brought down by a power failure. But it’s happening.

So What’s Causing It?

“There have been several reservation system outages that have hit worldwide airline ops with distressing regularity over the past few years,” risk management specialist Robert Charette told Willie Jones of IEEE Spectrum. “Southwest Airlines had one just a few weeks ago. (It had another big one June of 2013 and another in October 2015.) What you’ll see in reviewing them is recurring problems with infrastructure (i.e., power, networks, routers, servers, etc.) that seem to keep surprising the airlines. In every case I can recall, there were backup systems in place, but they failed—another recurring theme. The Southwest CEO claimed that the last outage—caused by a router—was equivalent to a 1000-year flood. Not only was that a comical overstatement, but it also shows the thinking that is probably [leading to the airlines] skimping on contingency management preparations.”

Some have attributed the failures to too much consolidation in the airline industry and too much emphasis on efficiency. One study, Delivering the Benefits? Efficiencies and Airline Mergers, found that not only did mergers not save operational money, but often cost much more than expected – particularly in terms of integrating IT. Some have attributed Delta’s failure, in particular, to the retiring of its previous CIO, Theresa Wise – who merged the Delta and Northwest IT teams — in January.

“Is this a sign that airlines aren’t investing enough money in their IT infrastructure?” writes Adam Levine-Weinberg in The Motley Fool.

Airlines Aren’t Alone

On the other hand, airlines aren’t alone in having insufficient disaster recovery protection – just the most conspicuous. A survey – admittedly by a disaster recovery vendor – found that companies in general aren’t testing their disaster recovery systems enough. “When asked about how frequently they tested their DR environment, more than half of the respondents indicated that they tested less than once a year; even worse, a third said that they tested infrequently or never,” the survey found.

Airlines’ IT systems’ complexity makes it worse. “Since they are needed on a 24/7 basis, 365 days a year, it’s hard to fully test every potential scenario that could cause problems,” Levine-Weinberg  writes. “As a result, it may be impossible to fully eliminate large-scale IT outages across the airline industry.”
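Even so, part of that testing gap is addressable with automation. Below is a minimal sketch of a drill that verifies a standby endpoint actually answers while the primary is still healthy; the hostnames are hypothetical, and a real drill would exercise actual failover, not just reachability.

```python
# Minimal sketch of an automated failover drill, per the testing gap above.
# The hostnames are hypothetical; a real drill would fail traffic over to
# the standby and verify service behavior, not just TCP reachability.
import socket

PRIMARY = ("primary.example.internal", 443)   # assumed hostnames
STANDBY = ("standby.example.internal", 443)

def reachable(host, port, timeout=3):
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# The point of a drill: confirm the standby works *while* the primary is fine,
# not discover during an outage that it never did.
print("primary up:", reachable(*PRIMARY))
print("standby up:", reachable(*STANDBY))
```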

The biggest problem with this failure in disaster recovery? The computer networks and systems can be repaired. Disaster recovery plans can be created for the next time. But repairing customers’ trust may not be so easy.


July 31, 2016  4:00 PM

Eek! Laptops Full of ISIS Porn!

Sharon Fisher
Encryption, government, privacy, Security

You could call it Porn Wars. The U.S. conflict with the Islamic State of Iraq and Syria (ISIS) is being fought not only on the battlefield – to the extent there even is a battlefield with the guerrilla-focused organization – but also psychologically, using pornography.

For example, the activist hacker group Anonymous reportedly hacked ISIS Twitter feeds, replacing terrorist-oriented content with rainbows and links to gay porn. “Messages posted to the compromised accounts include ‘I’m gay and proud’ and ‘Out and proud,’” writes Anthony Cuthbertson for Newsweek. “A link to a gay porn site is included in some of the hacked accounts, although no explicit images have been posted in respect to Islam.”

Kids! Animals!

In addition, American military leadership is claiming that – far from having encrypted data – seized ISIS laptops are not only unencrypted, but 80 percent full of porn, according to Lieutenant General Michael Flynn, an ex-chief of the Defense Intelligence Agency. “We looked a ruthless enemy in the eye — women and children, girls and boys, raped and exploited, the beheadings stored on a laptop next to pornography,” he reportedly writes in his book, The Field of Fight: How We Can Win the Global War Against Radical Islam and Its Allies, excerpted in the German newspaper Bild. “These sick, psychopathic enemies were not only unimaginably hideous, but also treacherous and torn.”

“Some of it was really bad, and it was all over the map. Some of it was kids, animals,” a former intelligence analyst told ABC News, confirming the general’s claim. “I think it’s indicative of their hypocrisy of what they’ve said they believe in — their perverted version of their religion.”

This claim is particularly embarrassing in the case of ISIS, a religiously oriented organization that is ostensibly against porn, particularly the pedophilia- and bestiality-oriented sort the laptops reportedly contained. Assuming it’s true that the laptops are “full of porn,” this could be an attempt to make ISIS look bad to Americans, to other Muslims, or to both.

“The Islamic extremist group preaches a strict moral code and its caliphate operates under Shira[sic] Law,” writes Meg Wagner for the New York Daily News. “Drinking alcohol is a crime, women are forced to stay inside their homes unless they’re chaperoned by a male relative and anyone considered a heretic can be sold as a slave.” Osama bin Laden’s hideout was also said to be stocked with porn videos.

That said, some experts disagree with Flynn’s contention that pornography could be used to incite violence. “Porn has been the scapegoat for cheating on a spouse, the national divorce rate and even the 2014 University of California, Santa Barbara shootings, and now Flynn is correlating porn consumption with rape — these accusations are unsound,” writes Melanie Ehrenkranz in Tech.Mic. “Despite what anti-porn advocates argue, some experts say that watching violent pornography might actually help people fight their violent sexual urges.”

The timing of these revelations could also be related to the rumor, bruited about scant days after the publication of his anti-Islam book, that Flynn, who was reportedly forced out of his position in 2014 due to clashes with leadership, was being considered as a possible Vice Presidential nominee by Republican Presidential candidate Donald Trump.

Is it Propaganda?

Moreover, the oodles of porn claim also fits in with the classic “dehumanizing the enemy” propaganda efforts of previous wars. “Tales of atrocities also can dehumanize, as readers of William Randolph Hearst’s newspapers learned when they got whipped up for the Spanish-American war with fake stories and sketches of Spanish atrocities that probably never happened,” wrote Michael James for ABC News in 2003. “Arguably, it set a pattern for phony or embellished American wartime propaganda that would last at least through the Gulf War.”

In what seems prescient now, James Forsher — a film historian and documentary filmmaker who has studied propaganda films, and an assistant professor of mass communications at California State University — predicted, “If things turn nasty [with Iraq], God knows what’s going to happen to them,” James writes. “It worries me. It worries me for the country and for Americans who have Middle Eastern ancestry.”

“I think the demonization of Islam and the Arab world is identical to what happened 100 years ago,” Yale University history professor Jay Winter told James. “The Arab is now a stock figure, a caricature, a symbol of fanaticism, of infinite cruelty and no regard for human rights.”


July 29, 2016  12:31 AM

EU, U.S. Agree On User Data Privacy

Sharon Fisher
government, privacy, Security

As you may recall, last fall the European Union got concerned about its citizens’ data privacy when that data is sent to U.S. companies, because of the surveillance practices revealed by Edward Snowden. The European Court of Justice gave the EU and the U.S. until the end of January to come up with a better way to transfer data between themselves.

The good news is, they did. But the bad news is, they got it settled just in time for Brexit – the UK’s exit from the EU – to screw it all up again.

“Privacy Shield”

So here’s the good news. The European Commission, which governs the EU, announced on July 12 that it had adopted the EU-U.S. Privacy Shield, first unveiled on February 2, which replaces the previous “Safe Harbor” method that was no longer considered adequate. Further modifications were made after one European committee found the initial version also inadequate.

While the whole thing amounts to 44 pages of legalese (and another 104 pages of annexes), the Privacy Shield boils down to several new components:

  • The U.S. Department of Commerce will conduct regular updates and reviews of participating companies, to make sure they’re following the rules.
  • The U.S. promises that it won’t perform indiscriminate mass surveillance on the data it gets from EU citizens – and citizens have a method of redress if they think this isn’t being followed.
  • The European Commission and the U.S. Department of Commerce will review the program every year to make sure it’s doing what it’s supposed to.

Without the agreement, more than 4,000 European and U.S. companies wouldn’t have been able to exchange data about each other’s citizens as easily, which could make commerce more difficult. That’s currently worth up to $260 billion, writes Mark Scott in the New York Times.

The next step is for companies to sign up for the self-certification program, which starts August 1, write Maria Camila Tobon and Alfred J. Saikali of the law firm Shook Hardy & Bacon LLP on the legal website Lexology. Google and Microsoft said they would be signing up for the new program; Facebook said it was still thinking about it. (The other, more-cumbersome alternatives to Safe Harbor are still around.) If companies decide to join within two months, they will then have nine months to actually comply; companies that decide after the two months will have to comply right away. Violators can be fined up to 20 million euros.

Meddlesome Busybody

All that stipulated, the user data privacy legal case that spawned this activity is still going on. Austrian privacy activist Max Schrems sued Facebook to determine whether the personal data it transfers back to the U.S. is properly protected from U.S. government surveillance, which is what started the whole Safe Harbor issue. In fact, as the Privacy Shield was getting settled, an Irish court ruled that the U.S. could join Facebook’s case, since the precedent was so significant, writes Padraic Halpin for Reuters.

That ruling was important because it means that the U.S., as well as organizations such as the Electronic Privacy Information Centre, the trade group Business Software Alliance, and the European alliance of Internet companies Digital Europe, can present evidence as an amicus curiae in the case because they have a bona fide interest and are not just a “meddlesome busybody,” The Hill’s Joe Uchill quotes the Irish High Court as saying. (In contrast, the American Civil Liberties Union and the Electronic Frontier Foundation were denied this privilege, he adds.) How long that lawsuit is going to take, nobody is speculating.

And What About Brexit?

The UK may be facing issues similar to the U.S.’s with its departure from the EU, depending on how the negotiations go that will determine whether it is still considered part of the European Economic Area, writes Claus Farber in the National Law Review. That process could take two years, he writes. On the other hand, some see the UK’s departure from the EU as an advantage in that area.

In addition, Canada is concerned that the EU will find its data transfer regulations inadequate as well.

Moreover, some people feel that even the new user data privacy standards still don’t go far enough. “Legal challenges are already being prepared, and the European Court of Justice — the same court that overturned the previous trans-Atlantic data transfer deal — is likely to review the Privacy Shield to see if it meets European standards,” Scott writes. So in a year or two, we may be right back where we started from.


July 20, 2016  12:57 PM

Microsoft Wins One for the Data Privacy Team

Sharon Fisher
government, Microsoft, privacy, Storage

Is data owned by a company based in one country, but stored in another country, subject to seizure by warrant by the government of the first country? U.S. courts have been batting the issue back and forth since 2014, and the most recent verdict is that no, it isn’t.

The subject of the case (which has a lot of complex legal nuance) is Microsoft, which reportedly had email messages from one of its customers stored on a server in Ireland. The Department of Justice wanted access to those email messages while pursuing an unspecified case, claiming that since the email messages were controlled by Microsoft, an American company, the DoJ had jurisdiction over them even though they were stored in Ireland.

This scared people, for multiple reasons.

  • Because so many computer companies are American, it would mean an awful lot of data worldwide would be subject to access by the U.S. government.
  • Computer companies worried that worldwide customers would stop using them because they were afraid they’d get their data accessed.
  • Having the data subject to U.S. access could mean that the company – Microsoft in this case, but any company – could be violating data privacy laws in force in the second country. (For that reason, dozens of companies and civil liberties organizations – as well as the government of Ireland itself — filed amicus curiae briefs supporting Microsoft.)
  • If this precedent was set with the U.S., all the other countries in the world could declare that, in that case, all their data laws could apply to any company doing business in their countries, which could be an incredibly complicated, contradictory mess.

So, needless to say, everybody but the Department of Justice was pretty happy with the 3-0 decision by the 2nd U.S. Circuit Court of Appeals in Manhattan. “Circuit Judge Susan Carney said communications held by U.S. service providers on servers outside the United States are beyond the reach of domestic search warrants issued under the Stored Communications Act, a 1986 federal law,” writes Jonathan Stempel for Reuters. “Congress did not intend the SCA’s warrant provisions to apply extraterritorially,” Carney wrote in her decision. “The focus of those provisions is protection of a user’s privacy interests.”

For its part, the Department of Justice said that if users could stash their data overseas, it would make it hard for law enforcement to catch bad guys. While there are other methods that would give the U.S. government the ability to request the data stored in the foreign country, the DoJ said they are hard to use, Stempel writes.

For that matter, the new ruling – which applies only in the region served by the court — means a company could ostensibly choose to store data in a foreign country to keep it out of the hands of law enforcement, notes Peter Henning in the New York Times. “Imagine the possibility that a company might offer an email service — perhaps called ‘Crim Mail’ — guaranteeing users that their electronic files would be stored overseas,” he speculates. “It could even choose to put servers in a location that is notably hostile to the United States and that would welcome the chance to throw a wrench in law enforcement efforts. The company could charge a premium to have files maintained only in specified locations, making it almost impossible for investigators to ever gain access to them.” He called such a possibility far-fetched, though.

But hey! We’re not done! The Department of Justice could still appeal the appeal, as it were, which would send the issue to the U.S. Supreme Court.  “We are disappointed with the court’s decision and are considering our options,” said U.S. Department of Justice spokesman Peter Carr in a statement cited by Elizabeth Weise in USA Today, who went on to say that the government was expected to appeal.

What could happen there is anyone’s guess. First, we don’t actually know that much about the current court’s views on worldwide data sovereignty. Second, at the moment we have only eight Justices, meaning that if they have a tied decision, it would apply only to this single case, not as a precedent.  If a ninth Justice is appointed by then, who it is and what their views are will depend a lot on who gets elected President in November – not to mention which party ends up in control of the Senate, which has to approve any new Justices before they could be seated. Plus, if the current eight-member Court starts hearing testimony before the new Justice is seated, the new Justice still wouldn’t be able to rule on the case.

The appeals court judges suggested that what really needed to be done is for the U.S. government to write other laws to make it less complicated to request data stored in another country, while still giving that country sovereignty over the data stored within its borders.

“The case turned on how broadly a court can apply the Stored Communications Act, a law adopted in 1986 as part of the Electronic Communications Privacy Act to give a measure of protection to the then-nascent technology of email,” Henning writes.  “Like almost any 30-year-old law dealing with technology, it is hopelessly out of date because it has not been meaningfully updated by Congress to address how digital information is created and stored.”

Chances are, that too is something that could come out very differently depending on who gets elected President. Another reason to vote in November.

