When will a disk drive fail? The people at BackBlaze, who use 67,814 disk drives, have been looking at Self-Monitoring, Analysis and Reporting Technology (SMART) disk drive statistics to try to predict this.
They’ve been looking at this for a while, at least since 2014. As I wrote then:
“SMART defines — and measures, for those vendors that support it — more than 70 characteristics of a particular disk drive. But while it’s great to know how many High Fly Writes or Free Fall Events a disk has undergone, these figures aren’t necessarily useful in any real sense of being able to predict a hard drive failure.
“Part of this is because of the typical problem with standards: Just because two vendors implement a standard, it doesn’t mean they’ve implemented it in the same way. So the way Seagate counts something might not be the same way as Hitachi counts something. In addition, vendors might not implement all of the standard. Finally, in some cases, even the standard itself is…unclear, as with Disk Shift, or the distance the disk has shifted relative to the spindle (usually due to shock or temperature), where Wikipedia notes, ‘Unit of measure is unknown.’”
At that point, BackBlaze had determined that out of the 70 statistics SMART tracked, there was really only one that mattered: SMART 187, or Reported_Uncorrectable_Errors. At the time, BackBlaze wrote: “Drives with 0 uncorrectable errors hardly ever fail. Once SMART 187 goes above 0, we schedule the drive for replacement.”
Since then, the company has been looking at the SMART statistics some more, and it’s now added four other statistics that it’s determined have a correlation to failed drives, writes senior marketing manager Andy Klein:
- SMART 5 Reallocated Sectors Count
- SMART 188 Command Timeout
- SMART 197 Current Pending Sector Count
- SMART 198 Uncorrectable Sector Count
The company didn’t say why it started looking at other SMART statistics if it had already determined that one statistic was correlated with failure. (It also makes its raw statistics available in case you want to play with correlations yourself.) Klein also points out that not all vendors report all the statistics, or report them in the same way — an issue the company noted two years ago as well.
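The replacement rule the post describes amounts to a simple check: flag any drive where one of the five watched attributes rises above zero. The sketch below shows the idea; the field names and input format are illustrative assumptions, not Backblaze’s actual schema.

```python
# Hypothetical sketch: flag drives when any of the five SMART attributes
# Backblaze highlights rises above zero. Attribute IDs are from the post;
# the data layout here is invented for illustration.
WATCHED_ATTRS = {
    5: "Reallocated Sectors Count",
    187: "Reported Uncorrectable Errors",
    188: "Command Timeout",
    197: "Current Pending Sector Count",
    198: "Uncorrectable Sector Count",
}

def drives_to_replace(drives):
    """drives: iterable of dicts mapping SMART attribute IDs to raw values."""
    flagged = []
    for drive in drives:
        tripped = {attr: drive["smart"].get(attr, 0)
                   for attr in WATCHED_ATTRS
                   if drive["smart"].get(attr, 0) > 0}
        if tripped:
            flagged.append((drive["serial"], tripped))
    return flagged

fleet = [
    {"serial": "ZA001", "smart": {5: 0, 187: 0}},
    {"serial": "ZA002", "smart": {187: 20, 197: 4}},
]
print(drives_to_replace(fleet))  # only ZA002 trips, on attributes 187 and 197
```

In practice the raw values would come from something like `smartctl -A` or from Backblaze’s published CSV data rather than hand-built dicts.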
Another factor is the period of time in which the errors occur, Klein writes. “For example, let’s start with a hard drive that jumps from zero to 20 Reported Uncorrectable Errors (SMART 187) in one day,” he writes. “Compare that to a second drive which has a count of 60 SMART 187 errors, with one error occurring on average once a month over a five year period. Which drive is a better candidate for failure?” He doesn’t actually say, though he implies that it’s the first one.
Incidentally, BackBlaze has even started looking at High Fly Writes as a possible indicator of future disk failure. “This stat is the cumulative count of the number of times the recording head ‘flies’ outside its normal operating range,” Klein explains, noting that while 47 percent of failed drives have a SMART 189 value of greater than zero, so do 16.4 percent of drives that work. “The false positive percentage of operational drives having a greater than zero value may at first glance seem to render this stat meaningless. But what if I told you that for most of the operational drives with SMART 189 errors, that those errors were distributed fairly evenly over a long period of time?” he asks. “For example, there was one error a week on average for 52 weeks. In addition, what if I told you that many of the failed drives with this error had a similar number of errors, but they were distributed over a much shorter period of time, for example 52 errors over a one-week period. Suddenly SMART 189 looks very interesting in predicting failure by looking for clusters of High Fly Writes over a small period of time.”
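The burst-versus-steady distinction Klein draws can be expressed as a rate check over a recent window. This is a minimal sketch of that idea only; the window size and threshold are illustrative assumptions, not Backblaze’s actual model.

```python
# A sketch of the "clustering" idea: the same lifetime error count is far
# more alarming when it arrives in a burst. Window and threshold are
# illustrative, not Backblaze's real parameters.
def error_rate(timestamps, window_days=7):
    """Errors per day within the most recent window (timestamps in days)."""
    if not timestamps:
        return 0.0
    latest = max(timestamps)
    recent = [t for t in timestamps if latest - t <= window_days]
    return len(recent) / window_days

def looks_like_imminent_failure(timestamps, window_days=7, errors_per_day=1.0):
    """Flag a drive whose recent error rate exceeds the threshold."""
    return error_rate(timestamps, window_days) > errors_per_day

# Klein's two scenarios: one error a week for 52 weeks,
# versus 52 errors crammed into a single week.
steady = [7 * i for i in range(52)]        # spread over about a year
burst = [360 + i / 8 for i in range(52)]   # clustered in about a week
print(looks_like_imminent_failure(steady)) # False: ~0.3 errors/day recently
print(looks_like_imminent_failure(burst))  # True: ~7 errors/day recently
```

The same check works for SMART 187 counts or High Fly Writes; only the threshold would differ per attribute.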
That’s not to say that any of these statistics, or even a combination of them, is a perfect predictor of when a disk drive is going to fail – or when it isn’t. “Operational drives with one or more of our five SMART stats greater than zero – 4.2 percent. Failed drives with one or more of our five SMART stats greater than zero – 76.7 percent,” writes Klein. “That means that 23.3 percent of failed drives showed no warning from the SMART stats we record.” But if nothing else, watching for these particular SMART errors sounds like a good way to bet.
Disclaimer: I am a BackBlaze customer.
Technology-assisted review products for electronic discovery are growing more sophisticated. At the same time, as in other industries, big data is giving attorneys much more data to deal with, including from the Internet of Things, as everyday items such as televisions and refrigerators collect information about their users. That’s according to a recent session on the future of ediscovery, held partly to commemorate the tenth anniversary of ediscovery. The session was held in late October as part of the Masters Conference for Legal Professionals at the Capital Hilton in Washington, D.C.
For example, predictive coding enabled Sidley Austin partner Robert Keeling to determine that, in one particular Foreign Corrupt Practices Act investigation, participants used “chocolate” whenever they meant “cash” in the context of bribes, writes Zoe Tillman in Legaltech News.
(Apparently equating chocolate with cash in bribes isn’t even all that uncommon; it has happened with a Malaysian company as well as with the Mexican company Keeling was likely referring to. “A subsidiary paid routine bribes referred to as ‘chocolates’ to Mexican officials in order to obtain lucrative sales contracts with government hospitals,” wrote the Securities and Exchange Commission about the latter 2012 case. “The ‘chocolates’ came in the form of cash, laptop computers, televisions, and appliances that were provided directly to Mexican government officials or indirectly through front companies that the officials owned. The bribery scheme lasted for several years and yielded nearly $5 million in illegal profits.”)
Such predictive coding tools fall in general into the realm of technology that can help attorneys more easily do their jobs, Tillman writes. Other examples include:
- Programs that can analyze audio files and images, pieces of information that lawyers typically had to review manually
- Tools that allow lawyers to more easily redact records
- More sophisticated coding that goes beyond text and uses other internal data such as pixels to categorize data
In addition to finding ways to deal with the firehose of data from the Internet of Things, attorneys also have to pay attention to the security implications of using particular categories of data, Tillman writes. Whether the accumulation of such data is considered helpful or creepy depends on how convenient it is, panelists noted.
“It can be a bit off-putting, the panelists quipped, when a married couple is merely discussing the possibility of having children and coupons appear in the mail for diapers,” writes Tera Brostoff in Bloomberg BNA. “Your phone, your car or even your refrigerator is creating a ‘pattern of life’ on you. But the question of who is looking at that data and how it’s being stored is something you might not consider until you face litigation.”
IoT data is particularly useful in labor and employment law, Brostoff writes, including data collected from sensors such as the global-positioning system (GPS) in a car, a FitBit or other wearable technology, or the RFID chip in an employee’s ID card. “For example, we now have your GPS data saying that you were at McDonald’s for four hours during the middle of the work day,” she writes.
But as always, the panelists said, the biggest challenge in ediscovery is convincing organizations to use the tools, Tillman writes. “A big challenge for practitioners continues to be convincing corporate clients to get on board with new [technology aided review] (TAR) products,” she writes. In particular, some are concerned about the security implications of sending data to the cloud, she adds. Vendors are looking to forestall these concerns by integrating ediscovery tools into their existing cloud-based products, she writes.
Apparently OpenText’s June acquisition of Recommind has reminded everyone that the e-discovery M&A market is alive and well: LDiscovery has just acquired Kroll Ontrack for a reported $410 million.
The all-cash transaction is expected to close in the fourth quarter. The combined organization is said to have 12,000 clients worldwide, “including some of the world’s largest financial institutions, Fortune 1000 companies and Am Law 100 firms,” along with 1,300 employees in offices in 19 countries. Kroll CEO Mark Williams will be CEO of the combined companies.
As you may recall, Kroll had been dropped from the Leaders to the Challengers quadrant in the 2015 Gartner E-discovery Magic Quadrant, due to what Gartner felt was a lack of vision.
And just who is LDiscovery? Identified as “a provider of eDiscovery and Information Governance solutions,” what’s interesting about it is that the company has typically been associated with e-discovery vendor kCura, which never fails to mention that it was named a Leader in the 2015 Magic Quadrant.
But if LDiscovery was in the market for an e-discovery vendor, what kept it from acquiring the one it already worked with? As recently as early this month, LDiscovery was named a finalist for the Relativity (kCura’s product) Innovation Awards for Best Service Provider Solution, adding that “LDiscovery is the only provider to be a finalist three years in a row.” So what will happen to its relationship with kCura? Especially after LDiscovery CEO Chris Weiler called Ontrack “best in class”?
(Incidentally, has anyone else noticed that Gartner hasn’t yet released its E-discovery Magic Quadrant for 2016? Typically it comes out in early summer. Could it be that they’re frantically rewriting it to accommodate these acquisitions?)
On the other hand, the acquisition may not be that weird, writes Gabe Friedman for Bloomberg Law. “Both companies operate in the review space, rather than the software space,” he writes. “Both companies are partners with kCura, and employ dozens of certified experts and administrators of its Relativity software review platform.”
And it’s not out of the question that LDiscovery – or, rather, its parent, the Carlyle Group – might pick up kCura in the future. “The Carlyle Group, working with Revolution Growth, purchased LDiscovery for $150 million this past January,” Friedman writes. “LDiscovery itself had purchased six companies in the past two years. Will Darman, the managing director at Carlyle who led the purchase of LDiscovery, has predicted there will be continued consolidation in the e-Discovery review market.”
Weiler had noted during the January acquisition that it intended to use the money for further acquisitions, particularly in other geographic areas. Kroll is based in New York, while LDiscovery is based in Tysons Corner, Va.
Interestingly, Moody’s said of Kroll’s parent company, Corporate Risk Holdings, that it had “concerns regarding the company’s weak liquidity and the sustainability of its capital structure given its very high debt leverage.” Moody’s plans to re-examine Corporate Risk’s ratings in light of the acquisition.
Scientists and researchers love finding new and interesting ways to store data, especially as the traditional spinning disk and even flash drive technologies are starting to run into the laws of physics. The newest one? DNA data storage.
Actually, the notion started out as a joke, writes Andy Extance in Nature. In 2011, biological information scientists were trying to figure out how they could afford to store the amount of genome sequences they had, and were so frustrated by the expense and limitations of conventional computing technology that they started kidding about sci-fi alternatives like using DNA. “Then the laughter stopped,” he writes. “It was a lightbulb moment.”
Scientists at the European Bioinformatics Institute and at Harvard University each demonstrated a proof-of-concept of the idea in 2013, storing just 0.74 megabytes. What’s different now is that Microsoft, which has been taking the lead in DNA storage research, was able to save 200 megabytes in DNA, more than a 270-fold improvement over 2013.
In fact, at one point, you could even buy a DNA storage device on Amazon, which stored 512 KB—enough for a small photograph or a document, according to the product description.
The downside? The cost, writes Tobi Ogunnaike in SingularityHub. “The current cost of DNA data storage is not attractive,” he writes. “Storing digital data in DNA involves both reading and writing DNA. While the price of reading DNA (DNA sequencing) has fallen sharply, the price of writing DNA (DNA synthesis) currently remains prohibitively high for data storage.”
The Microsoft experiment might have cost on the order of $150 million, writes Andrew Rosenblum in MIT Technology Review. “Microsoft won’t disclose details of what it spent to make its 200-megabyte DNA data store, which required about 1.5 billion bases,” he writes. “But Twist Bioscience, which synthesized the DNA, typically charges 10 cents for each base. Commercially available synthesis can cost as little as .04 cents per base. Reading out a million bases costs roughly a penny.” On the other hand, costs are dropping rapidly, he adds. “It would have cost about $10 million to sequence a human genome in 2007 but close to only $1,000 in 2015.”
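The figures Rosenblum cites are easy to sanity-check: 1.5 billion bases at Twist’s typical 10 cents per base lands right at the $150 million estimate, while the cheaper commercial synthesis rate and the sequencing cost come out far lower. A quick back-of-the-envelope calculation:

```python
# Back-of-the-envelope check on the cost figures cited in the article.
bases = 1_500_000_000                 # ~1.5 billion bases for 200 MB
list_price = bases * 0.10             # $0.10 per base (Twist's typical rate)
budget_price = bases * 0.0004         # 0.04 cents = $0.0004 per base
read_cost = bases / 1_000_000 * 0.01  # ~a penny per million bases sequenced
print(f"${list_price:,.0f}")    # $150,000,000 -- matches the $150M estimate
print(f"${budget_price:,.0f}")  # $600,000 at commodity synthesis rates
print(f"${read_cost:,.2f}")     # $15.00 to read it all back
```

The striking asymmetry — writing the data for $150 million, reading it back for $15 — is exactly the synthesis-versus-sequencing gap Ogunnaike describes.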
Another downside? The speed, Extance writes. “DNA storage would be pathetically slow compared with the microsecond timescales for reading or writing bits in a silicon memory chip,” he writes. “It would take hours to encode data by synthesizing DNA strings with a specific pattern of bases, and still more hours to recover that information using a sequencing machine.”
But DNA storage offers one big advantage: density, or the amount of data it could store in a small space, Extance adds. “By 2040, if everything were stored for instant access in, say, the flash memory chips used in memory sticks, the archive would consume 10–100 times the expected supply of microchip-grade silicon,” he warns. Similarly, a data center holding an exabyte (one billion gigabytes) on tape drives would require $1 billion over 10 years to build and maintain, as well as hundreds of megawatts of power, he writes. “If information could be packaged as densely as it is in the genes of the bacterium Escherichia coli, the world’s storage needs could be met by about a kilogram of DNA.”
For example, the 200-megabyte Microsoft store occupied a spot in a test tube much smaller than the tip of a pencil, writes the company, which performed the experiment with the University of Washington.
DNA storage could also last longer and be more stable than existing data storage methods, writes John Markoff in the New York Times. “Magnetic disks, tape and even optical storage systems safely store information at most for only a handful of decades,” he writes. “The recent advances suggest there may be a new way to store the exploding amount of computer data for centuries rather than decades.” Keep in mind that readable DNA has been found in fossils thousands of years old.
Of course, DNA storage isn’t a panacea. There are a number of open questions about a new kind of storage system, Ogunnaike writes, such as what sort of security and user interface it will have. “Will we all have DNA sequencers, DNA synthesizers, and algorithms that translate digital data into biological data in our phones, our homes, or our local community biohacker spaces? Or will these capacities be restricted to companies?” he writes. “In either scenario, how easily we can interact with DNA data storage technology might affect how quickly we adopt this technology.”
Other complications include error correction, retrieving just portions of the data, and avoiding the sorts of base sequences that cause DNA strings to fold up, Extance writes.
And what if something you save accidentally creates life? Hmm. It could be the next Jurassic Park movie.
We’ve written before about USB sticks that could set computers on fire. Now it’s not just a rumor – it’s something you can order now for less than 50 Euros. And it has free shipping!
Ostensibly for people to test how well PCs and other devices can handle power surge attacks, “the USB Kill collects power from the USB power lines (5V, 1 – 3A) until it reaches ~ -240V, upon which it discharges the stored voltage into the USB data lines,” according to the product website. “This charge / discharge cycle is very rapid and happens multiple times per second. The process of rapid discharging will continue while the device is plugged in, or the device can no longer discharge – that is, the circuit in the host machine is broken.”
Companies that make electronic equipment need these devices to help ensure that their products can handle such attacks. “In our tests, over 95% of devices are affected by a USB power surge attack,” notes the website. “Almost all consumer-level hardware fails when tested against the USB Kill. The most frequent outcome is the complete destruction of the device (laptops, tv, telephones, etc).”
“Brutally efficient!” the website proclaims. While the intention is not to destroy data on the target system, “depending on the hardware configuration (SSD vs Platter HDD), the drive controllers may be damaged to the point that data retrieval is impractical,” the website warns. A demonstration of the device basically looks like you’re hot-wiring a teeny-weeny car.
The first batch of stock – the company doesn’t say how many – was shipped on September 19, according to the company blog. “As you may have noticed, due to overwhelming demand, we are currently on backorder,” the blog notes. “Each delivery of stock is larger than the previous.” With weekly shipments, the company pledged to clear its backorders by early October.
Now, the company is aware of what people might want to use these devices for, and says in no uncertain terms that that would be a no-no. “USBKill.com strongly condemns malicious use of its products,” notes the company’s FAQ. “A hammer used maliciously can permanently damage to a third party’s device. The USB Killer, used maliciously, can permanently damage a third party’s device. As with any tool, it is the individual, not the manufacturer of the tool, responsible for how the individual uses the tool.”
(The company also sells a tester device that lets users test their own devices without putting them at risk.)
Security experts were skeptical of the company’s claim for a use case. “Are there companies that hire security testers who say ‘Oh, please fry some electronics if you find an easy way to do so’? Are there security testers whose contracts say ‘We will cause property damage to show you your vulnerabilities, rather than telling you about them’?” scoffs one commenter on security expert Bruce Schneier’s blog, Schneier on Security. “For a security tester, you want something that highlights the vulnerability in a clear way, and causes no damage (or at least as little as possible consistent with finding the vulnerability). If I’m testing whether people will plug in random USB sticks, plugging it in should either report the information to me (probably using the 5V to power a transmitter, rather than trying to rely on a network connection that might not be there), or announces its presence to everyone nearby (how loud a siren can you power via USB?).”
At the same time, some also scoffed at the notion that disgruntled employees would buy these devices to get even with an employer. “A disgruntled employee can slip with his sports drink on someone’s desktop in front of a dozen people, say ‘Oh, my bad, I’m so sorry,’ and totally get away with it,” noted one. “A disgruntled employee who slaps one of these in someone’s desktop had better be doing it when there are no witnesses about, because otherwise there is going to be a really straightforward prosecution for criminal damage. Involving exhibits like ‘camouflaged criminal sabotage device purchased over the internet.’”
What was going to be more of a problem, they noted, would be people who found these devices scattered around and plugged them into systems to see what they did (or so they could return them to their owners). “It’s true that there are other ways to destroy a computer, but your average office worker is not going to destroy company property for no reason,” one wrote. “You can expect, however, that many will plug a USB device of unknown origin into their work computers. Destructive USB devices given away as ‘freebies’ can cause a lot damage.”
Even that, though, wasn’t as damaging as it could be, experts noted. “So you want to plug this into a target to fry it? You’re limited to whatever you physically had access to, and you might as well have poured a vial of who-cares-what-kind-of-goo into the port instead. No more nor less difficult to get away with on-camera,” noted one. “Why would you do that when you could use a USB that actually appears functional (so no immediate suspicion) but contains a pathogen that can reach beyond the mookstation into the deeper network?”
Political and foreign policy ramifications aside, Congress’ recent override of President Barack Obama’s veto of a bill to let 9/11 victims’ families sue Saudi Arabia has the potential for some really interesting e-discovery implications.
In case you haven’t watched CNN lately, Congress put forth a bill, the Justice Against Sponsors of Terrorism Act (JASTA), that would allow citizens, such as family members of 9/11 victims, to sue the nation of Saudi Arabia for its role in the terrorist attack; 15 of the 19 hijackers were Saudi nationals. President Obama vetoed the bill because of a sovereign immunity law that protects foreign governments from American lawsuits. However, in what appears to be the one thing that Democrats and Republicans could agree on in eight years, both houses of Congress had a decisive override vote of the President’s veto, meaning JASTA is in force.
“Before the law, American victims of terrorist acts could only sue countries designated by the U.S. State Department as state-sponsors of terrorism, currently Iran, Syria and Sudan,” writes Mica Rosenberg for Reuters. “Now, any country can be sued if there are allegations of support for known terrorists that carry out attacks on U.S. soil.” More than 2,000 family members sued Saudi Arabia in 2003, a suit that has been stalled due to the sovereign immunity law, she adds.
If nothing else, the law will likely become the Attorneys’ Full Employment Act of 2016. “The legal battle could last for years, and would be waged using thousands of pages of documents, deposition transcripts and official government investigations,” writes Mark Mazzetti in the New York Times. “It could end in millions – or billions – of dollars’ worth of Saudi assets being seized in a court settlement, or a judgment that largely vindicates the Saudi government, which for years has insisted it had no role in the deadly plot.”
Aside from this specific case, there is also the question of precedent, both for other countries, including U.S. allies, and for the U.S. itself, if foreign countries pass similar laws and decide that U.S. actions could be considered terrorism, such as drone strikes and abuses committed by U.S.-trained police units or U.S.-backed militias, writes the Associated Press.
“If JASTA were enacted, courts could potentially consider even minimal allegations accusing U.S. allies or partners of complicity in a particular terrorist attack in the United States to be sufficient to open the door to litigation and wide-ranging discovery against a foreign country — for example, the country where an individual who later committed a terrorist act traveled from or became radicalized,” the President wrote in his veto message.
Defense Secretary Ash Carter wrote Rep. Mac Thornberry of Texas, the Republican chairman of the House Armed Services Committee and an opponent of the legislation, before the vote, warning him of the discovery implications.
“In the letter, Carter described the potential for litigants to seek classified intelligence data and analysis and sensitive operational information to establish their cases in what could be an ‘intrusive discovery process,’” writes Richard Lardner for the Associated Press. “If the U.S. were sued overseas, a foreign court would decide whether the information should be protected from disclosure, he said. Paradoxically, the information could be central to a defense against the accusations.”
“Disclosure could put the United States in the difficult position of choosing between disclosing classified or otherwise sensitive information or suffering adverse rulings and potentially large damage awards for our refusal to do so,” Carter wrote, according to Lardner.
Conceivably, lawsuits could be filed that have no hope of succeeding, just for the discovery they could involve. “Whether or not a case has merit, the U.S. would still have to defend itself or its troops against private litigants in foreign courts,” Lardner writes. “That raises the possibility of service members who refuse to appear or to divulge classified information being held in civil or criminal contempt by the foreign court, according to Carter.”
Meanwhile, after realizing the implications of its bill, now law, Congress is considering rewriting it – though not until after its election recess. “Both House Speaker Paul Ryan and Senate Majority Leader Mitch McConnell said that the measure could have unintended consequences — including the fact that it could leave U.S. soldiers open to retaliation by foreign governments,” write Steven Dennis and Billy House for Bloomberg. “But Republicans said the White House didn’t make a forceful case, putting themselves in the awkward position of blaming the president for a bill they enacted into law over Obama’s veto.”
“So, basically, Congress’s excuse is that they didn’t realize that the president was seriously opposed to JASTA despite the fact that he vetoed it, publicly articulated why he vetoed it and personally warned congressional leaders about the implications,” explains Daniel Drezner in the Washington Post.
For now, attorneys are already gearing up, with action leading to discovery likely as early as next week, writes Aliyah Frumin for NBC News.
When former Texas Governor Rick Perry left office in 2014, the records he left to the State Library and Archives Commission included more than 7 terabytes of data, writes Isabelle Taft in the Texas Tribune. But there weren’t any digital archives to put it in.
“While the governor’s 4,000 cubic feet of paper could be sorted, itemized, boxed and shelved alongside other state records dating back centuries, [state archivist Jelain] Chubb and her staff had no system in place to store thousands of gigabytes of photos, emails and audio and video recordings, much less make them available to the public,” Taft writes. “The Perry collection presented a Texas-sized challenge for a commission that had no capacity to manage the ‘born-digital’ records — those with no paper or analog footprint.”
In many ways electronic records are safer and more durable than paper ones—they can easily be backed up, replicated, and sent elsewhere to prevent a Library of Alexandria-type conflagration. Moreover, electronic records are typically easier to gain access to than paper records because people don’t have to travel to where the records are. That also helps make government more transparent. But that’s all predicated on the data still being readable electronically.
“Ancient hieroglyphics and scrolls have survived centuries, but digital storage is fragile, the files easily swept away or locked up in encryption,” writes Arielle Pardes in Vice.com. “The technology we use to store things today might not be around tomorrow, and many of the platforms we use to store information are owned by private companies, which makes it harder for archival institutions to save them. And how much of what we upload online is worth saving at all?”
There are three main problems with maintaining archives of born-digital material, Pardes writes:
- It requires the hardware to read it.
- It requires power for the hardware.
- It requires software—often proprietary, and sometimes copyrighted—to read it.
This is particularly true as data storage goes to private companies in the cloud—such as Facebook—rather than on software that we own, Pardes warns. “Many of the sites we use that are free, or that you rent space on, like a wedding site, they’re private companies,” she quotes historian Abby Smith Rumsey as saying. “You don’t have ownership of it.”
That was the problem with some of Perry’s data, which dated back to 2000 and in some cases used formats that are no longer around, such as WordPerfect, Adobe PageMaker, VHS tapes, CDs and raw camera files, Taft writes. “Many files had to be reformatted so the public could view them with contemporary software.”
After the Perry project, the Texas data archive staff is now working on a similar project for Texas state agencies. “Arguably the most important function of the digital archive, however, is still under development: the ability to ingest and display the born-digital archives of state agencies,” Taft writes. “Archivists are currently working with three pilot agencies — the Texas Historical Commission, the Office of the Attorney General and the Railroad Commission — to get their electronic records from the late 1990s and early 2000s on the digital archive.”
Unfortunately, the group is running into the same problem as with Perry’s data. “Texas requires state agencies to preserve records if the state archives can’t yet take them,” Taft writes. “But floppy disks loaded with files can decay until they’re unreadable. Emails are often deleted to free up expensive storage space. And some formats are already obsolete.”
EMC is dead. Long live Dell Technologies. With Dell’s acquisition of EMC finalized earlier this month, it may seem like the end to a very long, long, long story. But although the merger is complete, there is still plenty of potential Dell-EMC fallout to come.
Layoffs. Reports are that 2,000 to 3,000 people are slated to be laid off from the combined companies. A lot of them are going to be administrative people – HR, finance, all the various business functions that are more or less duplicated by the two companies. This will also be a good time to see, however, what users can expect from Dell Technologies products going forward, if developers start getting laid off as well. And what methodology is Dell going to use? If there are two competing products, will the Dell product naturally survive and the EMC product get the ax? Vice versa? Will they just pick the bigger product? A look at who gets laid off will reveal a lot about the merged companies’ philosophy.
Executives. Yes, all of this was an elaborate way for EMC CEO Joe Tucci to get out of formally naming a successor. But what will happen to all those former EMC executives? EMC was known for having a deep, if unexciting but businesslike, bench of potential managers. Are they going to stick around? Former VMware and Pivotal CEO Paul Maritz has already retired. Will people like VMware CEO Pat Gelsinger join him? Are they going to get buyouts? Will they start spinning off companies? It will be interesting to see.
Ancillary pieces. Individually, EMC and Dell were so big that there were pieces of them that people completely forgot about. (For instance, did you remember EMC bought RSA Security in 2006? I had it in my head that they’d been owned by Symantec. Nope.) As the two companies merge, there’s going to be all sorts of things they find in the back of closets as they set up housekeeping together. Dell has already offloaded its software group (Dell had a software group?). EMC has already offloaded poor, sad Documentum to OpenText; it remains to be seen to what degree OpenText continues to support it, or simply expects all the Documentum users to migrate to OpenText’s own software. And what will happen to RSA? Data Domain? Virtustream? Pivotal Labs?
VMWare. Of course, the biggest ancillary piece – and the source of a lot of EMC’s profit – is VMWare. (There was even talk that VMWare might acquire EMC, though the rumor that the merged company would have been called Ouroboros was just that, a rumor.) The companies insist they don’t plan to sell it, but some financial analysts say the acquisition might not pencil out unless they do. What sort of support will a software company get from its parent hardware company? And how will VMWare’s independent reputation among users change when it is owned by a server company?
Culture. Austin and Boston may rhyme, and may both be tech hubs, but their cultures are pretty different. How well barbecue is going to mix with baked beans, only time will tell. One area they do have in common? Both companies have been outspoken in their support of equal rights for gay people, such as opposing the so-called religious freedom bills in Indiana and Georgia.
And finally. What will Elliott Management Co. – having now withdrawn its dripping scimitar from the still-writhing corpse of EMC – be looking at as its next activist investment target? As it turns out, speaking of Symantec, it looks like that company is on the radar: The other EMC more than doubled its holding in the most recent quarter and now owns on the order of 3 percent of the company.
People don’t talk about it much yet, but the data that the increasing number of drones are collecting has the potential to cause a lot of problems.
“In 2013, only 128,000 units in this product category were estimated to have sold in the United States. Many of these were in the low-cost ‘toy’ category, with poor flight control characteristics, limited range and relatively poor image quality,” writes Wes Brewer, VP and GM of Lexar, a global brand of Micron Technology, for the SD Association. “Now, the CTA has forecast that more than 2.8 million units will be sold in the United States in 2016, while Business Intelligence pegged the worldwide market at 7.1 million units.”
While drones started out with built-in storage, increasingly they are being built to use microSD cards, Brewer writes, noting that a GoPro Omni 4K can capture as much as 162 gigabytes of data per hour. Aside from the drones themselves, the software and hardware they need to run could add up to a $127 billion annual market, according to PricewaterhouseCoopers.
All that data has the potential to cause two big problems.
Size. Granted, not all drones are created equal. The quadcopter you got for Christmas that uses a microSD card isn’t the same as a military drone, such as the Wide Area Reconnaissance Hyperspectral overhead Real-Time Surveillance Experiment (WARHORSE), where a single frame could amount to 20 gigabytes, according to William Arkin’s book Unmanned. And a single government drone flight can amount to 400 terabytes of data, writes Yasmin Tadjeh in National Defense.
Nonetheless, even if we’re just talking about non-governmental drones, we’re still talking about the potential for an awful lot of data. Where’s it all going to go? And how? Are people going to upload it all to the cloud? How’s all that storage going to be paid for?
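To get a rough sense of the scale, here’s a back-of-the-envelope calculation. The unit count and capture rate come from the figures above; the number of recorded flight hours per drone per year is a made-up assumption for illustration.

```python
# Back-of-the-envelope estimate of consumer drone data volume.
# GB_PER_HOUR and US_UNITS_2016 come from the figures cited above;
# HOURS_PER_YEAR is a hypothetical assumption for illustration only.

GB_PER_HOUR = 162          # GoPro Omni 4K capture rate
US_UNITS_2016 = 2_800_000  # CTA forecast for 2016 US drone sales
HOURS_PER_YEAR = 10        # hypothetical: 10 hours of recorded flight per drone

total_gb = GB_PER_HOUR * US_UNITS_2016 * HOURS_PER_YEAR
total_pb = total_gb / 1_000_000  # decimal gigabytes to petabytes

print(f"{total_pb:,.0f} PB per year")  # prints "4,536 PB per year"
```

Even at a modest ten hours of recording per drone, that’s thousands of petabytes a year, which is why the where-does-it-all-go question isn’t hypothetical.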
It’s enough to make all the data that the National Security Agency is reportedly storing sound like a drop in the bucket. This is especially true now that the Federal Aviation Administration has granted permission to small businesses ranging from farmers to real estate agents to send up drones weighing less than 55 pounds.
“Under the new rules, operators would register their drones online, pass an aviation knowledge exam for drone pilots at an FAA-approved testing center and then they’re good to go,” writes Joan Lowy for the Associated Press. “That’s a big change since operators currently have to have a manned aircraft pilot’s license.” Moreover, the FAA really doesn’t have much of an enforcement mechanism, she adds.
Security. Once you’ve collected all that data, and put it somewhere, you need to protect it. “How you are going to handle that data, which can be massive, especially in the case of video, LiDAR surveys, or other sensor files?” writes drone software specialist VirtualAirBoss. “And, since the data may reveal sensitive information about key property, progress, or infrastructure, how will you ensure data security and integrity?”
Keep in mind that, in the mid-2000s, there was a guy in England named Tom Berge. And Tom got really good with Google Earth. So good, in fact, that he used it to look for lead roofs on historic buildings in southern London, and then he’d scale the building, peel off the lead, and sell it. Altogether, he was thought to have hit 30 buildings, earning more than 100,000 pounds.
So if a guy could do that with Google Earth, how long is it going to take burglars to learn how to do this with drones? And as it happens, not long at all. As long ago as 2014, stories started being reported about burglars who were using drones to pinpoint targets. Even if you’re collecting the data only for good, the repository is going to be pretty tempting for some people – assuming they don’t just nab your drone out of the sky to grab the data on it, or replace it with something else.
If all of this is making you think, “Didn’t we just go through all this with police body cams?” it should. The situations are very similar – except worse. “In many ways, the data challenge presented by UAVs is no different than that created by body cameras or smartphones,” writes Hank Hogan for Gov Tech Works. “All can generate digital data that must be managed. However, one key difference is that as a law enforcement aircraft flies into place, it will pass over – and image – other structures and possibly people. It is tempting for UAV managers to say that unintended imaging is no problem; can’t they simply discard everything that is not evidentiary? But that raises critical legal questions: What if the accused wants to see every inch of the video? Should law enforcement then retain everything? Given that the legal system can take years to run its course, is the agency in question responsible for taking, compressing, housing and managing all of the data for that entire time?”
We’ve all experienced the sort of person who performs something badly on purpose in order to get out of doing it. Whether it’s a secretary who makes coffee badly because she doesn’t think it should be her job to make coffee, or a husband who does household chores or child-raising badly so he doesn’t get asked again, we all know of examples.
Now a Massachusetts Institute of Technology (MIT) graduate student is claiming that the Federal Bureau of Investigation (FBI) is deliberately using old, crappy software as an excuse for getting out of responding thoroughly to Freedom of Information Act (FOIA) requests.
“Freedom of Information Act (Foia) researcher Ryan Shapiro alleges ‘failure by design’ in protocols at the Department of Justice (DoJ) for responding to public requests,” writes Sam Thielman in The Guardian. “The Foia law states that agencies must ‘make reasonable efforts to search for the records in electronic form or format.’”
Instead of using Sentinel, a $425 million web-based package that came out in 2012 and can search the full text of FBI records, the FBI uses the Automated Case System (ACS), a 21-year-old mainframe-based software package that can search only keywords. In other words, if an agent didn’t happen to tag a record with the right keywords, the record wouldn’t come up in a search, even if the words appeared in the text of the report itself, Thielman writes.
Even when nothing comes up using ACS, the FBI reportedly refuses to use Sentinel, claiming that it would be duplicative and wasteful, Shapiro told The Guardian.
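The practical difference between the two search models is easy to sketch. In this toy illustration (the records and keywords are invented, not from actual FBI systems), an ACS-style search matches only against a hand-assigned keyword list, while a Sentinel-style search scans the full text:

```python
# Toy illustration of keyword-index search (ACS-style: matches only
# agent-assigned keywords) vs. full-text search (Sentinel-style).
# All records, keywords, and text here are invented for illustration.

records = [
    {"id": 1, "keywords": {"surveillance"},
     "text": "Report discusses animal rights activists in Boston."},
    {"id": 2, "keywords": {"activists", "boston"},
     "text": "Surveillance summary of activists."},
]

def keyword_search(term):
    """Return IDs of records whose assigned keyword list contains the term."""
    return [r["id"] for r in records if term.lower() in r["keywords"]]

def fulltext_search(term):
    """Return IDs of records whose body text contains the term anywhere."""
    return [r["id"] for r in records if term.lower() in r["text"].lower()]

# The word "activists" appears in record 1's text, but only record 2
# was tagged with it -- so the keyword search misses record 1.
print(keyword_search("activists"))   # prints [2]
print(fulltext_search("activists"))  # prints [1, 2]
```

A record that was never tagged with the right keyword is simply invisible to the keyword index, no matter what its text says, which is exactly the gap Shapiro alleges.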
Shapiro himself is quite a piece of work. He is a self-styled FOIA expert, called by some a FOIA Superhero, filing an average of two FOIA requests a day. These requests range from records about the Central Intelligence Agency’s role in the arrest of Nelson Mandela in 1962 to whether the American Egg Board improperly used its powers against a particular kind of mayonnaise because it didn’t use eggs in it.
Interestingly, Shapiro’s most recent FOIA efforts are not just about obtaining various documents themselves, but also about helping him understand how the FOIA process works at the FBI. “I design each request not only to hopefully produce the requested records, but also to further illuminate the agency’s FOIA operations,” he explains to George LeVines at MuckRock, a nonprofit organization devoted to the FOIA. “Though it of course should not need to be this way, developing as intimate a familiarity as possible with an agency’s internal filing systems, databases, and FOIA practices is frequently the key to success.”
For example, Shapiro has been including waivers from individuals with his FOIA requests in order to circumvent FBI restrictions about including those individuals’ personal information in its responses.
In response, the FBI is resorting to some interesting tactics, such as claiming that while Shapiro’s individual requests are all legitimate and unclassified, added up together they result in a threat to national security because they make it too easy to see how the FBI operates.
“Invoking a legal strategy that had its heyday during the Bush administration, the FBI claims that Shapiro’s multitudinous requests, taken together, constitute a ‘mosaic’ of information whose release could ‘significantly and irreparably damage national security’ and would have ‘significant deleterious effects’ on the bureau’s ‘ongoing efforts to investigate and combat domestic terrorism,’” writes Will Potter in Mother Jones. “So-called mosaic theory has been used in the past to stop the release of specific documents, but it has never been applied so broadly.”
Moreover, Shapiro can’t find out why, because the information was provided to the judge in a secret ex parte, in camera letter. “They argued the threat to national security is so severe that they can’t even tell us why,” he told Amy Goodman at Democracy Now!
FBI Won’t Tell Him Why
Similarly, when he filed a FOIA request about the handling of his FOIA requests, the FBI refused, until a judge ruled in February that it couldn’t do that.
“Shapiro and his co-plaintiffs asked for more information about the process by which they had been so often refused,” writes Sam Thielman in The Guardian. “And those requests for clarifying information were categorically denied on the grounds that any information about the FBI’s reasons for denying previous Foia requests were by their very nature secret.”
“The FBI has long been notorious for the frequent poverty of the document searches it performs in response to FOIA requests,” writes The Sparrow Project, an organization devoted to social change. “The consistent inadequacy of the FBI’s FOIA searches has even led to the Bureau receiving an ‘award’ for ‘worst Freedom of Information Act performance by a federal agency’ from a leading open government group, as well as a declaration from the Associated Press that, ‘If information were a river, the FBI would be a dam.’”
Perhaps someone could file a FOIA request on the FBI’s use of its new software.