Yottabytes: Storage and Disaster Recovery


August 25, 2012  11:04 PM

Amazon Also Announces ‘Cold Storage,’ Called Glacier

Sharon Fisher

Delayed-retrieval low-cost storage is suddenly cool.

Last week it was Facebook’s Sub-Zero. This week it’s Amazon’s Glacier.

In both cases, the vendors are offering low-cost storage for long-term archiving in return for customers being willing to wait several hours to retrieve their data — though, in Facebook’s case, the customer appears to be primarily itself, at least for the time being.

“To keep costs low, Amazon Glacier is optimized for data that is infrequently accessed and for which retrieval times of several hours are suitable,” says Amazon. “With Amazon Glacier, customers can reliably store large or small amounts of data for as little as $0.01 per gigabyte per month.”

A penny per gigabyte works out to $10 per terabyte (1,000 gigabytes) per month — compared with a one-time $79.99 for the cheapest 1-TB external drive in Amazon’s product search, while Dropbox’s 1-TB plan costs $795 annually, notes Law.com.
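
For the arithmetic-minded, here is a quick back-of-the-envelope comparison in Python based on the figures above; note that it covers raw storage only, since Glacier’s retrieval and data transfer fees (more on those below) are extra.

```python
# Storage-only comparison of the per-terabyte figures quoted above.
# Glacier retrieval and data-transfer fees are ignored here; they are extra.
GLACIER_PER_GB_MONTH = 0.01  # Amazon Glacier, dollars per gigabyte per month
TB_IN_GB = 1_000

glacier_per_month = GLACIER_PER_GB_MONTH * TB_IN_GB  # $10 per terabyte per month
glacier_per_year = glacier_per_month * 12            # $120 per terabyte per year

external_drive = 79.99     # cheapest 1-TB external drive, one-time purchase
dropbox_per_year = 795.00  # Dropbox 1-TB plan, annual, per Law.com

print(f"Glacier: ${glacier_per_month:.2f}/month, ${glacier_per_year:.2f}/year")
print(f"Drive:   ${external_drive:.2f} one time")
print(f"Dropbox: ${dropbox_per_year:.2f}/year")
```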

The service is intended not for the typical consumer, but for people who are already using the Amazon Web Services (AWS) cloud. Amazon describes typical use cases as offsite enterprise information archiving for regulatory purposes, archiving large volumes of data such as media or scientific data, digital preservation, and replacement of tape libraries.

“If you’re not an Iron Mountain customer, this product probably isn’t for you,” notes one online commenter who claimed to have worked on the product. “It wasn’t built to back up your family photos and music collection.”

The service isn’t intended to replace Amazon’s S3 storage service, but to supplement it, the company says. “Use Amazon S3 if you need low latency or frequent access to your data,” Amazon says. “Use Amazon Glacier if low storage cost is paramount, your data is rarely retrieved, and data retrieval times of several hours are acceptable.” In addition, Amazon S3 will introduce an option that will allow customers to move data between Amazon S3 and Amazon Glacier based on data lifecycle policies, the company says.
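
Amazon hadn’t yet released that lifecycle option at the time of writing, but for a rough sense of the shape such a policy might take, here is a minimal sketch using the lifecycle rules S3 eventually exposed, shown with the modern boto3 SDK (which postdates this post); the bucket name, prefix, rule ID, and 90-day threshold are all made up for illustration.

```python
# Hypothetical sketch: move objects under an "archive/" prefix from S3 to
# Glacier once they are 90 days old. Bucket, prefix, and rule ID are invented.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="example-archive-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-to-glacier-after-90-days",
                "Filter": {"Prefix": "archive/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```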

There is also some concern about the cost to retrieve data, particularly because the formula for calculating it is somewhat complicated.

While there is no limit to the total amount of data that can be stored in Amazon Glacier, individual archives are limited to a maximum size of 40 terabytes, and accounts are limited to up to 1,000 “vaults” of data, Amazon says.

While it doesn’t deal with the issue of data for software that no longer exists, the Glacier service could help users sidestep the “digital dark ages” problem of data stored on media that can no longer be read, notes GigaOm.

Can similar services for other cloud products, such as Microsoft’s Azure, or for consumers, be far behind?

August 19, 2012  10:36 PM

Facebook to Use ‘Hard Drive Thermostat’ in ‘Sub-Zero’ Backup Facility

Sharon Fisher

Remember when Facebook started designing its own servers and data center? And then its own disk drives?

Now it’s designing its own archival backup.

The story, broken by Robert McMillan at Wired, is that over the next six to nine months the company is working to design a storage archive system. Because the system stores a second copy of data and is intended to be used only for restores, it powers down its drives when not in use. Such technology could reduce the data center’s power use to one-third, according to a Facebook spokesman quoted by Wired.

More generally, Facebook has been working on what it calls the Open Compute Project, which basically means that it is designing new, minimalist hardware for standard functions that — due to the enormous scale of the company’s hardware — saves space, money, energy, and so on. The intention, once the design is complete, is to open source it and offer the designs to the industry.

It isn’t clear whether this method of archival storage is also going to be open-sourced, according to the Verge. However, Facebook has been talking about the notion of drives that spin down when not in use — what it calls a “hard drive thermostat” — for almost exactly a year in connection with the Open Compute project.

Data that is retained but rarely accessed is called “cold” storage, so the proposed building, part of the Facebook data center complex in Prineville, Ore., is nicknamed Sub-Zero, presumably after the line of high-end refrigerators. The company is also considering building a similar facility as part of its Forest City, N.C., data center.

It’s important with such systems to ensure that the data on them really isn’t used very much, because it can take up to 30 seconds for a disk to spin up from a full stop, and up to 15 seconds to come back up from a reduced-speed idle.


August 16, 2012  12:57 PM

Storage No Longer a Barrier to Ubiquitous Government Surveillance

Sharon Fisher

Big Brother image via Shutterstock

“Within the next few years an important threshold will be crossed: For the first time ever, it will become technologically and financially feasible for authoritarian governments to record nearly everything that is said or done within their borders – every phone conversation, electronic message, social media interaction, the movements of nearly every person and vehicle, and video from every street corner.”

This is due to a million-fold improvement in the ability to store information since 1984, according to John Villasenor, a professor of electrical engineering at the University of California, Los Angeles, and a senior fellow at the Brookings Institution.

As an example, it would cost just $50 for an entire year of location data to 15-foot accuracy for 1 million people, updated every five minutes, 24 hours a day, Villasenor said in a seminar earlier this year. Similarly, storing the audio from telephone calls made by an average person in the course of a year would require about 3.3 gigabytes and cost just 17 cents to store, a price that is expected to fall to 2 cents by 2015, he said.

Scott Shane, in the New York Times blog, called attention to Villasenor’s work, which was published last December in a Brookings Institution paper called Recording Everything: Digital Storage as an Enabler of Authoritarian Governments.

“In the 1960s, the National Security Agency used rail cars to store magnetic tapes containing audio recordings and other material that the agency had collected but had never managed to examine, said James Bamford, an author of three books on the agency,” reported Shane. “In those days, the agency used the I.B.M. 350 disk storage unit, bigger than a full-size refrigerator but with a capacity of 4.4 megabytes of data. Today, some flash drives that are small enough to put on a keychain hold a terabyte of data, about 227,000 times as much.”

Civil liberties organizations have increasingly been concerned about the amount of government surveillance that has been permitted, ranging from data that can be obtained from cellphones with no warrant required (though Rep. Markey (D-Mass.) has put forth a bill to limit that) to location data that the Sixth Circuit Court has ruled doesn’t require a warrant.

Moreover, Villasenor notes, individual people are providing a great deal of such data themselves, through the use of social media, mobile location apps, and so on.

But it is the rapidly declining cost of storage that makes such surveillance possible, Villasenor says. “Over the past three decades, storage costs have declined by a factor of 10 approximately every 4 years, reducing the per-gigabyte cost from approximately $85,000 (in 2011 dollars) in mid-1984 to about five cents today,” he writes. “In other words, storage costs have dropped by a factor of well over one million since 1984 [My note: an ironic benchmark to use]. Not surprisingly, that fundamentally changes the scale of what can be stored.”

These technological improvements put it within the reach of a country to store all the data it can obtain, Villasenor says. “For a country like Syria, which has a population of 15 million people over the age of 14, the current cost to purchase storage sufficient to hold one year’s worth of phone calls for the entire country would be about $2.5 million – a high number but certainly not beyond governmental reach,” he writes. “If historical cost trends continue, the annual cost in 2011 dollars to purchase enough storage for Syria’s government to record all calls made in that country will fall to about $250,000 by 2016 and to about $25,000 by 2020.”

While video data takes up much more space, limited video data — such as recording license plate numbers — is becoming increasingly prevalent in various states throughout the U.S., most recently in Minnesota. “Over the course of a full year, a system of 1,000 roadside license plate reading cameras each producing 1 megabit per second would generate image data that could be held in storage costing about $200,000,” Villasenor writes. “The resulting database of license plate numbers (as opposed to the images used to obtain the numbers) could be stored for a small fraction of this cost.”
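
For the curious, Villasenor’s numbers are easy to sanity-check. Here is a quick back-of-the-envelope sketch in Python; the five-cents-per-gigabyte price is his, while the 10-bytes-per-location-fix figure in the last check is my own assumption.

```python
# Rough sanity check of the storage-cost figures quoted above.
COST_PER_GB = 0.05  # Villasenor's ~5 cents per gigabyte (2011 dollars)

# Decline since mid-1984: $85,000 per GB then vs. $0.05 per GB now
print(85_000 / COST_PER_GB)            # 1,700,000 -- "well over one million"

# One person's phone audio for a year: about 3.3 GB
print(3.3 * COST_PER_GB)               # ~$0.17

# Syria: 15 million people over age 14, one year of calls each
print(15_000_000 * 3.3 * COST_PER_GB)  # ~$2.5 million

# 1,000 license plate cameras, each producing 1 megabit per second, for a year
seconds_per_year = 365 * 24 * 3600
gb_per_camera = 1e6 * seconds_per_year / 8 / 1e9   # bits/sec -> GB/year
print(1_000 * gb_per_camera * COST_PER_GB)         # ~$200,000

# Location data for 1 million people, every 5 minutes, all year
# (assumes ~10 bytes per fix, enough for a compact latitude/longitude pair)
fixes_per_year = 365 * 24 * 12
gb_of_fixes = 1_000_000 * fixes_per_year * 10 / 1e9
print(gb_of_fixes * COST_PER_GB)                   # ~$50
```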

The so-called “Peaceful Chongqing” universal video surveillance project in China — ostensibly for public safety — could cost as little as 25 cents per person per year by 2020, Villasenor writes.

Villasenor’s paper was focused on what he called authoritarian governments. Extrapolating costs to the U.S. was presumably left as an exercise for the reader.



July 31, 2012  10:47 AM

Don’t Have an Encryption Key? Go to Jail

Sharon Fisher

Christian Szell: Is it safe?… Is it safe?
Babe: You’re talking to me?
Christian Szell: Is it safe?
Babe: Is what safe?
Christian Szell: Is it safe?
Babe: I don’t know what you mean. I can’t tell you something’s safe or not, unless I know specifically what you’re talking about.

— Marathon Man

The topic of whether one is or is not compelled to produce the key to an encrypted disk drive in response to a law enforcement request has come up before (in fact, in the U.S. it has come up at least five times, according to a DefCon presentation by Electronic Frontier Foundation attorney Marcia Hofmann).

Now that, in at least some cases, U.S. judges are ruling that individuals need to surrender the passwords to decrypt their disks, an interesting side issue has also come up, particularly for those countries that already have laws requiring people to provide encryption keys to law enforcement: What happens if the person doesn’t have the key? Or says they don’t?

Such key disclosure laws are in effect in countries including India, Australia, and a variety of European nations.

But most of the attention has fallen on the U.K., where a law took effect in October 2007 under which people can be jailed for up to five years for refusing to release an encryption key. Ostensibly, it was passed for national security and to help prevent terrorism.

And the law has been used. In 2007, it was applied — against a group of animal rights activists. Then in 2009, the first person – who had been diagnosed as mentally ill – was jailed for nine months for refusing to turn over a key. In 2010, a teenager was sentenced to 16 weeks in prison under a similar charge.

Earlier this month, blogger Rick Falkvinge called attention to the possibility that, should U.K. law enforcement decide that a file of random data, for example, was actually an encrypted file, one could be jailed for refusing to provide the nonexistent key.

This is not the first time the subject has been raised – Cisco blogged about a similar topic in 2009 – but with the increased focus on encryption and decryption, typically invoking the spectre of child pornography (according to Hofmann, four of the five U.S. encryption key cases involved alleged child pornography), there is new concern about the issue and about how to deal with being forced to prove a negative: that you don’t have a key that law enforcement insists you do.

“So imagine your reaction when the police confiscate your entire collection of vacation photos, claim that your vacation photos contain hidden encrypted messages (which they don’t), and sends you off to jail for five years for being unable to supply the decryption key?” writes Falkvinge.


July 29, 2012  3:23 PM

Black Hat Vendors Resort to Booth Babes #blackhat

Sharon Fisher

“Hey, did you see the girls dressed up in the BDSM outfits in the exhibit room?”

And people wonder why women don’t want to get involved in the computer industry.

To be fair, a number of men, as well as women, objected to the booth babes in the exhibit room at Black Hat this week in Las Vegas. In fact, the person who told me about them said, “What do they think this is, the 1980s?”

But really? We’re still having this discussion? In 2012?

To be sure, this is not criticizing the young women who choose to make their living this way. Nor is it ignoring the Las Vegas context of the conference, where women in similar costumes, or less, could be seen in just about any casino. The criticism is of the vendors at a professional conference who believe that this is the best way to attract attendees to their booths.

(Interestingly, the more counter-culture DefCon, also held that week and also held in Las Vegas, apparently didn’t feel the need to do this.)

Women security experts attending a meeting of the Executive Women’s Forum, a Scottsdale, Ariz.-based organization of more than 750 female security professionals in the computer industry, generally rolled their eyes about the two vendors that chose to promote their products this way. Several of them expressed surprise that a major vendor such as RSA would feel the need to resort to such tactics, noting that it’s typically expected more from smaller vendors.

Indeed, the other vendor featuring scantily clad women in its booth was a smaller vendor, SecureNinja – but on the other hand, to judge from the swag people were carrying during the conference, attendees seemed primarily interested in the toy ninja swords the company was also handing out. Which only goes to show that vendors don’t have to resort to scantily clad women to attract visitors, male or female.

This is not to pick on Black Hat in particular – this is apparently endemic among security shows. And what annoyed some EWF members was not just the attire of the booth babes at these shows, but the fact that, typically, the women aren’t actually capable of discussing the companies’ products – with one of them reporting that she took it upon herself at one show to teach the booth staff what public key infrastructure meant.

EWF members also pointed out what they saw as a more egregious offense: that of all the speakers in the conference, only three were women. Moreover, some male speakers made jokes about women in their presentations, such as the one who pretended to be confused between new Yahoo! CEO Marissa Mayer and Sports Illustrated swimsuit model Marisa Miller.

Ironically, one of the other speakers — Mark Weatherford, Deputy Under Secretary for Cybersecurity for the National Protection and Programs Directorate (NPPD), who was representing the Department of Homeland Security — mentioned in his presentation, “We have a problem. There are not enough smart people in the public or private sector to help us defend our country.”

If this is the case, can we really afford to alienate half of them?


July 22, 2012  8:48 AM

EMC, VMware Move Executives Around

Sharon Fisher

EMC isn’t known for having drama-filled management shakeups, which is why the most exciting part of the recent executive shuffle between it and VMware was when word leaked that VMware CEO Paul Maritz was on the way out before it became clear where he was going — to EMC as “top strategist.”

Much handwringing ensued at first: VMware stock dropped 3%, some analysts criticized the company for taking its eye off the virtualization ball, and some industry watchers speculated he was being “pushed out” and might head an EMC cloud spinoff.

Now that Maritz’ future at EMC is settled and VMware is headed by EMC chief operating officer Pat Gelsinger — himself often mentioned as a potential future EMC head — people are back to talking about how well EMC and VMware are doing, and stock for both companies went up.

In a world where CEOs resign 20 minutes after their appointment so they can collect a $44 million severance payment, you can’t blame people for getting excited about EMC and VMware, where the most exciting aspect is whether CEO Joe Tucci is going to retire in 2012 or 2013. (He’s now planning to retire in 2013. Unless he stays another year.)

Maritz is a former Microsoft executive with a lovely South African accent whose biggest claim to fame is coming up with the phrase “eating your own dog food” for companies that use their own products, while Gelsinger is a former Intel executive who was once said to be being groomed for that company’s CEO job. As always, EMC has a deep bench of qualified, non-drama-king executives.

VMware and EMC have an interesting relationship. VMware is 79% owned by EMC (and accounts for 60% of EMC’s own value), but operates fairly independently for all that. While some observers speculated the move may mean further grooming for Maritz as a potential EMC CEO, and that Gelsinger’s move to VMware meant he was no longer in the running, the Wall Street Journal and others seemed to believe the contest for the EMC CEO slot would primarily be between Gelsinger and CFO David Goulden, who was named COO.

Whether either Maritz or Gelsinger is expecting a child didn’t come up — apparently that’s only an issue for female tech executives — though the Wall Street Journal did mention that they were each planning to stay in the Bay Area to be close to grandchildren.


July 17, 2012  7:59 AM

Maine Politician du Houx Accused of Behaving Badly With Data Storage

Sharon Fisher

We’re used to hearing stories about politicians erasing things from hard disk drives that they shouldn’t have. Now we’re hearing about a politician allegedly putting things onto a hard disk drive that he shouldn’t have.

Maine legislator Rep. Alex Cornell du Houx (D-Brunswick) was the subject of a temporary protection-from-abuse order filed this spring by Rep. Erin Herbig (D-Belfast) after their relationship ended, according to the Bangor Daily News. The two later made a private agreement, which was not filed in court but was said to be legally binding, covering the relationship and their future contact.

“In addition to having no contact with Herbig, the private agreement, which was obtained by the Bangor Daily News, requires that Cornell du Houx pay all of Herbig’s $9,000 in legal fees and turn over any computer hard drives and data storage devices with pictures of Herbig or ‘any other women sleeping or in a state of undress’ to Herbig’s lawyer.”

Buh?

After their relationship ended, du Houx allegedly “stalked, harassed and threatened her,” including taking more than 100 photos and videos of Herbig sleeping.

Without addressing the veracity of the charges, let’s look at the data storage issues involved. (The private agreement doesn’t appear to be available online; excerpts that the paper posted didn’t cover these aspects.)

  • The agreement would appear to mention only hard drives and data storage devices. Apparently, if du Houx had any pictures of sleeping or undressed women stored in the cloud, those are fair game. (Not to mention, sleeping or undressed men.)
  • As Bangor Daily News letter writer Sid Duncan legitimately pointed out (in an otherwise kind of gross and disgusting letter to the editor on July 16 that the paper has since retracted), what if some of the data storage devices are actually state-owned and contain state documents?
  • What if the data storage devices contain communication with du Houx’ attorney? Passing them to Herbig’s attorney could violate attorney-client privilege.
  • What is Herbig’s attorney intending to do with the data storage devices? Is he a computer forensics expert who could legitimately retrieve any such images, even if deleted, from the devices? Is he intending to do so? Is he going on a fishing expedition to see what other charges could be filed – such as, conceivably, child pornography or stalking? Has he obtained a warrant for this? And what is he planning to do with the images and data storage devices? Is he planning to destroy these images? Or, as Duncan implied, will he be, um, “perusing” them? If he does destroy those images, could he be accused of destroying evidence (as in 2007’s United States vs. Philip D. Russell)? But if there is child porn on any of the devices, could the attorney himself then be charged with possession of child pornography?
  • What about the privacy and rights of any other sleeping or undressed women who might also have images – obtained legitimately or otherwise – stored on du Houx’ devices? Especially since he appears to be a professional photographer? Or, in fact, any other personal pictures he might have of other people? Or, for that matter, of himself?
  • Will du Houx get his data storage devices and data back, including all the data that isn’t of sleeping or undressed women?



July 11, 2012  8:05 PM

Gartner Updates Backup Magic Quadrant, Sort Of

Sharon Fisher

If you’ve been holding your breath waiting for the new Gartner Enterprise Backup/Recovery Magic Quadrant to come out so you could see how vendors had moved around since last year, you can let go. Other than FalconStor moving from Visionary to Niche, and several players in the Niche quadrant changing around primarily due to acquisitions, there wasn’t much change. CommVault, EMC, Symantec, and IBM are still in the Leaders quadrant; HP (now Autonomy, an HP company) and CA are still in the Challengers quadrant; and NetApp is now all alone in the Visionary quadrant. The Niche Players in this Magic Quadrant are Acronis; Asigra; EVault, a Seagate Company; FalconStor Software; Quest Software; and Syncsort.

However, Gartner did change some of its strategic planning assumptions (that is, predictions) between last year and this. While it still believes that by 2014, 80% of the market will choose advanced, disk-based appliances and backup software-only solutions over distributed VTLs, it now believes that one-third, not 30%, of organizations will change backup vendors due to frustration over cost, complexity and/or capability, and it will be by 2016, not 2014.

It also now believes that by 2015, at least 25% of large enterprises will have given up on conventional backup/recovery software, and will use snapshot and replication techniques instead, and that by the end of 2016, at least 45% of large enterprises, up from 22% at year-end 2011, will have eliminated tape for operational recovery.

And it is no longer predicting that, by 2014, deduplication will cease to be available as a stand-alone product and will instead become a feature of broader data storage and data management solutions; while Gartner didn’t say so explicitly in this report, product descriptions seemed to indicate this has largely taken place already.

Other interesting tidbits from the report include:

  • Gartner end-user inquiry call volume regarding backup has been rising at about 20% each year for the past four years.
  • “The rising frustration with backup implies that the data protection approaches of the past may no longer suffice in meeting current, much less future, recovery requirements,” Gartner notes. “As such, companies are willing to adopt new technologies and products from new vendors, and have shown an increased willingness to switch backup/recovery providers to better meet their increasing service levels.”
  • Companies are increasingly considering cloud-based recovery systems, predominantly for midsize enterprise servers and branch-office and desktop/laptop data.
  • Symantec currently holds 34.1% of the market, a share that has decreased over the past five years. IBM and EMC have 17.3% and 17.0% market share, respectively.
  • No other vendor has more than a 7% market share.
  • In 2011, CommVault and EMC increased their market shares.
  • Along with Symantec, CA Technologies, IBM and Quest Software slid slightly in market share in 2011.

Gartner also identified five trends that it believes will emerge over the next several years:

  • Re-expanding the number of backup solutions and technologies
  • Backup application switching
  • Decreasing backup data retention
  • Backup modernization
  • Deployment of new technologies and vendors


June 30, 2012  3:24 PM

Cloud Storage Proves Not So Resilient After All

Sharon Fisher

Oh, hey, just put all your stuff in the cloud. It has IT people to watch it and take care of it, and if there’s any sort of problem, it’s replicated in other places so you’ll still have access to it whenever you want.

Remember that?

It turns out, not so much.

Major Internet services such as Netflix, Pinterest, and Instagram were taken offline on Friday night due to thunderstorms that took out power to the Amazon Web Services site in Virginia.

“Amazon’s Cloud services status page was full of power-related error messages,” wrote MSNBC’s Bob Sullivan. “Amazon’s ElastiCache, for example, indicated that starting at 8:43 p.m., the service was ‘affected by a power event.’ At 9:25 p.m., this message was posted: ‘We can confirm that a large number of cache clusters are impaired. We are actively working on recovering them.’”

But, oh, Sullivan continued, this wasn’t really Amazon’s fault, because the storm was just so very bad. Why Amazon didn’t have backups or replicated copies elsewhere on the network, he didn’t say. (Apparently the problem may have been a routing issue.)

“Outages like this morning’s are a reminder of how fragile, still, our digital architectures actually are,” intoned The Atlantic‘s Megan Garber. “As much as we try to bolster them against the elements, they are made of sand, not stone. Buildings can be brought down by storms; but so, today reminds us, can their digital counterparts. Even the structures that lack structures can be torn by nature’s whims. That is, in its way, terrifying. And yet — here’s the other sliver — it is also, just a tiny bit, reassuring. No matter how advanced we get, today reminds us, nature will always be one step ahead.”

This is probably not a great consolation to those companies that have moved their operations to the cloud because they were assured it would still be there in a disaster. And this is a disaster? A thunderstorm (albeit one that has caused at least a dozen deaths)? Are those companies feeling “reassured” today by discovering that their disaster recovery systems are, in fact, vulnerable to an outage that might be in a completely different part of the country?

What happens if there’s an earthquake, hurricane, or some more severe natural disaster? If the Red Cross or FEMA loses its connectivity because it depended on the cloud, will its managers philosophically fold their hands and talk about how in the great scheme of things this shows just how little we all are? Will the people asking to be saved or helped see it this way?

Instead, hopefully Amazon and other cloud providers will take this as a wake-up call before hurricane season gets going, and ensure that the virtual cloud can stand up to the real thing.



June 28, 2012  7:04 PM

Google Docs Adds Offline Support

Sharon Fisher

In a move that has been expected for the past month — and desired for much longer than that — Google has made it possible for Google Docs users to edit documents offline and then synchronize them when the user logs back in again. Google called offline editing “one of its most requested features” for Google Drive, which the company said has 10 million users.

At present, the function works only with Google Docs — that is to say, word processing documents in Google Drive — but it is expected to be available for spreadsheets and presentations at some point in the future, Google said in its instructions. It is also available only in the Chrome browser, and Google didn’t say whether it expected to make the feature available for other browsers.

Pundits are claiming this functionality will negatively affect other consumer cloud storage systems, such as Dropbox. “Google Kneecapped Dropbox,” proclaimed Business Insider.

One other point worth noting is that changes made to the online file by collaborators while the user is offline take precedence over whatever changes the offline user makes.

“If an online collaborator deletes the text you edit while offline, their changes will override yours. If a collaborator deletes the document you’re editing offline, your changes will be lost when you come back online because the document will no longer exist. Try to use offline editing for documents that you own and that won’t be deleted without your knowledge.”

Well, yeah, but that sort of defeats the purpose of using Google Drive for collaboration in the first place, doesn’t it?

Another point — only enable Google Drive on your own computers.

“Enabling offline access on public or shared computers can put your data at risk, since others may be able to view your synced Google documents and spreadsheets.”

Which also sort of seems to defeat the purpose — wouldn’t a great use case for this feature be “I’m traveling without my computer or it broke, and so I’m using somebody else’s”? Might be nice to have a “Mr. Phelps” feature that automatically deleted itself after syncing a file, or on command.

Needless to say, features that require the Internet won’t be available offline: sharing, publishing, reporting a problem, and so on. Surprisingly, however, neither is inserting an image.

Google made the announcement at its sold-out I/O conference in San Francisco.


