Yottabytes: Storage and Disaster Recovery

Aug 25 2012   11:04PM GMT

Amazon Also Announces ‘Cold Storage,’ Called Glacier

Sharon Fisher

Delayed-retrieval low-cost storage is suddenly cool.

Last week it was Facebook’s Sub-Zero. This week it’s Amazon’s Glacier.

In both cases, the vendors are offering low-cost storage for long-term archiving in return for customers being willing to wait several hours to retrieve their data — though, in Facebook’s case, the customer appears to be primarily itself, at least for the time being.

“To keep costs low, Amazon Glacier is optimized for data that is infrequently accessed and for which retrieval times of several hours are suitable,” says Amazon. “With Amazon Glacier, customers can reliably store large or small amounts of data for as little as $0.01 per gigabyte per month.”

A penny per gigabyte works out to $10 per terabyte (1,000 gigabytes) per month. By comparison, the cheapest 1-TB external drive in Amazon’s product search costs $79.99, and Dropbox’s 1-TB plan costs $795 annually, notes Law.com.
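As a rough back-of-the-envelope comparison using only the figures quoted above (and remembering that the external drive is a one-time purchase while Glacier and Dropbox are recurring charges), the arithmetic looks something like this:

```python
# Back-of-the-envelope comparison using the figures quoted in the article.
# Glacier storage is billed per GB per month; the external drive is a
# one-time purchase; Dropbox's 1-TB plan is billed annually.

GLACIER_PER_GB_MONTH = 0.01   # $0.01 per gigabyte per month
TB_IN_GB = 1000               # article uses decimal terabytes

glacier_monthly = GLACIER_PER_GB_MONTH * TB_IN_GB   # $10.00 per month
glacier_yearly = glacier_monthly * 12               # $120.00 per year

external_drive = 79.99        # one-time cost of the cheapest 1-TB drive
dropbox_yearly = 795.00       # Dropbox 1-TB plan, per year

print(f"Glacier, 1 TB:        ${glacier_monthly:.2f}/month (${glacier_yearly:.2f}/year)")
print(f"External drive, 1 TB: ${external_drive:.2f} one-time")
print(f"Dropbox, 1 TB:        ${dropbox_yearly:.2f}/year")
```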

The service is intended not for the typical consumer, but for people who are already using the Amazon Web Services (AWS) cloud. Amazon describes typical use cases as offsite enterprise information archiving for regulatory purposes, archiving large volumes of data such as media or scientific data, digital preservation, or replacement of tape libraries.

“If you’re not an Iron Mountain customer, this product probably isn’t for you,” notes one online commenter who claimed to have worked on the product. “It wasn’t built to back up your family photos and music collection.”

The service isn’t intended to replace Amazon’s S3 storage service, but to supplement it, the company says. “Use Amazon S3 if you need low latency or frequent access to your data,” Amazon says. “Use Amazon Glacier if low storage cost is paramount, your data is rarely retrieved, and data retrieval times of several hours are acceptable.” In addition, Amazon S3 will introduce an option allowing customers to move data between Amazon S3 and Amazon Glacier based on data lifecycle policies, the company says.
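Amazon hadn’t published the details of that lifecycle integration at the time of the announcement, but as a rough sketch of what such a policy might look like, here’s a hypothetical rule written against the later boto3 SDK; the bucket name, prefix, rule ID, and 90-day threshold are all illustrative assumptions, not anything Amazon has specified:

```python
# Hypothetical sketch only: a lifecycle rule that transitions objects
# under a given prefix to the GLACIER storage class after 90 days.
# Bucket name, prefix, rule ID, and the 90-day threshold are assumptions.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-archive-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-old-logs-to-glacier",
                "Filter": {"Prefix": "logs/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 90, "StorageClass": "GLACIER"}
                ],
            }
        ]
    },
)
```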

There is also some concern about the cost to retrieve data, particularly because the formula for calculating it is somewhat complicated.

While there is no limit to the total amount of data that can be stored in Amazon Glacier, individual archives are limited to a maximum size of 40 terabytes, and an account can create up to 1,000 “vaults” of data, Amazon says.
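For a sense of how vaults, archives, and the multi-hour retrieval model fit together, here’s a rough sketch using the later boto3 SDK; the vault name, file path, and polling interval are placeholders, not anything prescribed by Amazon:

```python
# Rough sketch of the Glacier workflow: create a vault, upload an
# archive, then request a retrieval job and poll until it completes.
# Vault name, file path, and polling interval are illustrative only;
# retrieval jobs typically take several hours to finish.
import time
import boto3

glacier = boto3.client("glacier")

# Create a vault (a container for archives; accounts are limited in
# how many vaults they can create).
glacier.create_vault(vaultName="example-vault")

# Upload a single archive (individual archives are capped at 40 TB).
with open("backup.tar.gz", "rb") as f:
    archive = glacier.upload_archive(vaultName="example-vault", body=f)
archive_id = archive["archiveId"]

# Retrieval is asynchronous: initiate a job, then wait for it.
job = glacier.initiate_job(
    vaultName="example-vault",
    jobParameters={"Type": "archive-retrieval", "ArchiveId": archive_id},
)
job_id = job["jobId"]

# Poll until the job completes (in practice this takes hours).
while not glacier.describe_job(vaultName="example-vault", jobId=job_id)["Completed"]:
    time.sleep(900)  # check every 15 minutes

output = glacier.get_job_output(vaultName="example-vault", jobId=job_id)
data = output["body"].read()
```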

While it doesn’t deal with the issue of data for software that no longer exists, the Glacier service could help users circumvent the problem of the “digital dark ages” of data being stored in a format that is no longer readable, notes GigaOm.

Can similar services for other cloud products, such as Microsoft’s Azure, or for consumers, be far behind?
