Amazon now supports data export from its S3 storage cloud onto customers’ removable hard drives.
Amazon first opened up this “sneakernet” for import/upload to the Amazon cloud earlier this spring, allowing customers with large data sets to send the data to Amazon on removable media rather than trying to migrate the data over an Internet connection. This most recent announcement means users can extract data from the cloud using this method, too.
At the time of the first announcement, Amazon bloggers referenced the quote that immediately jumped to my mind when reading about the export feature: “Never underestimate the bandwidth of a station wagon full of tapes hurtling down the highway.”
Amazon is far from the first or only cloud storage vendor to use seeding devices to get large data sets into the cloud rather than trying to squish terabytes through the average broadband Internet connection. Indeed, this network bottleneck is considered one of the biggest barriers to cloud computing adoption to date, and cloud backup vendors including EMC’s Mozy already send out seeding devices to upload or restore terabytes of data.
Companies such as NetEx are also offering software that promises to cut down on bandwidth between service providers and consumers downloading large, say, video files from centralized data centers. Others, including Cleversafe, are proposing to split data into chunks and among multiple sites to cut down on bandwidth and preserve data security.
So far, however, for the largest data sets — as this Amazon announcement demonstrates — nobody’s quite beaten the highway.
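The math behind that quip is easy to check. A quick back-of-the-envelope comparison in Python, with illustrative assumptions (a 1 TB drive, overnight shipping, and a 10 Mbps uplink are my numbers, not Amazon's):

```python
# Rough comparison: shipping a drive vs. uploading over broadband.
# All numbers are illustrative assumptions, not vendor figures.

drive_tb = 1.0                      # capacity shipped, in terabytes
shipping_hours = 24.0               # overnight courier
uplink_mbps = 10.0                  # assumed broadband uplink

drive_bits = drive_tb * 1e12 * 8    # terabytes -> bits
ship_bps = drive_bits / (shipping_hours * 3600)
upload_hours = drive_bits / (uplink_mbps * 1e6) / 3600

print(f"Shipping: {ship_bps / 1e6:.0f} Mbps effective")
print(f"Upload:   {upload_hours / 24:.1f} days at {uplink_mbps:.0f} Mbps")
```

Even with a full day in transit, the courier delivers an effective throughput several times the assumed uplink, and the gap only widens as the data set grows.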
Akorri’s cross-domain reporting software, BalancePoint, is getting deeper into VMware analytics with version 3.0, which also contains some storage updates including support for more vendors and SAN switch performance analysis.
The version launched today includes virtual-machine level granularity for identifying resource contention, CPU and memory utilization, and CPU efficiency within the physical host; previously, Akorri performed analytics by physical server rather than by VM.
On the storage front, this new release adds support for 3PAR storage arrays to existing support for HDS and NetApp. BalancePoint can also now map storage switch infrastructures made up of Brocade and Cisco switches, analyze storage switch performance, and identify overutilized and underutilized SAN switch ports.
The product most comparable to Akorri's that storage users would be familiar with is NetApp's Onaro, software that is likewise deployed without host agents and also offers SAN switch performance analysis, as well as mapping virtual-machine level relationships to the underlying storage infrastructure. However, Akorri includes more server-specific analytics, while Onaro focuses on the storage infrastructure. Another storage-focused monitoring tool, Symantec's CommandCentral Storage, also offers virtual-machine level analytics, but requires host agents. CommandCentral Storage can also be rolled up into Symantec's overall data center management framework for cross-domain support.
In recent years, widespread VMware deployment has called for better analytics in IT to help smooth out performance bottlenecks and resource contention within the physical infrastructure. Until recently, however, deployment of storage resource management (SRM) tools has been sluggish, although analyst research this year suggests that the economic downturn has more users looking to analyze and improve existing assets for better storage efficiency using these tools. This research also suggests that storage teams are increasingly cooperating with other IT departments, especially networking, to better optimize data center performance.
Open source backup software vendor Zmanda Inc., among the first to offer customers a direct link between its backup software and cloud storage, is opening up its API for connecting backup software with cloud backup service providers to other vendors, including competitors.
CEO Chander Kant says Zmanda is sponsoring a new open source project called ZCloud, which offers an API to show how backup software connects to storage cloud providers to avoid duplicating work as more backup vendors offer the option. “Today every backup software has to talk a different language,” Kant said. “This will get rid of those idiosyncrasies and make interoperability easier.”
The subject of standard cloud APIs and interoperability is a hot one in the still-nascent cloud computing market. While some are already calling for or developing standardized cloud interfaces, others say it’s too early to establish industry-wide standards without quashing differentiation.
Kant points out that an API specifically for backup tools isn’t the same as industry-wide, homogenizing standardization. “This is how standardization is actually going to happen — not an overall set of specs, but specific ones based on specific use cases,” he said.
He added that while the API specifies common aspects of connecting backup to the cloud, there’s no reason a backup software vendor can’t add its own differentiators to the integration. “The API allows for discovery of the underlying storage clouds, keeping developers from having to repeat basic stuff,” he said. At this early stage, he says “most people are first trying to get to a basic level of functionality.”
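Zmanda hadn't published the full spec at press time, but the idea of a single connector interface that every backup product codes to once is straightforward to sketch. Purely as an illustration (the class and method names below are my own hypothetical stand-ins, not part of ZCloud), such an interface might look like:

```python
# Hypothetical sketch of a common backup-to-cloud connector interface.
# All names and methods here are illustrative assumptions, not the
# actual ZCloud spec.
from abc import ABC, abstractmethod

class CloudBackupTarget(ABC):
    """One implementation per storage cloud; backup software codes to this."""

    @abstractmethod
    def discover(self) -> dict:
        """Report the cloud's capabilities (max object size, regions, etc.)."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None:
        """Store one backup object under the given key."""

    @abstractmethod
    def get(self, key: str) -> bytes:
        """Retrieve a backup object for restore."""

    @abstractmethod
    def list(self, prefix: str = "") -> list:
        """Enumerate stored objects, e.g. for catalog rebuilds."""

class InMemoryTarget(CloudBackupTarget):
    """Toy in-memory implementation, for illustration only."""
    def __init__(self):
        self._store = {}
    def discover(self):
        return {"max_object_bytes": 5 * 2**30}
    def put(self, key, data):
        self._store[key] = data
    def get(self, key):
        return self._store[key]
    def list(self, prefix=""):
        return [k for k in self._store if k.startswith(prefix)]

target = InMemoryTarget()
target.put("backup/2009-07-01/full.tar", b"...")
print(target.list("backup/"))
```

The point Kant makes maps directly onto this shape: `discover()` handles the "basic stuff" common to every cloud, while a vendor remains free to layer its own differentiators on top of the shared interface.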
Admittedly late to the data deduplication game, Hewlett-Packard Co. is brewing new dedupe offerings to compete with the market’s new 800-pound gorilla — EMC/Data Domain.
“We welcome the competition and the fact that our competitors have shown that owning IP in this space is important,” said Kyle Fitze, HP’s marketing director for storage platforms. HP partners with Sepaton for high-end VTLs and Ocarina for primary storage data reduction, but also develops deduplication software for its entry-level disk backup devices.
Fitze was mum on whether the relationship with Sepaton will change given EMC’s $2.1 billion buyout of Data Domain, but HP does have a track record of acquiring partners after a few years if things go well. Fitze said HP is focused on the higher-volume SMB market with its dedupe products, and is seeing more demand for its midrange/entry-level D2D products. “One of the things we’re seeing develop is very large enterprise accounts with multiple sites consolidating their backup operations, and there’s been a lot of interest in the D2D portfolio.”
Currently, D2D cannot be used to replicate to the higher-end Sepaton-based VLS product. One HP shop, the Mohegan Tribe in Uncasville, Conn., is a midsized operation with about 8 TB capacity on its EVA 4100 primary SAN, but chose VLS over D2D because of Sepaton’s content-aware dedupe. “We felt content-aware deduplication was more efficient,” said David Shoup, technology manager for the tribe.
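For readers new to the technology, the basic mechanics of block-level deduplication are simple to sketch: store each unique chunk once, keyed by its hash, and keep a recipe of hashes to reassemble the stream. Content-aware approaches like Sepaton's differ mainly in how chunk boundaries are chosen. A minimal fixed-block illustration (not any vendor's actual algorithm):

```python
# Minimal illustration of block-level deduplication: each unique chunk
# is stored once, keyed by its hash. Deliberately simplified; real
# products use variable-size or content-aware chunking.
import hashlib

def dedupe(data, chunk_size=4096):
    store = {}                      # hash -> chunk (stored once)
    recipe = []                     # ordered hashes to rebuild the stream
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        h = hashlib.sha256(chunk).hexdigest()
        store.setdefault(h, chunk)  # keep only the first copy
        recipe.append(h)
    return store, recipe

data = b"A" * 8192 + b"B" * 4096 + b"A" * 4096   # repeated content
store, recipe = dedupe(data)
print(len(recipe), "chunks referenced,", len(store), "stored")
```

Restoring is just joining the stored chunks back in recipe order; the ratio of chunks referenced to chunks stored is the deduplication ratio.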
RenewData announced today it’s bought the privately-held Digital Mandate for an undisclosed sum, and plans to add Digital Mandate’s Vestigate legal review software to its eDiscovery software as a service (SaaS) offerings.
RenewData CEO Steven Horan says law firms and large corporations use Digital Mandate software to do “first-pass review” of electronically stored information (ESI) to determine its relevance to litigation. The application culls data sets before they are submitted for more granular legal review. “If I’m a customer, I don’t want to invest a million dollars just to know if I have a problem,” he said. Horan maintains Vestigate can provide about 85% validity of the final data set, with the goal of cutting down legal fees that would be incurred by submitting the full data set for granular review.
RenewData provides services for planning, preservation and collection, processing, review, and production of electronic evidence, risk management, and data archiving. Horan said Vestigate was also offered as a service, but RenewData will use it to offer something that can be put “behind the firewall” in customer environments where compliance or security concerns make a service less appealing.
Enterprise Strategy Group senior analyst Brian Babineau said RenewData, which had been partnering with Attenex (now owned by FTI), had to make this move in response to wider industry consolidation in eDiscovery. “This is a simple e-discovery market roll up,” he said. “[RenewData] had a legal service provider business by using Attenex software. However, when FTI bought Attenex, FTI also has a legal service provider business, [and] Renew was then using a direct competitor’s software.”
Babineau called the merger with Digital Mandate a “good hedge, but it doesn’t propel them forward” in the market.
Socha Consulting founder George Socha said he’d had a “limited degree of exposure” to Digital Mandate’s product, but RenewData’s approach of bringing more parts of the e-discovery process in house is “consistent with what I hear consumers and law firms say they want – one throat to throttle. RenewData has expanded further across the eDiscovery spectrum.”
Consumer appliance vendor Storage Appliance Corporation, maker of Clickfree-branded portable backup drives, launched a new connector today that will allow consumers to use their iPod or iPhone as the storage device for PC backups.
As an iPod user myself, I find this a pretty cool concept, especially since it requires no software — according to the website, you plug it into your computer, plug the iPhone or iPod in, and it starts syncing automatically. The iPod or iPhone can be used in raw disk mode for a similar effect, but in the consumer world, people are generally willing to pay for convenience. This also makes it possible to use the free capacity of the device for data while retaining a music and video collection on the rest of the device simultaneously. The maximum capacity of an iPhone 3G currently is 32 GB; iPod Classics are available in up to 120 GB sizes.
The product, called the Transformer, is available now for $49.99. A device that syncs with external USB hard drives (for truly tiered home backup, I would presume) for $89.99 will be available next month.
The press release also contained this intriguing tidbit:
The new Transformers will also allow customers to retrieve music collections back off their iPhone or iPod. Instead of “orphaning” the content, the new Transformer allows you to retrieve your content quickly and easily.
Apple’s Digital Rights Management encoding makes it impossible to do this with either device by default, in order to prevent unauthorized sharing of copyrighted content among users with a sneaker-net of iPods and iPhones as ‘go-between’ devices. A Storage Appliance Corp. spokesperson said that files protected by Apple’s DRM would need to be re-activated in iTunes after transfer, meaning unauthorized transferred files would be unplayable.
Enterprise storage managers could still find this a good forensics tool for accessing the content on a portable device that might otherwise be automatically wiped when connected to a PC with iTunes; I can also picture it being used at very small companies or home offices for mobile data backups and file transfers.
In general, though the iPhone seems headed for a business role alongside the BlackBerry, the real Holy Grail for enterprise IT will be the ability to virtually provision applications to mobile devices from the cloud on-demand. Citrix has already demonstrated a form of this; Symantec has also talked in the past about offering such capabilities.
SGI has launched its first enterprise data storage product since Rackable bought Silicon Graphics Inc. for $42.5 million and changed its name to SGI following the acquisition.
The new product, CloudRack X2, is a scaled-down version of its CloudRack C2, which is shipped already racked. CloudRack X2 holds up to nine TR2000 trays, which contain CPU as well as up to eight 3.5-inch SATA hard disk drives. C2, by contrast, holds up to 20 drives. X2 can be deployed as a cabinet or slotted into an existing rack.
SGI says it can customize the hardware it ships in either product, offering a range of Intel Xeon and AMD processors and user-configurable CPU/memory/capacity combinations. The goal is to sell this product into Internet service providers as well as HPC and digital graphics shops such as post-production editing. SGI is the latest in a list of HPC scale-out vendors to bring their high-end products downmarket, although unlike Isilon and BlueArc, SGI is not targeting mainstream enterprises with new software.
“The hardware design has been tailored for cloud computing,” SGI senior product marketing director Geoffrey Noer said. “Most of those service providers have their own custom software — one of our big value propositions is that we can tune our hardware to their specific apps.”
Ah, there’s that word again: “cloud.” We reported Thursday on some new plans brewing behind the scenes at NetApp and Emulex in the cloud storage space, as well as the commentary from several industry analysts about how, well, cloudy this buzzword has become. Just look at these three companies alone — one, NetApp, is focused on delivering IT as a service within enterprise data centers through a virtual infrastructure; another, Emulex, is focusing on connectivity between internal enterprise infrastructure and external service provider data centers; a third, SGI, is offering little in the way of virtualization or software integration and cites the hardware design in defining its system as suitable for the cloud.
“It’s like trying to bag smoke right now,” Illuminata analyst John Webster said of pinning down a technical definition for the “cloud” buzzword.
However, one thing is abundantly clear: whatever the cloud turns out to be, vendors are gung-ho about it leading into fall. Expect further developments, particularly from NetApp. Then we’ll have to see whether end users follow.
Emerging data archiving software player Nayatek released a new version of its Datosphere software this week, adding support for archiving Windows file systems to its existing support for email, Simple Mail Transfer Protocol (SMTP), instant messaging and unified communications. The company’s goal is to build what it calls a “data neutral” archive through a modular design that features connectors for each type of data supported.
Nayatek’s file archiving offers federated search and some e-discovery/custodian role features, although VP of product management Scott Lehmann said the company is still working on legal hold and SharePoint. Datosphere can stub or copy a file to the archive while deleting it completely from primary storage. File archiving policies are available according to age, size or document type. End users can access and view archived files and emails through an Outlook folder or web client, and perform federated searches across all data types from one interface.
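As a rough sketch of how policy criteria like those might combine, a file typically qualifies for archiving when it matches any configured rule. The function and parameter names below are hypothetical illustrations, not Nayatek's actual API:

```python
# Illustrative sketch of file-archiving policy evaluation by age, size,
# and document type -- the criteria Datosphere exposes. All names and
# thresholds here are hypothetical, not Nayatek's actual API.
import os
import time

def should_archive(path, max_age_days=365, min_size_bytes=10 * 2**20,
                   doc_types=(".doc", ".pdf", ".xls")):
    """Return True if a file matches any archiving criterion."""
    st = os.stat(path)
    age_days = (time.time() - st.st_mtime) / 86400
    _, ext = os.path.splitext(path)
    return (age_days > max_age_days        # older than the age policy
            or st.st_size > min_size_bytes # larger than the size policy
            or ext.lower() in doc_types)   # matches a document-type policy
```

A file flagged by this kind of check would then either be stubbed (leaving a pointer on primary storage) or copied and deleted outright, per the description above.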
Datosphere comes with a Redundant Array of Independent Nodes (RAIN) architecture, in a standard HA (dual) version and an enterprise n-way version. The software itself ships within a virtual appliance. According to Lehmann, Datosphere remains Windows-focused for email and files so far, though Unix support is planned. Similarly, single instancing in the Datosphere archive is currently limited to email and within file shares – no global data reduction yet.
While Nayatek has managed in a short time (the company came out of stealth in December) to match many of the major features of more established competitors, it will be difficult to break into this market without significant differentiation. According to Lehmann, Datosphere’s software-only model and the simplicity of its modular design will make it more user-friendly than competitors’ offerings.
But Enterprise Strategy Group senior analyst Brian Babineau said it will probably take more than that for Nayatek to overtake competitors like Symantec, EMC, Autonomy-Zantaz and Mimosa Systems in the data compliance and archiving market. The biggest differentiator in this market is breadth of support for multiple operating systems and applications, especially Microsoft SharePoint and Lotus Notes email as an alternative to Exchange. E-Discovery, search and compliance capabilities, and a SaaS option or cloud partners are also keys to success, Babineau said.
“No one has it all,” Babineau said. “As a shiny new object in the marketplace, Nayatek may get some attention — but where they’re going to go long-term is the biggest question mark I have looking at them right now.”