Storage Soup


December 14, 2011  2:03 PM

HDS sharpens file capabilities for the cloud

Dave Raffo

Hitachi Data Systems added a few more lanes to its cloud storage on-ramp today.

HDS brought out the Hitachi Data Ingestor (HDI) caching appliance a year ago, calling it an “on-ramp to the cloud” for use with its Hitachi Content Platform (HCP) object storage system. Today it added content sharing, file restore and NAS migration capabilities to the appliance.

Content sharing lets customers in remote offices share data across a network of HDI systems, as all of the systems can read from a single HCP namespace. File restore lets users retrieve previous versions of files and deleted files, and the NAS migration lets customers move data from NetApp NAS filers and Windows servers to HDI.

These aren’t the first changes HDS has made to HDI since it hit the market. Earlier this year HDS added a virtual appliance and a single-node version (the original HDI was available only in clusters) for customers not interested in high availability.

None of these changes are revolutionary, but HDS cloud product marketing manager Tanya Loughlin said the idea is to add features that match the customers’ stages of cloud readiness.

“We have customers bursting at the seams with data, trying to manage all this stuff,” she said. “There is a lot of interest in modernizing the way they deliver IT, whether it’s deployed in a straight definition of a cloud with a consumption-based model or deployed in-house. Customers want to make sure what they buy today is cloud-ready. We’re bringing this to market as a cloud-at-your-own-pace.”

December 14, 2011  8:25 AM

Amazon S3, Microsoft Azure top performers in Nasuni stress tests

Sonia Lelii

Nasuni is in the business of connecting its customers’ cloud NAS appliances to cloud service providers in a seamless and reliable fashion. So the vendor set out to find which of those service providers work the best.

Starting in April 2009, Nasuni put 16 cloud storage providers through stress tests to determine how they handled performance, availability and scalability in real-world cloud operations.

Only six of the initial 16 showed they were ready to handle the demands of the cloud, Nasuni claims, while some of the others failed a basic API functionality test. Amazon Simple Storage Service (S3) and Microsoft Azure were the leaders, with Nirvanix, Rackspace, AT&T Synaptic Storage as a Service and Peer 1 Hosting also earning passing grades.

“You won’t believe what is out there,” Nasuni CEO Andres Rodriguez said. “Some had awful APIs that made them unworkable. Some had some crazy SOAP-based APIs that were terrible.”

Nasuni did not identify the providers that received failing grades, preferring to focus on those found worthy. Amazon and Microsoft Azure came out as the strongest across the board.

Amazon S3 had the highest availability with only 1.43 outages per month – deemed insignificant in duration – for a 100% uptime score. Azure, Peer 1 and Rackspace all had 99.9% availability.

Rodriguez described availability as the cloud providers’ ability to continue operations and receive reads and writes, even through upgrades. “If you can’t be up 99.9 percent, you shouldn’t be in this business,” he said.

For performance testing, Nasuni measured how fast providers could write and read files. Each provider’s system was put through multiple simultaneous threads, varying object sizes and different workload types, and was graded on read and write speed for large (1 MB), medium (128 KB) and small (1 KB) files.
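Nasuni hasn’t published its test harness, but the general shape of such a test is straightforward. The sketch below is a minimal illustration of the approach, not Nasuni’s code: it times writes and reads of objects in the three sizes above against an S3-style API using boto3. The bucket name, key prefix and sample count are hypothetical, and valid credentials are assumed.

# Minimal sketch (not Nasuni's harness) of timing writes and reads of
# large, medium and small objects against an S3-style API.
import os
import time
import boto3

BUCKET = "cloud-stress-test-bucket"   # hypothetical bucket name
SIZES = {"large": 1024 * 1024, "medium": 128 * 1024, "small": 1024}

s3 = boto3.client("s3")

def time_roundtrip(label, size, samples=10):
    """Return average write and read latency for `samples` objects of `size` bytes."""
    payload = os.urandom(size)
    writes, reads = [], []
    for i in range(samples):
        key = f"perf/{label}/{i}"
        start = time.perf_counter()
        s3.put_object(Bucket=BUCKET, Key=key, Body=payload)
        writes.append(time.perf_counter() - start)

        start = time.perf_counter()
        s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()
        reads.append(time.perf_counter() - start)
    return sum(writes) / samples, sum(reads) / samples

for label, size in SIZES.items():
    avg_write, avg_read = time_roundtrip(label, size)
    print(f"{label:>6} ({size} B): avg write {avg_write:.3f}s, avg read {avg_read:.3f}s")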

The tests found S3 provided the most consistently fast service across all file types, although Nirvanix was fastest at reading large files and Microsoft Azure was fastest at writing files of all sizes.

Nasuni tested scalability by continuously writing small files with many concurrent threads for several weeks or until it hit 100 million objects. Amazon S3 and Microsoft Azure were also the top performers in these tests. Amazon had a zero error rate for reads and writes. Microsoft Azure had a small error rate (0.07%) while reading objects.
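Again as a rough illustration rather than Nasuni’s actual methodology, a scalability run of this kind can be sketched as many threads writing 1 KB objects while the harness counts failures. The bucket name, thread count and object target below are assumptions.

# Minimal sketch (not Nasuni's harness) of a scalability run: many threads
# writing small objects continuously while the error rate is tracked.
import os
import threading
from concurrent.futures import ThreadPoolExecutor
import boto3

BUCKET = "cloud-stress-test-bucket"   # hypothetical bucket name
TARGET_OBJECTS = 100_000_000          # stop condition cited in the report
THREADS = 64                          # assumed concurrency level

s3 = boto3.client("s3")               # boto3 clients can be shared across threads
lock = threading.Lock()
written = errors = 0

def writer(worker_id):
    global written, errors
    payload = os.urandom(1024)        # 1 KB objects
    n = 0
    while True:
        with lock:
            if written >= TARGET_OBJECTS:
                return
        try:
            s3.put_object(Bucket=BUCKET, Key=f"scale/{worker_id}/{n}", Body=payload)
            with lock:
                written += 1
        except Exception:
            with lock:
                errors += 1
        n += 1

with ThreadPoolExecutor(max_workers=THREADS) as pool:
    pool.map(writer, range(THREADS))

print(f"objects written: {written}, error rate: {errors / max(written + errors, 1):.4%}")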

The report stated: “Though Nirvanix was faster than Amazon S3 for large files and Microsoft Azure was slightly faster when it comes to writing files, no other vendor posted the kind of consistently fast service level across all file types as did Amazon S3. It had the fewest outages and best uptime and was the only CSP to post a 0.0 percent error rate in both reading and writing objects.”


December 13, 2011  9:42 AM

Apple eyes flash startup Anobit

Dave Raffo

Anobit might be the next solid-state storage vendor to get scooped up. Israeli publication Calcalist reports that Apple is in “advanced negotiations” to acquire Anobit for around $400 million to $500 million.

Apple uses Anobit’s mobile flash chip in the iPhone, iPad and MacBook Air laptop. Anobit also sells an enterprise-grade multi-level cell (MLC) flash controller, the Genesis SSD. Anobit launched the second-generation Genesis product in September. The startup claims its proprietary Memory Signal Processing (MSP) controllers can boost endurance levels so that consumer-grade MLC can be used in the enterprise.

While Apple would likely focus on Anobit’s mobile flash controller, it might also use Anobit’s enterprise flash to enter that competitive market. The acquisition continues a trend of consolidation in the SSD market this year. SanDisk acquired Pliant in May for $327 million to move into the enterprise SSD market, and LSI bought flash controller startup SandForce for $322 million in October.

“We believe this yet again highlights the importance of controller technology in the SSD market,” Stifel Nicolaus Equity Research analyst Aaron Rakers wrote in a note to clients today. “While it appears that a potential acquisition of Anobit … would likely leave investors primarily focused on Apple’s ability to leverage the MSP controller technology across its product portfolio, we believe Anobit’s enterprise-class controller capabilities must also be considered/watched with regard to competition against Fusion-io (albeit there have yet to be any signs of Anobit playing in the PCIe SSD market).”

Anobit, which came out of stealth in 2010 with its first Genesis product, has raised $76 million in funding.


December 9, 2011  11:06 AM

Why we keep data

Randy Kerns

I’ve attended two conferences recently where a speaker talked about storage efficiency and the growing capacity demand problem. The speaker said that a part of the problem is we don’t throw data away. That blunt statement suggests that we should throw data away. Unfortunately, that was the end of the discussion and the rest was promotion of a product.

This raises the obvious question: Why don’t we delete data when we don’t need it anymore? When I asked IT people this question, they gave several reasons for keeping data.

Government regulation was the most common reason. Many of these regulations concern email and are tied to corporate accountability. People in vertical markets such as bio-pharmaceuticals and healthcare face additional industry-specific retention requirements.

Business policy was another top reason for not deleting information, and there were three underlying reasons in this category. In some cases, corporate counsel had not examined the information being retained and had issued orders to keep everything until a policy was developed. Others keep data because their executives feel the information represents business or organization records with future value (it was not really clear what this meant). In other cases, IT staff was operating off a policy written when records were still primarily on paper and had not received new direction for digital retention.

Another common response was that IT staff had no time to manage the data and make retention decisions, or to involve other groups within the organization. In that case, it is simpler to keep data than to make decisions and take on the task of implementing a policy.

The other reason was probably more of a personal response – some people are pack rats for data and keep everything. I call this data hoarding.

Rather than only listing the problems, the discussion about data retention should always include ways to address the situation. Data retention really is a project. To be done effectively, it usually requires outside assistance and the purchase of software tools. In every case, an initiative must be undertaken. This includes calculating ROI based on the payback in capacity made available and reduced data protection costs. The project requires someone from IT to:

• Understand government regulations. Most are specific about the type of data and circumstances, and almost all of the regulations have a specific time limit or condition for when the data can be deleted.
• Examine the current business policies and update them with current information from executives and corporate counsel. Present the costs of retaining the data along with the magnitude and growth demands as part of the need to review the business policies.
• Add system tools to examine data, move it based on value or time, and delete it when thresholds or conditions are met (a minimal sketch of such a tool follows this list).
• Get a grip. Data hoarding is costing money and making a mess. The person who replaces the data hoarder has to clean it up.
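Commercial tools do this at scale, but the core of an age-based sweep is simple. The sketch below is a minimal illustration, not a recommendation to delete blindly: the share path and seven-year cutoff are hypothetical, and any real policy has to reflect the regulations and counsel guidance described above.

# Minimal sketch of an age-based retention sweep. The root path and cutoff
# are illustrative; DRY_RUN keeps it report-only until a policy is approved.
import os
import time

ROOT = "/mnt/fileshare"                # hypothetical share to examine
MAX_AGE_SECONDS = 7 * 365 * 24 * 3600  # example seven-year retention window
DRY_RUN = True                         # report candidates; do not delete

now = time.time()
candidates = []

for dirpath, _, filenames in os.walk(ROOT):
    for name in filenames:
        path = os.path.join(dirpath, name)
        try:
            age = now - os.path.getmtime(path)
        except OSError:
            continue                   # file vanished or is unreadable
        if age > MAX_AGE_SECONDS:
            candidates.append((path, age))

for path, age in candidates:
    print(f"retention candidate ({age / 86400:.0f} days old): {path}")
    if not DRY_RUN:
        os.remove(path)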

Knowing when data can be deleted is good operations practice in IT. It is a key component of storage efficiency. The Evaluator Group has more on storage efficiency here and here.

(Randy Kerns is Senior Strategist at Evaluator Group, an IT analyst firm).


December 8, 2011  1:42 PM

Symantec enlists partners for Backup Exec beta testing

Sonia Lelii

Symantec Corp. is ready to roll out the Backup Exec 2012 beta, which includes better capabilities for protecting virtual machines, a new user interface and more granular recovery.

Backup Exec 2012, designed for SMBs and Windows-based backups, will support the V-Ray features already integrated into Symantec’s NetBackup enterprise application. Backup Exec 2012 and Backup Exec 2012 Small Business Edition betas have gone out to 45,000 registered Symantec global partners.

Symantec has generated a lot of noise around its V-Ray technology, which optimizes its products for virtual machines. For Backup Exec 2012, V-Ray lets Symantec customers do a physical-to-virtual conversion: an administrator can take a backup copy of a physical machine and convert it to a virtual machine for disaster recovery. Organizations will be able to recover a failed system to a physical server or to a Hyper-V or VM guest.

“The backup to the VM can be done in parallel, in sequence or serially,” said Aidian Finley, senior manager for product marketing for Symantec’s Information Management Group. “The administrator has the choice of when to do that conversion. We call it ‘no hardware physical recovery.’”

The Backup Exec administration console has a new interface that lets administrators automatically set up common policies and settings for quicker configuration and data protection.

Backup Exec can be purchased as an appliance, as a software application or as a cloud service.

The rollout is the largest partner-only beta program Symantec has ever put into place. The idea of providing the beta product to partners is to give them a chance to shape the application, said Arya Barirani, senior director of product marketing for Symantec’s Information Management Group.

“It’s really a way [for partners and customers] to get their hands on this product and test it,” Barirani said.


December 7, 2011  5:03 PM

EMC adds cloud, big data certifications

Sonia Lelii

EMC Corp. is building up its training and certifications program by adding new levels of curricula for IT professionals who want to deepen their skills in the cloud, virtualization and data analytics.

The new courses fall under the umbrella of the EMC Proven Professional Training and Certification program. EMC launched a Cloud Architect program last January for IT professionals who have a broad and deep understanding of server, storage, security, networking and business application disciplines but want to sharpen their skills in virtualization and cloud computing. This course would likely appeal to storage architects.

Now EMC is adding the EMC Cloud Infrastructure and Services and the EMC Cloud Architect IT-as-a-Service training and certifications. The first is targeted at individuals who manage storage, networking or security as part of a team implementing and managing a cloud infrastructure, and is aimed more at the IT specialist than the storage architect.

Those who have taken the Cloud Architect Training and Certification program can advance their skills by taking the EMC Cloud Architect IT-as-a-Service course. Participants learn how to create service catalogs and self-service portals. Both of these programs have been open since October.

“The (Cloud Architect) certification program was for those with a high-level skill set,” said Chuck Hollis, EMC’s global marketing CTO. “It’s a week-long course but you had to show prerequisites that were very high. You get a deep level lab experience. You come to our facility and you get expert level experience in building a cloud architecture. For the new certifications, the prerequisites are a bit lower. You are working with a team that builds a cloud and you see how it affects you.”

EMC will also offer a foundational data science and big data analytics training and certification in January 2012. It’s a week-long, associate-level course where participants learn how to use and analyze big data to help make informed business decisions. “Most business leaders realize it’s no longer your father’s business intelligence,” Hollis said.

These training and certification programs are part of EMC’s Education Services. The training can be taken three ways, based on customer needs: IT professionals can attend an instructor-led course at an EMC training center, offered frequently, based on demand, in more than 70 locations globally; EMC can send an instructor to a customer’s location when a large team needs training quickly; or the training can be delivered by a video instructor on DVD.


December 6, 2011  5:33 PM

Aptare expands storage management software

Dave Raffo

Storage software vendor Aptare is branching out to managing file storage. The vendor today launched Aptare StorageConsole File Analytics, which helps discover and analyze unstructured data. Or, as Aptare CEO Rick Clark puts it, now Aptare “tames the beast of big data.”

The File Analytics app joins the Aptare StorageConsole suite, which previously included Backup Manager for backup reporting, Replication Manager, Storage Capacity Manager, Fabric Manager and Virtualization Manager for managing storage in virtual environments. The suite apps are available as standalone products or as part of a package that can be managed through a common console. Hitachi Data Systems sells Aptare software through an OEM agreement, and NetApp licenses the Aptare Catalog Engine as part of its snapshot technology.

Aptare’s File Analytics app collects and aggregates metadata about unstructured data on storage arrays. Clark said that instead of walking the file system for information – which can be a resource-intensive process – File Analytics runs without agents for minimal impact on the storage system. It uses a compressed database developed by Aptare, code-named “Bantam,” that Clark said can store more than 100 million records in less than 1.5 GB. File Analytics analyzes the metadata in Bantam and uses it for storage tiering, compliance, and recognizing and eliminating duplicate files.
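Aptare hasn’t published details of Bantam or its collectors, so as a rough illustration of what metadata-driven analysis looks like, the sketch below assumes file metadata records have already been gathered, loads them into SQLite and runs two typical queries: duplicate candidates and tiering candidates. The sample records, table layout and thresholds are hypothetical and are not Aptare’s implementation.

# Illustrative only -- not Aptare's implementation or its Bantam format.
# Assumes file metadata has already been collected; stores it in SQLite and
# flags duplicate candidates (same name + size) and tiering candidates (stale files).
import sqlite3
import time

records = [
    # (path, name, size_bytes, mtime) -- sample data for illustration
    ("/proj/a/report.pdf", "report.pdf", 1048576, time.time() - 400 * 86400),
    ("/proj/b/report.pdf", "report.pdf", 1048576, time.time() - 30 * 86400),
    ("/home/u1/notes.txt", "notes.txt", 2048, time.time() - 5 * 86400),
]

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE files (path TEXT PRIMARY KEY, name TEXT, size INTEGER, mtime REAL)")
db.executemany("INSERT INTO files VALUES (?, ?, ?, ?)", records)

# Duplicate candidates: identical name and size appearing more than once.
for name, size, count in db.execute(
        "SELECT name, size, COUNT(*) FROM files GROUP BY name, size HAVING COUNT(*) > 1"):
    print(f"possible duplicate: {name} ({size} bytes) x{count}")

# Tiering candidates: files untouched for more than a year.
cutoff = time.time() - 365 * 86400
for (path,) in db.execute("SELECT path FROM files WHERE mtime < ?", (cutoff,)):
    print(f"tiering candidate: {path}")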

HDS hasn’t said whether it will sell File Analytics, but Clark said it would be a good fit with HDS’ recently acquired BlueArc NAS platform as well as its Hitachi Content Platform (HCP) object storage system, which is marketed for cloud storage.

“This allows you to figure out which data to put into those respective platforms,” Clark said. “The first thing a customer asks is, ‘What should I put into those platforms?’ It can also be used to determine which information to move to the cloud.”

Aptare can use a friend like HDS to help sell File Analytics. The product will go head-to-head with Symantec Data Insight for Storage, as well as with capabilities included in EMC Ionix ControlCenter, Hewlett-Packard Storage Essentials and SolarWinds Storage Manager.


December 6, 2011  3:02 PM

Archiving software gains ground, according to IDC

Dave Raffo

Growth in archiving and in storage and device management software pushed storage software revenue up 9.7% last quarter over the third quarter of 2010, according to IDC.

IDC said archiving software revenue grew 12.2% and storage and device management revenue increased 11.3% over last year. Data protection and recovery software is still the largest storage software category, with 34.9% of the market.

Backup software giants EMC and Symantec were the overall storage software leaders, with EMC generating $847 million and Symantec $530 million. EMC had a 24.5% share of the $3.46 billion market, followed by Symantec with 15.3%, IBM with 14%, NetApp with 8.8% and Hitachi Data Systems (HDS) with 4.4%. HDS grew the most over last year with a 15.3% increase, and EMC increased 10.3%, while the other three vendors in the top five lost market share: IBM grew 8.8%, Symantec 2.2% and NetApp 0.3%.

Customers apparently feel more comfortable buying storage software from smaller vendors than they do buying storage systems from them. “Others” – those outside the top five – combined for $1.15 billion in storage software revenue last quarter. That was up 15.3% over last year and made up 33.2% of the market – more than any single vendor.

On the hardware side, external disk storage system revenue from “others” dropped 5.2% from a year ago, according to the IDC report released last week. The others in hardware held only 18.5% of that market, well below leader EMC’s 28.6% share. Total external disk storage system revenue increased 10.8%, slightly outpacing the growth rate of storage software sales.


December 3, 2011  10:01 AM

EMC maintains disk revenue lead

Dave Raffo

EMC solidified its lead as the No. 1 external disk storage vendor last quarter, according to the latest IDC worldwide disk storage systems quarterly tracker.

EMC increased its networked storage revenue by 22% from a year ago, more than doubling the industry’s year-over-year growth of 10.8%. EMC posted third-quarter revenue of $1.65 billion – up from $1.35 billion a year ago – and increased its market share from 25.9% to 28.6%. IBM held onto second place with $735 million, followed by NetApp at $700 million, Hewlett-Packard at $651 million, Hitachi Data Systems (HDS) at $505 million and Dell at $459 million. IDC considers IBM and NetApp, as well as HDS and Dell, to be in statistical ties because less than 1% of market share separates each pair.

HDS grew the most year-over-year, increasing revenue 22.1% and expanding its market share from 8% to 8.8%. Dell (down 2.6% after ending its OEM deal with EMC) and IBM (up 10.2%) trailed the overall market’s growth rate. Revenue from all other vendors slipped 5.2%, mainly because three of the biggest “others” – 3PAR, Isilon and Compellent – have been acquired by larger vendors since the third quarter of 2010.

IDC found that the midrange segment (from $50,000 to $150,000) had strong growth. “The trend to buy modular systems offering enterprise level functionality, such as scale-out architectures, tiering, data deduplication, etc. continues,” Amita Potnis, senior research analyst, Storage Systems, said in the IDC release.

IDC also said the SAN market grew 16.1% year-over-year and the NAS market grew 3.5%. EMC led the SAN market with a 25.3% share, followed by IBM (15.4%) and HP (14%). EMC also led in NAS with a 46.7% share following its Isilon acquisition, with NetApp at 30.9%. iSCSI SAN revenue increased 19.5%. Dell, with its EqualLogic product line, led the iSCSI market with 30.3%, followed by EMC with 19.2% and HP at 14%.

Server vendors HP, IBM and Dell have higher market shares in the overall disk storage segment, which includes server-based and direct-attached storage. EMC still led the category, although all of its revenue comes from external storage. EMC’s overall disk market share is 21.7%. HP is next with $1.436 billion (18.9%), followed by IBM with $1.125 billion (14.8%), Dell with $879 million (11.6%) and NetApp at $700 million (9.2%). HDS did not crack the top five. Like EMC, NetApp and HDS are pure-play external storage vendors and get all their revenue from SAN and NAS systems.

The total disk storage market grew 8.5% to $7.6 billion. The external disk storage revenue was $5.8 billion.


December 2, 2011  10:59 AM

Physical, virtual backup still mostly a two-headed beast

Dave Raffo

We received a couple of reminders this week about how important backing up virtual machines is in an organization’s data protection strategy.

First, virtual server backup specialist Veeam released Backup & Replication 6. That in itself wasn’t a huge development. Veeam revealed full details of the product back in August, and said it would be shipping by end of year. It even leaked the most important detail – support of Microsoft Hyper-V – six months ago.

The most interesting part of the launch was the reaction it brought from backup king Symantec. Symantec sent an e-mail reminding us that it, too, does virtual backup (through its V-Ray technology) and claimed “point products are complicating data protection.” Symantec released a statement saying, “In the backup world, two is not better than one. Using disparate point products to backup virtual and physical environments adds complexity and increases management costs … Organizations should look for solutions that unite virtual and physical environments, as well as integrate deduplication, to achieve the greatest ROI.”

Sean Regan, Symantec’s e-Discovery product marketing manager, posted a blog extolling Symantec’s ability to protect virtual machines.

In other words, why bother with products such as Veeam and Quest Software’s vRanger for virtual machines when Symantec NetBackup and Backup Exec combine virtual and physical backup? But the established backup vendors opened the door for the point products by ignoring virtual backup for too long. Symantec didn’t really get serious about virtual backup until the last year or so.

Randy Dover, IT officer for Cornerstone Community Bank in Chattanooga, Tenn., began using Quest vRanger for virtual server backup last year even though his bank already had Symantec’s Backup Exec for physical servers. Dover said he would have had to put agents on his virtual machines with Backup Exec, and it would have cost considerably more than adding vRanger.

“Before that, we were not backing up virtual machines as far as VMDK files,” he said. “If something happened to a VM, we would have to rebuild it from scratch. That’s not a good scenario, but basically that’s where we were.”

Dover said vRanger has cut replication time and restores for his 31 virtual machines considerably. And he doesn’t mind doing separate backups for virtual and physical servers.

“Using two different products doesn’t concern us as much,” he said. “We generally look for the best performance option instead of having fewer products to manage.”

Quest took a step toward integrating virtual and physical backup last year when it acquired BakBone, adding BakBone’s NetVault physical backup platform to its portfolio alongside vRanger.

Walter Angerer, Quest’s general manager of data protection, said the vendor plans to deliver a single management console for virtual and physical backups. He said Quest would integrate BakBone’s NetVault platform with vRanger as much as possible. It has already ported NetVault dedupe onto vRanger and is working on doing the same with NetVault’s continuous data protection (CDP).

“We are looking forward to an integrated solution for virtual, physical and cloud backup,” Angerer said. “I’m not sure if either one will go away, but we will create a new management layer. The plan is to have a single pane of glass for all of our capabilities.”

