Since my blog entry about CA posted yesterday, CA representatives and I have had a number of conversations about what I wrote and what the company has delivered in product functionality. In doing so, we both came to realize there were misperceptions and missteps on both sides as to what I was asking about and what products CA had actually delivered.
In terms of the context for the interview and the update I expected from CA, I was looking for what CA was doing to pull different data protection components together and manage them under one umbrella, whether its own components or those of competitors. Maybe I was unclear in articulating those expectations, or maybe they did not understand them; probably some of both.
I don’t for one second believe that integration is a trivial task. In fact, this may be one of the greatest challenges backup software vendors face this decade, and possibly the next, but that is also one of the reasons I am covering it. Archiving, virtual tape libraries (VTLs), compliance, continuous data protection (CDP), synchronous and asynchronous data replication and retention management are now all part of the data protection mix. Frankly, I’d be concerned if CA claimed it had fully integrated all of these components because analysts probably would have had a field day verifying, and likely debunking, that claim.
On the other hand, in conversations I have had with CA’s competitors, on and off the record, I sense that CA is starting to lag behind. There is nothing tangible I can point to, just a sense of the depth and quality of conversations I have had.
That is not to say CA is doing nothing, as yesterday’s blog post could have incorrectly led readers to conclude. In data protection, CA has focused on adding new features and integration to the XOsoft CDP and Message Manager email archiving products. To CA’s credit, they did bring up a good point: companies still manage data protection and records management separately internally. So CA first sought to bring out new features and functions in those products, based on customer demand, before tackling the overarching integration problem.
For example, since I last spoke to CA in February, a second integration service pack was released in March containing two features that I believe administrators will find particularly useful. Through the ARCserve interface, administrators can browse replicated jobs set up in WANSync and see the sources being selected for replication and the target replication servers. Then, when they need to restore jobs backed up from the XOsoft replica, the restore view in ARCserve provides a view of the production servers rather than the XOsoft WANSync Replica server.
The ultimate question remains, “Is CA doing enough and doing it fast enough?” Someone older and wiser than me once told me that it takes about 8 years for changes in storage practice and technology to work their way into the mainstream. Whether that holds true in the rapidly changing space of data protection remains to be seen.
This last week, I had a chance to catch up with CA on what integration has occurred in their Recovery Management product line since I last spoke to them in February. Based upon what they told me in the first interview and the little progress they had made, I spoke to them again to make sure I didn’t miss something.
“Scrambling” is the word that Frank Jablonski, CA’s Product Marketing Director, used to describe CA’s efforts to pull together and offer customers some level of integration between the XOsoft and ARCserve product lines. To that end, CA has released two service packs to start integrating these products.
The first service pack enabled ARCserve to use a script to create backups from CA’s CDP product, XOsoft WANSync. The script does the following:
- Periodically stops the replication on XOSoft WANSync.
- Takes a point-in-time copy of the data on the WANSync server.
- Resumes the replication.
- Backs up that point-in-time copy.
ARCserve then centrally manages that point-in-time backup, which companies can use for longer-term retention. The second service pack provided the same functionality but added a GUI.
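The script’s four steps amount to a short pause-snapshot-resume cycle. Here is a minimal Python sketch of that control flow; the function names are placeholders of my own, not actual XOsoft or ARCserve commands:

```python
def backup_cycle(suspend, snapshot, resume, backup):
    """One backup cycle against the WANSync replica.

    Replication is paused only long enough to take the point-in-time
    copy; the slow part (the backup itself) runs afterward against the
    frozen copy, while replication continues.
    """
    suspend()                # 1. pause replication on the replica
    try:
        copy = snapshot()    # 2. point-in-time copy of the replica data
    finally:
        resume()             # 3. resume replication, even if the snapshot failed
    backup(copy)             # 4. back up the frozen copy

# Usage with stub commands that just record the order of operations:
log = []
backup_cycle(
    suspend=lambda: log.append("suspend"),
    snapshot=lambda: (log.append("snapshot"), "/snap/2007-07-20")[1],
    resume=lambda: log.append("resume"),
    backup=lambda copy: log.append(f"backup {copy}"),
)
```

The point of the ordering is that the expensive step (the backup) happens outside the replication pause, so the production-to-replica stream is interrupted only for as long as the snapshot takes.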
However, I know many good system administrators who could write that script in their sleep. That leaves the integration between XOsoft WANSync and ARCserve at little more than a rudimentary level. Though it demonstrates progress, CA needs to accelerate its efforts in light of the announcements that CommVault and Symantec have made in the last couple of months.
To catch up, CA is planning major version upgrades of XOsoft in January 2008 and ARCserve in the spring of 2008. Jablonski promised users will see both product upgrades and more integration across CA’s different data protection products at that time. However, CA will likely not complete its integration efforts until 2010 or 2011.
CA definitely has the potential and the software to offer users a robust data protection and management package. To CA’s advantage, backup software is generally not a product that users are apt to rip and replace. However, CA currently trails other backup software vendors in integration and delivery of new features. That lag will only really matter, though, if I am still writing blog entries like this one about CA’s products a year from now.
Over the past month, I’ve been working on putting together podcast tips with some of our experts. Pierre Dorion, certified business continuity professional for Mainland Information Systems Inc., recently contributed a podcast called “Outsourcing backup: Get the right service level agreement”.
In this tip, Pierre discusses the questions that can help you ensure that a service level agreement (SLA) meets your requirements when outsourcing backup, such as:
- What are your data recovery needs?
- How fast can your data be restored?
- What are the contractual obligations of the SLA?
- Does your service provider have a solid disaster recovery plan in place?
Pierre also offers practical advice on making sure these questions get addressed. Check it out below.
Elsewhere on the Web, check out www.sla-zone.co.uk. It’s got a bunch of useful SLA information, broken down into topics such as services, performance, problem management, customer requirements, termination, and so on. Also, www.itil-itsm-world.com has a series of documents that are used to help build a framework for service management, including information on service level agreements and IT outsourcing.
It is being widely reported that HP is in “advanced talks” to buy Bull SA, a French IT integrator. The reports, originated by a French website, capital.fr, contain detailed information about the potential price of the deal (approximately $1 billion US) and have been reprinted by sources including CNNMoney.com and Reuters. HP declined comment on the rumors.
Bull, whose major shareholders include the French government, has a storage business unit, though it’s mostly a channel/storage integration play. The company also deals in other IT products, including servers and networking equipment, and has a customer footprint mostly in French government as well as a few overseas state and local government agencies, according to storage industry analysts.
Otherwise, analysts said, they’re mystified at the potential merger. “HP would have an interesting job on its hands getting a sleepy company to wake up,” said Arun Taneja, founder and analyst for the Taneja Group.
The company underwent a restructuring at the turn of the millennium, refocusing itself on channel sales and systems integration. However, despite attempts to penetrate US markets since then, 80 percent of its revenues come from Europe, with a full 40 percent from France alone. Prior to its restructuring, Bull had gotten into the business when it purchased Honeywell’s computer operations, a mainframe and minicomputer line that was ultimately left behind by the advent of the PC. (It’s a story similar to those of Digital Equipment and Wang, which went the way of the dinosaur when they couldn’t compete with IBM, Sun et al.)
“I can’t see what’s in [a potential acquisition] for HP other than the acquisition of a customer base for servers and storage,” said John Webster, principal IT advisor with Illuminata, Inc.
Following EMC Corp.’s storage announcements last week, which included the introduction of a new Symmetrix array, the industry has been buzzing with the claims and counterclaims of EMC and high-end disk array rival Hitachi Data Systems (HDS), as well as debates over the merits of each company’s products.
In the past week, two storage consultants in the UK have dug into the technical specs of Hitachi’s USP and the new Symmetrix DMX-4. Nigel Poulton over at Ruptured Monkey takes a close look at the pros and cons of Hitachi’s external virtualization vs. EMC’s internal tiered storage. Meanwhile, storage consultant Chris M. Evans discusses the “green” claims being made by both vendors in their recent array announcements.
Nigel concludes that there are pros and cons to both the HDS and EMC approaches, depending on a user’s particular environment, which leads him to ask a very pertinent question:
There is certainly a demand for both [approaches to tiered storage]…When compared to something like Thin Provisioning, which both vendors are working on, implementing the above features would be a comparative walk in the park.
So if it’s not that hard to implement, and by doing so you potentially hang on to your customers, why not pinch your nose and take the plunge?
Too much Kool-Aid might be the answer.
As for Evans, his conclusion is that “neither vendor can really claim their product to be ‘green’.” HDS’s USP, he concludes, still has a higher power cost per drive than EMC’s Symmetrix. However, he doesn’t gloss over the weakness of using higher-capacity drives (to which every systems vendor has the same access) to make a “green” claim, saying, “customers choosing to put some SATA drives into an array…[will] see only modest incremental power savings.”
Evans is not the first to call on big vendors to step up their efforts around power consumption, particularly as mushrooming data retention and compliance archiving requirements sap the effectiveness of data management strategies for reducing storage growth. Users at this year’s Storage Networking World conference in San Diego also called on storage vendors to invest in better silicon rather than pushing the issue back onto users and, in essence, blaming them for their storage management practices. Elsewhere, server and PC makers have already begun moving to more efficient power designs within systems, and users like Evans are looking for a similar commitment from storage manufacturers to built-in reductions in power consumption, rather than lip service about the latest SATA drives.
HP and Quantum put out a press release very quietly a week ago (it crossed the wire at 2:30 on a summer Friday afternoon; hard to fly much farther below the radar than that) announcing that they will be partnering more closely on development of LTO-5 tape products.
The exact terms of the agreement are confidential, though reps from both companies said this week that HP will be handling the “productization” of LTO-5 products, from the selection of components to decisions about product packages, whereas Quantum will be handling the design work for meeting LTO-5 specs.
This news follows on an announcement a few weeks ago that HP will be bundling Quantum’s StorNext file system with its EVA arrays for multimedia storage at production houses including Warner Bros.
As the two companies cozy up (and do so with such an emphasis on confidentiality, at least in this latest agreement), it raises the question: could an acquisition be next?
Like Sun when it acquired StorageTek, HP might be able to boost server sales out of owning Quantum. HP has also been doing battle with IBM lately in storage, and proprietary tape is one thing IBM has that HP doesn’t. (Hence IBM’s bluster a few months ago about being No. 1 in pure storage hardware sales, according to IDC). Also, Quantum’s products tend to appeal to the midmarket and small businesses, and lately HP’s storage strategy has been moving downmarket as well.
Right now, analysts say there’s probably nothing more to this latest tape deal between the two companies than meets the eye–if there are broader implications, according to Arun Taneja, founder and analyst with the Taneja Group, it’s for the tape market in general. “The reality is that despite tape people shouting, the tape market is maturing and declining,” he said. “For Quantum to develop two distinct tape products with both DLT and LTO is a fool’s paradise in that kind of environment.”
But if a company focused entirely on tape starts to offload production of one half of its tape business, you have to start to wonder. Especially when it’s offloaded to a vastly bigger company, which has a spot open for said products in its portfolio, and which itself has been pushing to continue its momentum in the storage market of late. For now there hasn’t been any smoking gun we’ve seen pointing to an impending merger, but rest assured we’re keeping an eye on these two.
Storage system-based asynchronous replication isn’t perfect, but for many corporations it is good enough. Having just finished researching and writing a feature on the topic for an upcoming issue of Storage magazine, I can report that user adoption of asynchronous replication is no longer a rarity, at least if one believes the storage system vendors.
While I did not speak to every storage system vendor for this report (there are dozens), the ones I did speak to consistently said that anywhere from 30% to 50% of their users employ this technology. To a certain degree, one might expect these numbers from a storage system vendor like EqualLogic, which includes asynchronous replication as part of its storage system’s base software package. But when Hitachi Data Systems (HDS) went on the record and said it is seeing similar adoption rates among its user base, it caught my attention.
Users of HDS storage systems generally need to license asynchronous replication software separately, so the adoption rate gives some indication of the value users now ascribe to keeping copies of their data on a secondary storage system. It would take time and a lot of cooperation from HDS to find out what percentage of licensed users actually use the feature, and on what scale, but it follows that if users paid for it, a high percentage of them are probably using it.
Companies are figuring out they can repurpose money budgeted for tape and offsite storage and instead buy cheaper secondary storage systems with asynchronous replication software. They can then take point-in-time snapshots of their production data, replicate them offsite and use them for daily backups, faster restores and, in a worst-case scenario, recovering their applications from the data copy.
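Once point-in-time copies on a secondary system replace offsite tape, some retention policy has to keep that cheaper storage from growing without bound. A minimal sketch of the idea (the 14-day window is an arbitrary example of mine, not anything a vendor prescribes):

```python
from datetime import date, timedelta

def snapshots_to_expire(snapshot_dates, today, keep_days=14):
    """Return the snapshots on the secondary system that are older
    than the retention window and can be expired."""
    cutoff = today - timedelta(days=keep_days)
    return sorted(d for d in snapshot_dates if d < cutoff)

# Daily snapshots taken on July 1, 5, 10, 15 and 20:
snaps = [date(2007, 7, d) for d in (1, 5, 10, 15, 20)]
expired = snapshots_to_expire(snaps, today=date(2007, 7, 20), keep_days=14)
# With a July 20 run date and a 14-day window, the cutoff is July 6,
# so the July 1 and July 5 snapshots fall out of retention.
```

In practice the window would be driven by the same retention requirements that used to govern the tape rotation schedule, not picked by hand.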
Is this architecture perfect? No. But companies are running out of time waiting for the perfect scenario, and tape is certainly not it. At least in this scenario, recoveries happen much faster than waiting on restores from tapes sitting in someone’s warehouse. Companies want a cost-effective way to improve their backups and recoveries, and for a growing number of them, storage system-based asynchronous replication is a reasonable compromise between perfection and what is affordable.
Generally the drumbeat of messaging from HDS is as constant as a metronome: array-based virtualization is the answer. Storage virtualization will heal your environment, bring about peace in the Middle East, and solve global warming.
So when an HDS exec writes a piece on his blog about who might not benefit from storage virtualization, it’s definitely worth a read.
David Merrill, a storage consultant and solution architect with HDS since 1996, recently got back from what sounds like a rather thorny customer engagement in Korea. The customer, who is not named, wanted to extend its XP array’s virtualization to legacy systems (the XP being HP’s rebranded version of HDS hardware). During a TCO analysis, Merrill writes, “Total purchase cost for the virtualization solutions was, as you can guess, less than a monolithic, but the 4-year TCO costs were higher” due to power and cooling costs, plus the maintenance costs of legacy systems (“when virtualizing older systems, the old hardware maintenance comes along too,” notes Merrill).
The user still went with virtualization because there was a “tipping point” with 20% storage growth over the next three years during which the virtualization will become more cost effective. “Moral of the story, be sure to look at many factors when considering different architectures. Just because you can virtualize does not mean that every old system needs to be kept around indefinitely…Your mileage will vary,” Merrill concludes.
Wonder what Mr. T would think of that.
42 man years of work and 18 months of development. That’s the amount of time and effort that CommVault put into its Simpana 7.0 Software Suite announced on June 10th, according to Dave West, CommVault’s VP of Marketing and Business Development.
While it is encouraging to note that CommVault spent so much time on this release, it’s equally sobering to ponder that data protection upgrades now take this much time and effort to complete. But, based upon what enterprise customers have needed for the last 5 to 10 years, this is the first product that comes close to delivering on those requirements.
Consider this. Frank Albi, the President of Business Information Solutions, a records management provider in Cincinnati, OH, manages paper, tape and optical media. In this role, he is often asked to help his clients develop a records disposal policy. He can, with a high degree of certainty, deliver one for a client’s paper records. Not so with tape and optical media. He doesn’t even know where to begin: his clients can’t easily identify which files or records are on which media, so how can he develop an appropriate disposal schedule for that media? Customers end up keeping it all, resulting in higher data storage costs and unnecessary exposure to future legal discovery costs.
What is compelling about CommVault’s Simpana is that it opens the door to address this dilemma that Albi and many others face.
It combines backup and archive data into one common pool and, using its newly licensed FAST search engine, allows users to search, access and retrieve archived and backed up data stored in this new pool. Since they both use a common policy engine, Simpana can set retention and expiration schedules for any file in the pool. Simpana’s new Single Instance Store (SIS) feature only sweetens the deal since it eliminates redundant file copies, which also reduces the size of data stores and expedites backups.
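CommVault hasn’t published how SIS works under the hood, but content-addressed storage is the standard shape of such a feature: hash the file’s contents, and store the bytes only when that hash hasn’t been seen before. A toy sketch (the class and method names are mine, not CommVault’s):

```python
import hashlib

class SingleInstanceStore:
    """Toy single-instance store: identical file contents are kept once."""

    def __init__(self):
        self._blobs = {}   # content digest -> file content
        self._paths = {}   # file path -> content digest

    def put(self, path, content):
        digest = hashlib.sha256(content).hexdigest()
        is_new = digest not in self._blobs
        self._blobs.setdefault(digest, content)
        self._paths[path] = digest   # every path keeps its own reference
        return is_new                # False means these bytes were already stored

    def get(self, path):
        return self._blobs[self._paths[path]]

    def stored_bytes(self):
        # Total unique bytes held; duplicates cost only an index entry.
        return sum(len(c) for c in self._blobs.values())

store = SingleInstanceStore()
report = b"Q2 sales figures..."
store.put("/home/alice/report.doc", report)            # new content: stored
duplicate = store.put("/home/bob/report.doc", report)  # same bytes: indexed only
```

The payoff is exactly the one described above: a file attached to a hundred mailboxes or copied across a hundred home directories occupies the data store once, which shrinks the pool and shortens the backup window.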
Granted, to gain Simpana’s benefits administrators need to upgrade or install backup agents on servers, something I always looked forward to as an administrator. Not. But, as CommVault’s West points out, users can deploy the agents with push technologies. That may take some of the sting out of deployment, and the value-add of shortened backups and centralized enterprise searches across archives and backups should appeal to most organizations and offset whatever concerns they have.
CommVault’s Simpana also still lacks the breadth and scope of features that data protection products like Symantec NetBackup, EMC NetWorker and IBM Tivoli Storage Manager offer. But with disk a growing part of the backup equation and e-discovery casting a shadow over most companies’ futures, the features that traditional data protection products offer may not carry the same weight they once did.
Bottom line, for companies willing and able to standardize on a single data protection product, CommVault has jumped to the head of the pack and is the one by which data protection products should now be measured. It can reduce the size of data stores, expedite backup and recoveries and search across multiple data stores. Plus, CommVault offers continuous data protection, email archiving and replication products that administrators can manage through the same policy engine — making Simpana without equal in the industry. CommVault’s Simpana 7.0 Software Suite sets the mark high for data protection and is a template that other data protection products will be hard-pressed to match.
I have a hard time imagining that anyone who reads this blog isn’t already aware of The Onion, but just in case you missed it, no one in storage–particularly backup–should miss this video report on wide-scale DR from America’s Finest News Source ™.
Be sure to watch until about 1:40 for that rarest of birds: storage-related humor on a mainstream website. Even rarer: backup-specific storage-related humor.
If only The Onion could fill in the rest of what would surely follow this story: a huge swath of the US workforce left to office-chair races to pass the time; dramatic TV footage of Al Gore flying in to help troubleshoot his invention; and of course, every storage vendor in the world putting out press releases about how if the government had been backing up the InterWebs with [insert product name here], none of this would’ve happened.
Unfortunately, given the nature of the disaster, they’d probably have to start hanging their announcements up on telephone poles.
Meanwhile, however, as anyone reading this post on company time is no doubt keenly aware, there’s another very real workplace problem facing our nation right now, which leaves almost no one unaffected. For more, see this report.