By Ian Lock, GlassHouse Technologies (UK), storage & backup service director
Two of my clients are having very different experiences with the deduplicating backup appliances they purchased in the last year. Neither has a problem with the technology itself; the difference lies in their storage management processes.
Client A purchased a pair of appliances just under a year ago, having gone through a well-defined sizing, design and proof-of-concept process to ensure not only that the solution would function as planned, but also that they had a good handle on likely deduplication ratios in their environment.
Client A’s environment is based on Symantec NetBackup and backs up a growing estate of virtualised Windows and Linux servers. The new appliance-based backup environment is performing excellently, with great stability, a tiny failure rate and automatic replication of backups to the DR site. After a year in operation, Client A’s appliances are running at 50% capacity, and because they have visibility of new projects requiring backup capacity over the next two to three months, they are already planning a fresh capacity purchase to ensure they don’t run out and crash the backup environment.
Client B purchased a similar, albeit smaller, appliance a year ago, based on sizing figures produced a year before that, and put it straight into a production Backup Exec environment. The new backup system has also been a great success, with vastly improved success rates and far less time spent fixing failed backups.
However, Client B has migrated backup clients from legacy systems and added new clients with minimal planning or control, and annual data growth was never factored into the initial sizing calculations. As a result, they now stand at 99% capacity utilisation with month-end backups approaching. Their only options are an emergency purchase, deleting valid backups or stopping backups altogether – not a good place to be.
Deleting valid backups may not even help immediately: the backup data stored in the appliance is deduplicated and so, by definition, chopped into small chunks, many of which are likely shared between several different backup images. Only the chunks unique to the deleted backups can actually be reclaimed. NetWorker blogger Preston de Guise has made this point very well.
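To see why deleting a backup from a dedup store frees less space than you might expect, consider this minimal sketch. It models each backup image as a set of chunk references (the data and function names are hypothetical, for illustration only): a chunk is reclaimable only when no surviving image still references it.

```python
# Hypothetical backup images, each stored as a set of deduplicated
# chunk identifiers. Chunks c2-c4 are shared across images.
images = {
    "mon_full": {"c1", "c2", "c3", "c4"},
    "tue_incr": {"c2", "c3", "c5"},
    "wed_incr": {"c3", "c4", "c6"},
}

def reclaimable_chunks(images, victim):
    """Chunks actually freed by deleting `victim`: only those
    no other surviving image still references."""
    survivors = set().union(
        *(chunks for name, chunks in images.items() if name != victim)
    )
    return images[victim] - survivors

freed = reclaimable_chunks(images, "mon_full")
print(freed)  # {'c1'}
```

Deleting the four-chunk full backup reclaims just one chunk of space, because the other three chunks are still referenced by the incrementals. This is why an emergency deletion spree on a nearly full appliance often disappoints.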
You’d rather be in Client A’s shoes wouldn’t you? So make sure you keep a very careful eye on the capacity usage of your backup de-duplication target, and make your plans early.
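Keeping that careful eye can be as simple as tracking months of runway. A rough sketch, assuming linear growth (real dedup targets grow unevenly as ratios shift, so treat this as a floor for planning, not a forecast):

```python
import math

def months_until_full(capacity_tb, used_tb, monthly_growth_tb):
    """Rough months of runway left on a dedup target,
    assuming linear post-dedup growth."""
    if monthly_growth_tb <= 0:
        return math.inf
    return (capacity_tb - used_tb) / monthly_growth_tb

# Client A's position: 50% used with steady growth -> time to plan calmly.
print(months_until_full(100, 50, 5))   # 10.0
# Client B's position: 99% used -> effectively no runway at all.
print(months_until_full(100, 99, 5))   # 0.2
```

If the number that comes out is shorter than your procurement cycle, the purchase is already late.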