Help developing backup strategy for large data volumes

Tags:
Backup and Recovery
Veritas
I have three remote sites/servers I support daily for data backups. We use Veritas Backup Exec v10, and the data resides on a CLARiiON CX500 with direct 2 Gb fibre connectivity to the media host. The tape library is a Dell PowerVault LTO-2, SCSI-attached to the server. The problem is extremely long backup times for one server at each site. The total volume of data is about 750 GB per remote server, consisting of different file types (gif, tif, txt, etc.). The last full backup I managed to run logged 8.456 million files and 0.8 million directories, with an average file size of about 10 KB. No antivirus runs during the backup process, and the media server is dedicated to this function. The other servers being backed up on these media servers average about 700-800 MB/min; the problem backup averages about 112-120 MB/min. A full backup in this environment takes over 65 hours, on the weekend. I suspect the sheer number of very small files is the problem, as this only happens with these three servers. Does anyone have a similar environment, or suggestions for an alternate backup process, product, or method?
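Some quick arithmetic on the figures in the question shows why per-file overhead, rather than raw bandwidth, is the likely culprit. All inputs below come from the post itself; the calculation is only a back-of-the-envelope sketch, not a measurement:

```python
# Rough arithmetic on the figures from the question. Every input number is
# taken from the post; nothing here is measured or vendor-published.

total_gb = 750                 # data per problem server
files = 8_456_000              # files in the last full backup
healthy_mb_min = 750           # the other servers: ~700-800 MB/min
observed_mb_min = 116          # the problem servers: ~112-120 MB/min

def hours_at(rate_mb_min):
    """Hours to stream total_gb at a given MB/min rate."""
    return total_gb * 1024 / rate_mb_min / 60

print(f"full pass at healthy rate:  {hours_at(healthy_mb_min):6.1f} h")
print(f"full pass at observed rate: {hours_at(observed_mb_min):6.1f} h")

# At the observed rate, the job averages only this many files per second --
# the open/stat/close cost on each tiny file swamps the transfer itself.
files_per_sec = files / (hours_at(observed_mb_min) * 3600)
print(f"files per second:           {files_per_sec:6.1f}")
```

At the healthy rate these servers would stream 750 GB in well under a day; at the observed rate the same pass needs over 100 hours of tape time, and the job is averaging only on the order of 20 files per second. That is the signature of per-file overhead, which the incremental and synthetic-full suggestions in the replies below all attack in different ways.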

Answer Wiki


You should use Ultra320 SCSI adapters and no more than two tape drives per chain. If you can afford it, change to fibre channel drives and upgrade to NetBackup.

The FC drives will give you less of a bottleneck, and NetBackup will allow you to make better use of multiple tape drives.

Discuss This Question: 8  Replies

 
  • Zsr681
    What I don't see in your description of your data is its changeability. If much of it is static, a strategy that incorporates replication and differential backups would seem more appropriate than doing full backups too often. There is software that will monitor folder trees for changes and then copy the changed files to another area; this would help ensure near-real-time backup of the files. Doing differential backups between full backups will lessen the time required and still allow for off-site storage.
  • BlueKnight
    In looking through the responses to your query, the first several responders concentrated on hardware. While their suggestions may help improve your situation, I believe you should look into doing incremental backups, which I think zsr681 was saying in his response. There is no real need to do full backups on a daily basis; if you back up only those files that have changed, your backups will take far less time.

    I checked with one of our guys downstairs... we back up 230 servers daily over the network using TSM. Daily backups are incremental, so they run pretty quickly. The way our backups are sped up is that the data is backed up to NAS (RAID 5) and then subsequently written to tape (300 GB LTO-3). This way the bottleneck of tape isn't a factor.

    In the shop I used to work in, I scheduled daily incremental backups. At the end of each week, I had a set of weekly backups that ran to copy most application volumes in their entirety. At month-end, I would back up everything, including the OS volumes. Although I work primarily in a mainframe environment and you're apparently dealing with LAN servers, the same principles apply. What you want, in the end, is the ability to restore your data to a particular point in time, whether the need is to restore a corrupted file or for disaster recovery.
  • Dcsys99
    Very good points from BlueKnight... HOWEVER... I just recovered a client's servers from Katrina. They use TSM with incrementals and are highly optimized for backup. What that created was a "forever recovery" (over two weeks!). As an example, one database server required a restore of 2.96 GB of data - no big deal. TSM had that spread across 75 LTO-2 tapes! It took 22 hours to restore the data. There needs to be a good balance between optimizing for backup and optimizing for recovery. You may need to change hardware to get there. Software solutions (and I've used most of them) can get you when you least expect it.
  • BlueKnight
    dcsys99's comments about TSM are excellent and illustrate an aspect of backup systems that needs to be considered when shopping for one. Personally, I don't think I'd have selected TSM, for exactly the reasons cited... too much effort required to restore what really is a fairly insignificant amount of data. I can imagine how long it would take to restore 4 TB if "the big one" hit our data center. Since we're on the subject, does anyone use Syncsort's Backup Express? If so, what's your opinion of it? I'm curious. I don't get to play with those things any more, but I still have an interest in anything that works well, and in finding out what doesn't work so well. Good comments from everyone.
  • Grinner30
    Another method might be to use Symantec (which bought out Veritas) NetBackup 5.1 or higher. It provides the ability to do synthetic backups. In essence, you take the first backup as a full; thereafter, you just take incrementals. NetBackup will then take the last full backup and the incremental tape backups and produce a synthetic full backup without going back out to the server or network. This eliminates the need for weekly or monthly full backups. I am just starting to test this and will have more knowledge and tested backups on it later this year, but it should save me a ton of time for the 6 TB of data I need to back up weekly.
  • Epeterson
    Upgrading to NetBackup will also give you synthetic backups, which will cut down the need to do full backups. There are other products, like Galaxy from CommVault, that offer synthetic backups. I don't know if Backup Exec offers the ability to do image backups; they would give you less granularity on restores, but they would address the millions-of-files issue during backup. As said earlier, speed up every interface and/or add more tape resources. The responses so far have been hardware- or software-related. If you can, I might try to determine whether all this data is really critical to the business and warrants the resources it is consuming. Long term, that will make it easier to throw hardware/software (aka money) at the problem.
  • Theizer
    Backup Exec 10.0 has the ability to perform synthetic backups as well, if you purchase the Advanced Disk-Based Option. I currently back up approximately 70 servers consisting of about 2.5 TB of data. We are running Backup Exec 10.0 rev 5520. I am actually making the switch to synthetic backups this weekend. I stage everything to disk first (right now I am using an HP MSA1000 SAN) and use a duplicate job template to clone the jobs to tape. With a synthetic backup, you do a full backup once (this is the baseline), and it stays on disk all the time. From then on, you perform incrementals, and on the weekends you have a synthetic full job run that combines the baseline and the incrementals into one full backup. You could stage all of this to disk (if you have the space) and then duplicate it to tape, or you could run the synthetic full straight to tape.
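Zsr681's suggestion of software that monitors folder trees and copies changed files to another area can be sketched in a few lines. This is only an illustration of the idea as a periodic sweep over mtimes and sizes; commercial products do this with filesystem change journals rather than walking the tree, and `replicate_changes` is a hypothetical helper, not part of any product mentioned above:

```python
# Minimal sketch of the change-replication idea: walk a source tree and copy
# any file whose mtime or size differs from its copy in a staging area.
# A real product would use filesystem change journals instead of a full walk.
import os
import shutil

def replicate_changes(src_root: str, dst_root: str) -> list[str]:
    """Copy new/changed files from src_root into dst_root; return what was copied."""
    copied = []
    for dirpath, _dirnames, filenames in os.walk(src_root):
        rel = os.path.relpath(dirpath, src_root)
        dst_dir = os.path.join(dst_root, rel)
        os.makedirs(dst_dir, exist_ok=True)
        for name in filenames:
            src = os.path.join(dirpath, name)
            dst = os.path.join(dst_dir, name)
            s = os.stat(src)
            if (not os.path.exists(dst)
                    or os.stat(dst).st_mtime < s.st_mtime
                    or os.stat(dst).st_size != s.st_size):
                shutil.copy2(src, dst)   # copy2 preserves the mtime,
                copied.append(os.path.join(rel, name))  # so unchanged files skip next sweep
    return copied
```

Run on a schedule, each sweep touches only the changed files, which is exactly why it sidesteps the millions-of-small-files cost of a nightly full pass.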
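The synthetic-full approach Grinner30 and Theizer describe boils down to a catalog merge: the software combines the baseline full with the incrementals taken since, keeping the newest copy of each path, and emits the result as a new full without re-reading the servers. A toy sketch, with plain dicts standing in for backup catalogs (real products also track deletions, which this deliberately omits):

```python
# Toy model of a synthetic full backup: merge the baseline full with the
# incrementals taken since, newest copy of each path winning. Dicts mapping
# path -> version stand in for real backup catalogs; deletions are not modelled.

def synthesize_full(baseline: dict, incrementals: list[dict]) -> dict:
    """Combine a full backup with later incrementals into a new full."""
    full = dict(baseline)
    for inc in incrementals:   # apply oldest first
        full.update(inc)       # a newer copy of a path replaces the older one
    return full

full_mon = {"a.txt": "v1", "b.txt": "v1", "c.txt": "v1"}   # Monday's full
inc_tue  = {"b.txt": "v2"}                                  # b changed Tuesday
inc_wed  = {"b.txt": "v3", "d.txt": "v1"}                   # b changed again, d created

synthetic = synthesize_full(full_mon, [inc_tue, inc_wed])
# synthetic now holds a.txt v1, b.txt v3, c.txt v1, d.txt v1
```

The appeal for the environment in the question is that the 8.456 million files are walked once for the baseline; after that, only the weekend merge touches them in bulk, and it reads from the backup media rather than the production server.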
