JustinCredible
Influencer
Posts: 10
Liked: 3 times
Joined: Mar 01, 2012 11:54 pm
Full Name: Justin D

Lots of changed data?

Post by JustinCredible »

Hello,
We've been using Veeam (on 6.5 now) for over a year and it is a great product. But we are now looking to move our backups offsite and I had a few questions.
We've got 6 VMs running Windows Server 2008 and 2008 R2, and one Linux VM (SpamTitan). Total data is 1.1TB and it backs up to 377GB.

When we do backups, there seems to be a lot of changed data. I tested by running a backup immediately after one finished and there was still 3GB worth of changed data. Every 24 hours it is usually 18-30GB, which can be tough to move offsite over the internet within an overnight window. It can be done, but it's tight. My question is: is this a lot of changed data? I can't conceive of my users generating that much data in a day, and our data isn't growing by that much each day, it's just changing that much. We have already excluded swap files from the backup, so I'm not sure what else it could be. But maybe that's normal? 5-8% per day?
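For what it's worth, here is a rough back-of-the-envelope sketch (plain Python; the source, backup, and daily-change sizes are the ones from this post, while the upload speed and backup window are illustrative assumptions) of how those numbers stack up:

```python
# Rough sketch: daily change rate and overnight transfer estimate.
# 1.1 TB source, 377 GB full backup, and 18-30 GB daily increments are
# from the post; the 10 Mbit/s upload speed and 8-hour window are assumptions.

SOURCE_GB = 1100           # total VM data
BACKUP_GB = 377            # size of the full backup file
DAILY_CHANGE_GB = (18, 30)

for change in DAILY_CHANGE_GB:
    print(f"{change} GB/day = {change / SOURCE_GB:.1%} of source data, "
          f"{change / BACKUP_GB:.1%} of the full backup file")

UPLOAD_MBIT = 10           # assumed WAN upload speed
WINDOW_HOURS = 8           # assumed overnight window

for change in DAILY_CHANGE_GB:
    hours = change * 8 * 1024 / UPLOAD_MBIT / 3600  # GB -> Mbit -> seconds -> hours
    fits = "fits" if hours <= WINDOW_HOURS else "does not fit"
    print(f"{change} GB at {UPLOAD_MBIT} Mbit/s ~ {hours:.1f} h "
          f"({fits} in a {WINDOW_HOURS} h window)")
```

Measured against the 377GB full backup file, 18-30GB a day is indeed roughly 5-8%, but against the 1.1TB of source data it is only about 2-3%, which is not an unusual daily change rate.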

Also, how would I configure Veeam for this situation? I want to "seed" the backup to another system online, but I have a feeling that when it does the full backup it has to do every week, that 377GB file is going to have to be retransmitted since it's a "new" file. I have it set to incremental with a synthetic full done every Sunday. How do you guys account for this full backup every week?
I know there are services out there where you buy an appliance to put on site, copy the data to that, and it sends the data up to the provider's online storage. But I figured it'd be easy enough (and much cheaper) to set up an Amazon S3 account and just FTP the files to it daily or something. Any insight would be appreciated!
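One note on that last idea: S3 does not speak FTP natively, it is accessed over its HTTP API, so the nightly copy would typically be a small script using an S3 SDK. A minimal sketch of what that could look like (the bucket name, repository path, and credential setup via the usual boto3 configuration are assumptions, not anything Veeam-specific):

```python
# Minimal sketch: copy Veeam backup files from the local repository
# folder to an S3 bucket. Bucket name and path are made-up examples.
import os
import boto3

BACKUP_DIR = r"D:\Backups\Job1"   # assumed local repository path
BUCKET = "my-offsite-backups"     # assumed bucket name

s3 = boto3.client("s3")

for name in os.listdir(BACKUP_DIR):
    # .vbk = full backup, .vib = incremental; skip anything else
    if not name.lower().endswith((".vbk", ".vib")):
        continue
    path = os.path.join(BACKUP_DIR, name)
    print(f"uploading {name} ({os.path.getsize(path) / 2**30:.1f} GB)")
    s3.upload_file(path, BUCKET, name)
```

In practice you would only push files created or changed since the last run rather than re-uploading everything each night, but the basic shape is the same.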
Vitaliy S.
VP, Product Management
Posts: 27377
Liked: 2800 times
Joined: Mar 30, 2009 9:13 am
Full Name: Vitaliy Safarov

Re: Lots of changed data?

Post by Vitaliy S. »

Hi Justin,
JustinCredible wrote:My question is- is this a lot of changed data?
It depends. We have an existing topic where you can see all the possible reasons for that many changes on your VMs: Large VIB file on a small static server
JustinCredible wrote:Also, how would I configure Veeam for this situation? I want to "seed" the backup to another system online, but I have a feeling that when it does the full backup it has to do every week, that 377GB file is going to have to be retransmitted since it's a "new" file. I have it set to incremental with a synthetic full done every Sunday. How do you guys account for this full backup every week?
You've configured it the right way: with a synthetic full backup you will not need to transfer the entire VM image again, as it will be assembled automatically from the existing incremental and full backup files.

One thing to note here: you should be using a Windows or Linux repository, so that the synthetic full is built locally on the repository and no synthetic full traffic has to go offsite. In addition to this, you may want to change the storage optimization setting to WAN optimized (backup job properties -> Storage section -> Advanced button -> Storage tab).
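As a rough illustration of why the WAN optimized setting tends to shrink the increments (the block sizes and write pattern below are assumptions for the sketch, not measured Veeam behaviour): the smaller the block, the less untouched data gets swept up around each small change.

```python
# Illustrative only: how block size affects the amount of "changed" data
# an incremental has to carry when small writes are scattered across a disk.
import random

DISK_GB = 100
WRITES = 20000                     # assumed number of small scattered writes per day
DISK_BYTES = DISK_GB * 2**30

random.seed(1)
offsets = [random.randrange(DISK_BYTES) for _ in range(WRITES)]

for label, block in [("larger blocks (1 MB)", 2**20),
                     ("WAN optimized blocks (256 KB)", 256 * 2**10)]:
    touched = {off // block for off in offsets}
    print(f"{label}: ~{len(touched) * block / 2**30:.1f} GB in the incremental")
```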

Hope this helps!