Hi,
We are performing image-based backups of vSphere VMs.
Let's say we want to set up secondary backups to Wasabi with immutability enabled.
We have a question about the following example settings for the secondary backup copy job.
===
・Job Retention Policy : 2 days
・Immutable : 7 days
・Job Schedule : Only Saturday
・Block Generation : 10 days
===
In this case, how should the amount of data stored on the object storage side be calculated?
https://helpcenter.veeam.com/docs/backu ... -retention
Kind Regards,
Asahi,
Climb Inc.
Re: Capacity sizing for secondary backups to object storage with regular immutability, rather than daily backups
Hi Asahi,
Object Storage is always incremental.
The first Saturday will create a full backup, and on all subsequent Saturdays, the copy job will create incremental backups.
Each incremental backup will contain all data blocks that have changed on the production workload between the two job runs.
What is your expected change rate within one week? If you know the expected change rate, you can use the Veeam Calculator to estimate the amount of storage required.
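To make the relationship concrete, here is a minimal illustrative helper in Python; the function name and the example figures are placeholders of my own, not output from the calculator:
===
# Rough size of one weekly incremental on object storage (illustrative sketch only).
def weekly_incremental_tb(source_tb: float, change_rate: float, compression: float) -> float:
    """Changed blocks uploaded per Saturday run, after compression."""
    return source_tb * change_rate * compression

# Example: 1 TB of source data, 10% weekly change, ~50% compression -> 0.05 TB per run
print(weekly_incremental_tb(source_tb=1.0, change_rate=0.10, compression=0.5))
===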
I also suggest reconsidering only doing weekly copies to Wasabi. Having just one backup copy offsite per week could be risky.
For example, imagine a hacker attack on Friday. Would your business be able to recover if you had to restore from a 6-7 day old backup copy? I don’t know the specifics of your business, but companies processing hundreds of new orders per day could lose information about thousands of orders over the course of a week. A daily offsite copy could have saved them.
Best regards,
Fabian
Product Management Analyst @ Veeam Software
Re: Capacity sizing for secondary backups to object storage with regular immutability, rather than daily backups
Hi Fabian,
Thank you for your reply.
> you can use the Veeam Calculator to estimate the amount of storage required.
I was unable to calculate based on the conditions you provided.
> I also suggest reconsidering only doing weekly copies to Wasabi.
> Having just one backup copy offsite per week could be risky.
This is just one example.
I wanted to ask what would happen under the conditions I wrote.
So, how much capacity would it be under the conditions I described?
Kind Regards,
Asahi,
Climb Inc
Re: Capacity sizing for secondary backups to object storage with regular immutability, rather than daily backups
> So, how much capacity would it be under the conditions I described?
So far, zero TB, because you haven’t provided any information about your source data and expected weekly change rate. :)
But let’s assume an example with 1TB of production data and a weekly change rate of 10%. Typically, we see an average compression of about 50% (though this heavily depends on the source data type). The first full backup would be 0.5TB, and each weekly incremental would be 0.05TB.
In v12.*, backup data blocks on object storage are stored longer than you may expect (actual retention = job retention policy + immutability period + block generation period). Each uploaded data block will be stored for 19 days in your scenario.
With the 1TB example, I would estimate approximately 0.65TB (0.5TB for the full backup + 2-3x 0.05TB) to be kept on object storage.
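If it helps to see that arithmetic spelled out, here is a minimal Python sketch using the same assumptions (1TB source data, 10% weekly change, ~50% compression, a weekly schedule); treat it as a back-of-the-envelope estimate, not an exact sizing tool:
===
# Rough steady-state sizing sketch for a weekly backup copy job to object storage.
# All figures below are the example assumptions from this thread, not measured values.
import math

source_tb        = 1.0    # production data (example)
compression      = 0.5    # ~50% average compression (example)
weekly_change    = 0.10   # 10% change rate per week (example)

job_retention_d  = 2      # job retention policy (from the job settings in the question)
immutability_d   = 7      # immutability period
block_gen_d      = 10     # block generation period
job_interval_d   = 7      # copy job runs only on Saturdays

full_tb          = source_tb * compression                  # 0.5 TB
incremental_tb   = source_tb * weekly_change * compression  # 0.05 TB

# Actual block lifetime = job retention + immutability + block generation
block_lifetime_d = job_retention_d + immutability_d + block_gen_d   # 19 days

# Weekly incrementals whose blocks are still held at any point in time
incrementals_kept = math.ceil(block_lifetime_d / job_interval_d)    # 3

estimate_tb = full_tb + incrementals_kept * incremental_tb
print(f"Rough steady-state estimate: {estimate_tb:.2f} TB")         # ~0.65 TB
===
The rounding up to 3 incrementals is only an approximation; in practice you will usually see 2-3 incrementals' worth of blocks retained at any point, which is where the 0.65TB figure above comes from.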
Best regards,
Fabian
Product Management Analyst @ Veeam Software