Discussions related to using object storage as a backup target.
asifganai
Influencer
Posts: 20
Liked: never
Joined: Dec 11, 2017 6:32 am
Full Name: Asif Ganai

Backup to S3

Post by asifganai »

Hi All,
I am trying to get rid of tapes and introduce S3 for long-term archival. Here is my scenario: I want to hold 30 days of data on the local repository and then off-load to S3. My retention on AWS S3 would be 10 years (12 monthly and 10 yearly restore points). I am wondering whether a backup job or a backup copy job would fit my requirement best. Also, if I select the option "keep certain full backups longer for archival purposes" in the storage options of a job, does it store this data on the local repository or on the capacity tier (S3)? Any help would be appreciated.
Regards
Asif Ali
Natalia Lupacheva
Veteran
Posts: 1143
Liked: 302 times
Joined: Apr 27, 2020 12:46 pm
Full Name: Natalia Lupacheva

Re: Backup to S3

Post by Natalia Lupacheva » 1 person likes this post

Hi Asif,

it seems the GFS retention policy of a backup copy job is what you are looking for.
You can configure the monthly and yearly backup cycles there; a rough sketch of the retention math is below.
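To make the retention arithmetic concrete, here is a minimal Python sketch of what a 12-monthly / 10-yearly GFS policy would keep. It only illustrates the counting; the function and its inputs are invented for this example and are not Veeam's actual flagging logic:

from datetime import date

def gfs_keep(fulls, monthly=12, yearly=10):
    """Pick the full backups a 12-monthly / 10-yearly GFS policy retains.

    fulls: dates of full backups. Illustrative only -- this is just the
    counting behind the retention numbers, not Veeam's internal logic.
    """
    fulls = sorted(fulls, reverse=True)  # newest first

    # Monthly cycle: newest full from each of the last 12 distinct months.
    monthly_keep, seen_months = set(), set()
    for d in fulls:
        key = (d.year, d.month)
        if key not in seen_months and len(seen_months) < monthly:
            seen_months.add(key)
            monthly_keep.add(d)

    # Yearly cycle: newest full from each of the last 10 distinct years.
    yearly_keep, seen_years = set(), set()
    for d in fulls:
        if d.year not in seen_years and len(seen_years) < yearly:
            seen_years.add(d.year)
            yearly_keep.add(d)

    return sorted(monthly_keep | yearly_keep)

# Example: a full on the 1st of every month for two years.
fulls = [date(y, m, 1) for y in (2019, 2020) for m in range(1, 13)]
print(gfs_keep(fulls))  # 12 monthlies from 2020 plus the newest 2019 full

Everything else ages out under the job's regular retention.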
Also, this thread might be interesting for you.

Thanks!
asifganai
Influencer
Posts: 20
Liked: never
Joined: Dec 11, 2017 6:32 am
Full Name: Asif Ganai

Re: Backup to S3

Post by asifganai »

Hi Natalia,
If I select the option "keep certain full backups longer for archival purposes", does this data sit on the local repository or on the capacity tier extent of the SOBR?
Secondly, I understand we need two full backups (synthetic or active full) before data can be off-loaded to S3. Does the same rule apply to a backup copy job?
Regards
Asif Ali
oleg.feoktistov
Veeam Software
Posts: 1912
Liked: 635 times
Joined: Sep 25, 2019 10:32 am
Full Name: Oleg Feoktistov

Re: Backup to S3

Post by oleg.feoktistov » 1 person likes this post

Hi Asif,

In a SOBR, data never sits on the capacity tier immediately and exclusively. It is either copied there right after it is created on the performance tier, or moved there once the operational restore window is exceeded.
Now, about backup vs. backup copy: you can use either of them, just note that when a SOBR is the target of a backup copy job, regular fulls and increments are never offloaded to the capacity tier; only the GFS restore points are. That is not the case with a backup job.
Also, for a backup job only restore points belonging to an inactive chain are subject to offload, whereas for a backup copy job both the inactive-chain criterion and the GFS flag apply.
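If it helps to see those rules side by side, here is a minimal Python sketch of the move-mode eligibility check as described above. The dict shape and field names are invented for illustration; Veeam's real logic is of course more involved:

from datetime import date, timedelta

def eligible_for_offload(point, job_type, window_days=30, today=None):
    """Decide whether a restore point may move to the capacity tier.

    point is assumed to look like:
      {"created": date(...), "chain_active": False, "is_gfs": True}
    A sketch of the rules above, not Veeam's actual implementation.
    """
    today = today or date.today()
    outside_window = point["created"] < today - timedelta(days=window_days)
    sealed = not point["chain_active"]  # only inactive (sealed) chains qualify

    if job_type == "backup":
        # Backup job: any point of an inactive chain outside the window.
        return sealed and outside_window
    if job_type == "backup_copy":
        # Backup copy job: additionally, only GFS fulls are offloaded.
        return sealed and outside_window and point["is_gfs"]
    raise ValueError(f"unknown job type: {job_type!r}")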

Thanks,
Oleg
asifganai
Influencer
Posts: 20
Liked: never
Joined: Dec 11, 2017 6:32 am
Full Name: Asif Ganai

Re: Backup to S3

Post by asifganai »

Thank you Natalia and Oleg.
asifganai
Influencer
Posts: 20
Liked: never
Joined: Dec 11, 2017 6:32 am
Full Name: Asif Ganai

Re: Backup to S3

Post by asifganai »

Hi,
Is there a way to limit bandwidth in Veeam exclusively for S3 jobs?
Regards
Asif Ali
Mildur
Product Manager
Posts: 8549
Liked: 2223 times
Joined: May 13, 2017 4:51 pm
Full Name: Fabian K.
Location: Switzerland

Re: Backup to S3

Post by Mildur » 1 person likes this post

Hi

Anton Gostev mentioned this in this week's forum digest.

If you have internal S3 storage, create a network traffic rule specific to the IP address of your object storage:

https://helpcenter.veeam.com/docs/backu ... ml?ver=100

If you have Amazon or Azure object storage, you can try this script. Quoting the digest:

"Cloud object storage users > there were a few questions regarding granular Internet access throttling control for the object storage offload operation in the past. As you know, Veeam Backup & Replication has a single "Any to Internet" network throttling rule which covers any internet access, but sometimes customers want more granularity for different activities. This came up again on the internal forum last week, and as usual my advice was to figure out your chosen cloud datacenter subnet and create the corresponding network traffic throttling rule. But the engineer who asked me did not just settle for the manual solution! He figured out how to script all of this, as apparently Amazon and Azure provide APIs to request the IP subnets of their datacenter regions. Great stuff!"

Link with the script:
https://horstmann.in/adding-azure-aws-r ... fic-rules/
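For reference, the AWS half of that idea boils down to querying AWS's public ip-ranges.json feed. Here is a minimal Python sketch (the region name is just an example; the feed URL and JSON fields are AWS's documented format):

import json
import urllib.request

# AWS publishes its datacenter IP ranges at this well-known URL.
URL = "https://ip-ranges.amazonaws.com/ip-ranges.json"

def s3_prefixes(region):
    """List the IPv4 prefixes AWS advertises for S3 in one region.

    These are the subnets you would feed into a Veeam network traffic rule.
    """
    with urllib.request.urlopen(URL) as resp:
        data = json.load(resp)
    return sorted(
        p["ip_prefix"]
        for p in data["prefixes"]
        if p["service"] == "S3" and p["region"] == region
    )

if __name__ == "__main__":
    for prefix in s3_prefixes("eu-central-1"):
        print(prefix)

You would then enter these subnets into the throttling rule manually, or let the linked script keep them in sync.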
Product Management Analyst @ Veeam Software
