Hi, I want to move some old yearly backups (about 20 TB) from 2018. I've run a PowerShell script which lists the 2018 backups labeled Y.VBK (roughly along the lines of the sketch after the options list below), and I'd like to select those specific backups and export or copy them to a Google Cloud Storage bucket. I should also mention that these backups were part of an imported job set up by a previous tech who is no longer with the company.

I haven't seen a clear explanation of how to do this. I've looked at the options below so far, but I'm unclear how best to execute it. I'm on VBR 12, and I did notice there is a direct-to-storage option available that will connect to the Google storage bucket. I'm also open to using Azure or AWS if they are more viable or better suited to the task.
1. Add the NFS share under Jobs/Backups and manually select the files to be copied from there.
2. SOBR configuration: make the bucket part of a capacity tier. (This doesn't seem viable, as the only way I could select the files is via conditional options, i.e. older than a certain date, etc.)
3. Export the content as virtual disks, setting the target as the Google Cloud bucket (not sure if this is even possible).
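For reference, the listing script I mentioned is roughly along these lines. This is a minimal sketch, assuming the VBR 12 PowerShell module is installed and you're connected to the backup server; it filters on creation year and Full type rather than the Y.VBK file-name label, and the CreationTime/Type property names are what Get-VBRRestorePoint returns for me, so verify against your own output:

Code: Select all
# List full restore points created in 2018 across all backups on this VBR server.
Import-Module Veeam.Backup.PowerShell

$yearlies = Get-VBRBackup | ForEach-Object {
    Get-VBRRestorePoint -Backup $_ |
        Where-Object { $_.CreationTime.Year -eq 2018 -and $_.Type -eq 'Full' }
}

$yearlies | Sort-Object CreationTime |
    Select-Object Name, CreationTime, Type |
    Format-Table -AutoSize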
Thank you
Re: Transferring selected backups to Google storage bucket.
Hi Thor
Choosing only the yearly backups by exporting single backup files to object storage would lead to a lot of duplicated space: each export is treated as a new full backup and therefore consumes the entire space of a full backup.
As an alternative, you could use Copy Backup Chain to Object Storage. That would copy the entire backup chain, and it would be a block-clone-aware operation: only unique blocks would be offloaded. That way you save costs on API calls and storage.
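If you prefer to script it, one way is a backup copy job pointed at the object storage repository. This is a minimal, untested sketch: "Imported 2018 Job" and "GCS-Archive" are placeholder names, the bucket must already be registered as an object storage repository, and you should verify the Add-VBRBackupCopyJob parameter set against the v12 cmdlet reference:

Code: Select all
# Untested sketch: create a backup copy job that copies the imported backup
# chain to the object storage repository backing the GCS bucket.
$backup = Get-VBRBackup -Name "Imported 2018 Job"            # placeholder name
$target = Get-VBRObjectStorageRepository -Name "GCS-Archive" # placeholder name

Add-VBRBackupCopyJob -Name "2018 Yearlies to GCS" `
    -Mode Periodic `
    -Backup $backup `
    -TargetRepository $target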
Please note that our product doesn't support immutability for Google Cloud storage. If you require immutable backups, check out other cloud services which provide support for immutable object storage. I also recommend comparing AWS/Azure with providers that don't bill you for API calls; Wasabi is one example, but there are a lot of other providers with a similar pricing scheme.
- Veeam Ready Database: https://www.veeam.com/alliance-partner- ... tml?page=1
- Unofficial object storage compatibility list: object-storage-f52/unoffizial-compatibi ... 56956.html
Best,
Fabian
Product Management Analyst @ Veeam Software