Sorry for the noob question... LONG time VB&R user, but first time jumping into cloud-stored backups.
- We currently back up all VMs to local Win repo
- Backup copy jobs of the primary repo go to another Win repo for longer-term GFS copies.
- Replicate a few critical VMs to a vSphere host at another site across our WAN.
Now we want to start putting those critical VMs in immutable Veeam Data Vault, so I purchased the 1TB plan for a year, just to get started. My question is... where do I go from here? Should I create a new backup job that sends these specific VMs to that repo daily?
Or should I send backup copies of existing jobs to the cloud repo? (Though I fear that would burn through a LOT of cloud storage quickly, as we retain quite a lot in our local repo.)
Some other strategy I'm unaware of?
Any advice appreciated.
Jim — Expert | Posts: 233 | Liked: 17 times | Joined: Jul 02, 2009
Chief Product Officer | Posts: 32311 | Liked: 7657 times | Joined: Jan 01, 2006 | Location: Baar, Switzerland
Re: Do I need a new job to send to Vault?
The recommended way is to turn your primary repo into a SOBR and use the object storage offload functionality that is native to SOBR (aka the Capacity Tier).
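In case it helps, roughly what that looks like in PowerShell. This is a sketch from memory of the Veeam PowerShell module: the repository names are placeholders, it assumes the Vault has already been added as an object storage repository in the console, and cmdlet/parameter names may differ in your B&R version, so the New Scale-Out Backup Repository wizard in the console is the safer route.

```powershell
# All repository names below are placeholders for your environment.
$extent = Get-VBRBackupRepository -Name "Local Win Repo"
$vault  = Get-VBRObjectStorageRepository -Name "Veeam Data Vault"

# Wrap the existing primary repo in a SOBR and enable the Capacity Tier.
# With the move policy, only backup chains older than the operational
# restore window are offloaded to the Vault, which is what keeps cloud
# storage consumption from ballooning.
Add-VBRScaleOutBackupRepository -Name "Primary SOBR" `
    -PolicyType DataLocality `
    -Extent $extent `
    -EnableCapacityTier `
    -ObjectStorageRepository $vault `
    -OperationalRestorePeriod 14
```

Existing jobs that pointed at the original repo are retargeted to the SOBR when it becomes an extent, and offload then happens automatically, so no new job should be needed.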
Vladimir Eremin — Product Manager | Posts: 20728 | Liked: 2398 times | Joined: Oct 26, 2012
Re: Do I need a new job to send to Vault?
You can find more information about the Scale-Out Backup Repository and its tiers in our User Guide; feel free to ask additional questions if you have any after reviewing the documentation.