-
- Service Provider
- Posts: 53
- Liked: 2 times
- Joined: Feb 21, 2014 5:15 am
- Full Name: Chris A
- Contact:
GFS backups to AWS Deep Archive
I have a customer who requires long-term data retention for their VMware environment (currently 2 TB, expected to grow 50% in the next 6 months). We are using Veeam B&R to back up to a local repository (an ESXi host dedicated to a Veeam B&R VM, with local storage). We keep about 45 days of data locally but no GFS backups (yet!). We also send this customer's data off to a Cloud Repository (I am the provider) via a backup copy job, keeping only a week of data offsite.
I have gone down the rabbit hole of looking into using GFS with AWS Glacier / Glacier Deep Archive for longer-term storage. The goal is to keep 12 months and 7-10 years of GFS backups. From what I have found, I need to let Veeam manage the data in AWS; I cannot move data from S3 to Glacier using lifecycle policies. That rules out AWS object storage with a SOBR, the first rabbit hole I went down. A VTL looks like what I need to be using.
I will need an AWS Tape Gateway VTL appliance. I see you can install it locally as a VMware OVF, or there is an actual physical appliance. Is there an option that does not require anything local aside from Veeam B&R? I saw EC2 mentioned but was not sure whether that route even made sense. I am leaning toward the VMware appliance.
Is it bad practice to have the VTL Gateway appliance and my Veeam B&R server both reside on the same physical VMware host if I have enough resources? The one positive I see is that Veeam B&R needs an iSCSI connection to the VTL Gateway appliance, so deploying both on the same host should keep that traffic from leaving the host.
I see the data first lands in S3 and is then moved to the Glacier or Deep Archive pool depending on how I configure my tapes. How long does this data sit in S3 before being moved over? Any best practices on tape sizing?
Am I just wasting my time here or doing it all wrong?
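For the retention goal above, a quick back-of-envelope calculation helps frame the decision. This sketch assumes every GFS point is a full backup of the same size and uses an assumed Deep Archive rate of $0.00099/GB-month; both are placeholders, so check current AWS pricing and your actual full-backup sizes.

```python
# Rough GFS footprint estimate: 12 monthly + 7 yearly restore points.
# The per-GB rate is an assumption -- verify against current AWS pricing.

def gfs_storage_gb(full_size_gb, monthlies=12, yearlies=7):
    """Total archive space if every GFS point is a same-size full backup."""
    return full_size_gb * (monthlies + yearlies)

def deep_archive_monthly_cost(total_gb, rate_per_gb=0.00099):
    """Monthly storage cost at the assumed Deep Archive rate."""
    return total_gb * rate_per_gb

if __name__ == "__main__":
    current = 2 * 1024          # 2 TB today, in GB
    projected = current * 1.5   # +50% growth expected in 6 months

    for size in (current, projected):
        total = gfs_storage_gb(size)
        print(f"{size / 1024:.1f} TB fulls -> {total / 1024:.1f} TB archived, "
              f"~${deep_archive_monthly_cost(total):.2f}/month")
```

Even at 19 retained fulls, Deep Archive storage itself is cheap; the real cost drivers tend to be retrieval and per-request charges, which is worth keeping in mind when comparing VTL against other options.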
-
- Veteran
- Posts: 643
- Liked: 312 times
- Joined: Aug 04, 2019 2:57 pm
- Full Name: Harvey
- Contact:
Re: GFS backups to AWS Deep Archive
Hey Chris,
Just fyi, v11 has Archive Tier which lands on Glacier: https://helpcenter.veeam.com/docs/backu ... ml?ver=110
This is 10x better than dealing with AWS VTL, trust me. Maybe revisit scale out repos in v11?
-
- Service Provider
- Posts: 53
- Liked: 2 times
- Joined: Feb 21, 2014 5:15 am
- Full Name: Chris A
- Contact:
Re: GFS backups to AWS Deep Archive
When creating the backup repository for my SOBR extent, I am using Object Storage > Amazon S3 > Amazon S3 Glacier, correct? When setting this up I can select my Data Center and find my bucket and folder. I select Deep Archive and get the error "Insufficient AWS EC2 permissions". What is this doing? Trying to set up an EC2 instance? I don't see any mention of this here: https://helpcenter.veeam.com/docs/backu ... ml?ver=110 Am I going down the wrong path again? I tried support, but the person I spoke with didn't know anything about this new feature. Case #04783940
***EDIT***
Well, support actually just got back to me, haha! It turns out this does spin up an EC2 instance, and I was looking in the wrong part of the manual. Do you see this EC2 appliance costing much each month to run? What type of instance do you typically run? I was looking at the VTL because I could keep the compute local (VTL Gateway) instead of using EC2 and paying extra each month.
-
- Product Manager
- Posts: 20415
- Liked: 2302 times
- Joined: Oct 26, 2012 3:28 pm
- Full Name: Vladimir Eremin
- Contact:
Re: GFS backups to AWS Deep Archive
> I select to use Deep Archive and I get the error "Insufficient AWS EC2 permissions".
Can you confirm that the specified account has all the required permissions mentioned here?
> Do you see this EC2 appliance costing much each month to run?
A proxy appliance is used to transfer data from the Capacity Tier to the Archive Tier. It takes smaller objects from the Capacity Tier and combines them into bigger ones to avoid additional solution costs. The proxy appliance is deployed once the offload starts and removed after it finishes, so appliance usage does not cost much.
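The cost benefit of combining small objects can be illustrated with simple arithmetic. The 40 KB per-object overhead and the per-PUT price below are assumptions loosely based on published S3 archive-class figures, not exact Veeam or AWS numbers; treat this as a sketch of why packing matters, not a quote.

```python
# Why packing small objects into bigger ones before archiving matters.
# Per-object overhead and PUT price are illustrative assumptions only.

OVERHEAD_KB = 32 + 8            # assumed per-object metadata overhead
PUT_PRICE = 0.05 / 1000         # assumed price per archive-tier PUT request

def offload_overhead(total_mb, object_size_mb):
    """Object count, extra storage (MB), and request cost for an offload."""
    n_objects = -(-total_mb // object_size_mb)   # ceiling division
    extra_mb = n_objects * OVERHEAD_KB / 1024
    return n_objects, extra_mb, n_objects * PUT_PRICE

if __name__ == "__main__":
    total_mb = 1024 * 1024      # 1 TB offload
    for size in (1, 512):       # 1 MB blocks vs 512 MB packed objects
        n, extra, cost = offload_overhead(total_mb, size)
        print(f"{size:>3} MB objects: {n:>8} PUTs, "
              f"{extra:.0f} MB overhead, ~${cost:.2f} in requests")
```

With these assumed numbers, packing turns millions of tiny uploads into a few thousand large ones, which is the kind of saving the proxy appliance exists to capture.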
-
- Service Provider
- Posts: 53
- Liked: 2 times
- Joined: Feb 21, 2014 5:15 am
- Full Name: Chris A
- Contact:
Re: GFS backups to AWS Deep Archive
Veremin, I am trying to assign those permissions but am getting the following error.
https://snipboard.io/07Qwbm.jpg
-
- Veeam Software
- Posts: 2010
- Liked: 670 times
- Joined: Sep 25, 2019 10:32 am
- Full Name: Oleg Feoktistov
- Contact:
Re: GFS backups to AWS Deep Archive
Please check these examples for exact syntax and formatting of such JSON policies. Thanks!