- Novice
- Posts: 9
- Liked: 2 times
- Joined: Feb 07, 2019 1:00 pm
- Full Name: Derrick Ritchie
Amazon S3 Buckets and Glacier Archiving
Good Afternoon Everyone,
I was wondering if someone could help with a query around Amazon S3 storage and the 5TB limit that exists for the buckets used as the capacity tier in a SOBR.
We have a client whose backup jobs keep monthly full backups for a long period of time, as well as daily incrementals. We want to move the monthly fulls into Amazon S3 using the capacity tier process now available in Update 4, but we have concerns over the 5TB limit, as around 7TB of backups are generated per month. While the offload process will substantially slim this down, I'm concerned the bucket will fill up relatively quickly, over the space of 6-12 months, given the volumes being generated.
For example, we would be looking to move any monthly fulls out of the performance tier after 90 days into an Amazon S3 bucket using the IA (Infrequent Access) storage class. Initially this would be around 7TB, but with data reduction we believe it could come to about 3-3.5TB in the bucket (this may be pessimistic and the savings could be better). My query is: can any of this be moved into Glacier storage using the Amazon bucket logic with the capacity tier process still working, or would it be better to look at Azure Blob storage, as that has a higher capacity threshold for storing data?
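To put rough numbers on that growth concern, here is a minimal sketch. It is not Veeam-specific; the 7TB/month figure and the ~50% reduction ratio are assumptions taken from the paragraph above, not measured values:

```python
# Rough capacity-tier growth estimate for monthly fulls offloaded after 90 days.
# Assumptions (from the post above, not measured): ~7 TB of monthly fulls,
# reduced to ~50% by data reduction before landing in S3.

MONTHLY_FULL_TB = 7.0   # raw size of one month's full backups
REDUCTION_RATIO = 0.5   # assumed data reduction in the capacity tier

def bucket_footprint_tb(months: int) -> float:
    """Cumulative TB sitting in the bucket after `months` monthly offloads."""
    return months * MONTHLY_FULL_TB * REDUCTION_RATIO

for m in (6, 12, 24):
    print(f"after {m:>2} months: ~{bucket_footprint_tb(m):.1f} TB")
```

Under these assumptions the bucket would pass 5TB within the second month, which is what prompted the question.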
Many Thanks,
Derrick
- Chief Product Officer
- Posts: 31816
- Liked: 7312 times
- Joined: Jan 01, 2006 1:01 am
- Location: Baar, Switzerland
Re: Amazon S3 Buckets and Glacier Archiving
Hello! That is incorrect, there is no limit to the maximum bucket size with Amazon: S3 buckets have unlimited capacity (the 5TB figure is the maximum size of a single object), so no special design considerations are required. Thanks!
- Novice
- Posts: 9
- Liked: 2 times
- Joined: Feb 07, 2019 1:00 pm
- Full Name: Derrick Ritchie
Re: Amazon S3 Buckets and Glacier Archiving
Hi there,
Thank you for the update. I can see I got tripped up on the semantics: the 5TB limit applies to an individual object/file, not to the bucket as a whole. Thank you for the clarification.
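For anyone else tripped up by the same semantics, a small illustrative helper (plain Python over a hypothetical listing, not a Veeam or AWS API call) that separates the two limits: the bucket total can grow without bound, while each individual object must stay under 5TB:

```python
# The 5 TB limit is per object, not per bucket: a bucket can hold an
# unbounded total, but no single object may exceed the per-object ceiling.
S3_MAX_OBJECT_BYTES = 5 * 1024**4  # 5 TiB per-object cap

def check_listing(objects):
    """objects: list of {'Key': str, 'Size': int} dicts, shaped like
    entries from an S3 object listing. Returns (total_bytes, oversized_keys)."""
    total = sum(o["Size"] for o in objects)
    oversized = [o["Key"] for o in objects if o["Size"] > S3_MAX_OBJECT_BYTES]
    return total, oversized

# Hypothetical listing: three backup files, ~7 TiB in aggregate.
listing = [
    {"Key": "monthly/jan.vbk", "Size": 3 * 1024**4},
    {"Key": "monthly/feb.vbk", "Size": 3 * 1024**4},
    {"Key": "monthly/mar.vib", "Size": 1 * 1024**4},
]
total, oversized = check_listing(listing)
# Aggregate is ~7 TiB -- fine for the bucket; no single object breaks the cap.
```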
In an ideal scenario we would like to move some of these files onto the cheapest possible storage by archiving them into a Glacier vault after a certain time period. Is this possible at all in the current Veeam set-up, or does it break the logic Veeam uses for storing data in AWS?
Or is this only covered by selecting the IA option in the bucket set-up in Veeam?
Many Thanks,
Derrick
- Chief Product Officer
- Posts: 31816
- Liked: 7312 times
- Joined: Jan 01, 2006 1:01 am
- Location: Baar, Switzerland
Re: Amazon S3 Buckets and Glacier Archiving
Glacier has too many design peculiarities that make it incompatible with the capacity tier, so the IA option is the way to go. Thanks!
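For readers wondering what the unsupported path would even look like: outside of Veeam, S3 objects are moved to Glacier via a bucket lifecycle rule. The sketch below builds such a rule as a plain dict (the bucket name is hypothetical); attaching anything like it to a capacity tier bucket would move objects out from under Veeam, which is exactly the incompatibility described above, so this is illustration only.

```python
# Illustration only: an S3 lifecycle rule that would transition objects to
# Glacier after 90 days. Do NOT attach this to a Veeam capacity tier bucket;
# Veeam expects its objects to stay in the storage class it wrote them to.
glacier_rule = {
    "Rules": [
        {
            "ID": "archive-after-90-days",
            "Status": "Enabled",
            "Filter": {"Prefix": ""},  # matches every object in the bucket
            "Transitions": [
                {"Days": 90, "StorageClass": "GLACIER"},
            ],
        }
    ]
}

# With boto3 this would be applied as follows (left commented out on purpose):
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="example-archive-bucket",  # hypothetical bucket name
#     LifecycleConfiguration=glacier_rule,
# )
```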
- Veeam Software
- Posts: 2097
- Liked: 310 times
- Joined: Nov 17, 2015 2:38 am
- Full Name: Joe Marton
- Location: Chicago, IL
Re: Amazon S3 Buckets and Glacier Archiving
If you want to use Glacier, you'll need some sort of storage gateway (e.g. Amazon's, StarWind) which leverages VTL functionality: copying backups to virtual tape lands them in S3, and archiving those tapes moves them to Glacier.
Joe