Comprehensive data protection for all workloads
JHuston
Enthusiast
Posts: 36
Liked: 5 times
Joined: May 29, 2018 1:06 pm
Full Name: Jeff Huston
Contact:

Long term off site retention of larger data sets?

Post by JHuston »

I've been working on a backup refresh because our NetApp AltaVault, which handles our long-term backups to the cloud, has been end of life for what seems like six months now, and I can't nail down a good replacement. That appliance let us deduplicate 1 PB of data down to 80 TB and replicate it to the cloud for longer-term backups, which costs ~$1,500 a month to store in the Azure cool tier.

I was pumped for Veeam v10 as it introduced offloading to the cloud for SOBRs. I tested it and it works great! What I can't seem to work out is how much using it would cost on my full data set long term. I do use ReFS for my SOBR, so I see substantial savings there. Being conservative, let's say I get a 3:1 reduction from dedupe and/or ReFS savings going to the cloud; my monthly cost would still increase to $10-15K+, ouch.

We have around a 19 TB weekly full backup set, and we have to keep data off site for 7 years.
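To put rough numbers on that, here's a minimal back-of-the-envelope sketch. Everything in it is an assumption for illustration: the GFS point counts, the 3:1 reduction, and especially the per-GB price (check current Azure cool tier rates for your region); the helper names are made up, not from any vendor tool.

```python
# Back-of-the-envelope cloud storage cost sketch. All inputs are assumptions
# for illustration only; swap in your own retention schedule and pricing.

GB_PER_TB = 1024

weekly_full_tb = 19        # weekly full backup set size from the post
reduction_ratio = 3.0      # assumed dedupe/ReFS savings (3:1)
price_per_gb_month = 0.01  # ASSUMED $/GB-month; verify against real cool tier pricing

def retained_points(years=7, weeklies=8, monthlies=12):
    """Restore points kept under an assumed GFS-style schedule:
    recent weeklies + a year of monthlies + one yearly per year of retention."""
    return weeklies + monthlies + years

def monthly_cost_usd(points, full_tb, ratio, price_gb):
    """Storage kept in the cloud and the resulting monthly bill."""
    stored_tb = points * full_tb / ratio
    return stored_tb, stored_tb * GB_PER_TB * price_gb

points = retained_points()
stored_tb, cost = monthly_cost_usd(points, weekly_full_tb, reduction_ratio,
                                   price_per_gb_month)
print(f"{points} retained fulls -> ~{stored_tb:.0f} TB in the cloud "
      f"-> ~${cost:,.0f}/month at the assumed rate")
```

Obviously the real number depends heavily on how many fulls the GFS policy actually keeps, and this ignores API calls, egress, and per-region pricing differences.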

What is everyone else doing for long-term off-premises backups besides tape? I can't be on premises swapping tapes (ahem, COVID-19...).

StarWind has a VTL appliance that can tier from local disk to cloud and then to an archive tier. Does anyone use this for long-term retention? What is the performance like? I'm pretty sold on it, but I'm looking for some real-life reviews.

Quest QoreStor also seems like a good solution: it dedupes really well for a second on-premises copy and can tier that copy to the cloud after X days for long-term retention. However, in the Veeam Ready database there is a blurb saying that tiering features are not supported... The AltaVault listing in that database doesn't have the blurb even though it does the same thing; I'm not sure why that is.

Looking for input on what others are doing to get a pretty decent-sized data set off site, long term, without spending crazy money.

Thanks!
nitramd
Veteran
Posts: 297
Liked: 85 times
Joined: Feb 16, 2017 8:05 pm
Contact:

Re: Long term off site retention of larger data sets?

Post by nitramd »

That seems to be a thing with NetApp: they make interesting technology (AltaVault) and then end up abandoning it. I'd stick with Veeam, as it has a lot to offer, is easy to use, and they're always adding new features.

AWS also offers a VTL (the Tape Gateway flavor of AWS Storage Gateway).

Have you considered AWS Glacier for long term storage?
JHuston
Enthusiast
Posts: 36
Liked: 5 times
Joined: May 29, 2018 1:06 pm
Full Name: Jeff Huston
Contact:

Re: Long term off site retention of larger data sets?

Post by JHuston »

nitramd, I have looked into Glacier; the latest forum posts I found state it's not compatible with the capacity tier for object storage, which only supports AWS S3, Azure hot/cool, Wasabi, etc.

A VTL can tier to Glacier though. That was my thought: a StarWind VTL appliance that provides fast backup to local "tape" disk, copies it out to the cloud provider of my choice, and also archives to Glacier. I'm just looking for someone who can give feedback on whether they've used it like that with Veeam.
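For the "then to archive tier" leg specifically, the cloud side can be reasoned about on its own. Below is a minimal sketch assuming the VTL (or anything else) simply lands its off-site copies as objects in an S3 bucket you control; the bucket name and key prefix are made up, and this is not StarWind's or Veeam's own tiering mechanism, just the plain S3 lifecycle rule that would age such objects into Glacier and expire them after roughly seven years.

```python
# Sketch of the cloud-side aging step only: a lifecycle rule on an S3 bucket
# that transitions objects to Glacier after 30 days and expires them at ~7 years.
# Bucket name and key prefix are hypothetical.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-offsite-backups",          # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-vtl-copies",
                "Filter": {"Prefix": "vtl/"},  # hypothetical key prefix
                "Status": "Enabled",
                "Transitions": [
                    {"Days": 30, "StorageClass": "GLACIER"}  # age out after 30 days
                ],
                "Expiration": {"Days": 2555},  # roughly 7 years of retention
            }
        ]
    },
)
```

With something like that in place, the aging into Glacier happens on the bucket regardless of what wrote the objects; whether a given appliance is happy with its objects being transitioned underneath it is a separate question for the vendor.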
Gostev
Chief Product Officer
Posts: 31460
Liked: 6648 times
Joined: Jan 01, 2006 1:01 am
Location: Baar, Switzerland
Contact:

Re: Long term off site retention of larger data sets?

Post by Gostev »

Guys, we're actively working on adding Glacier support for the "keep data for 7 years off site" type of use case, so you won't go wrong by starting with S3 now.

By the way, I would question whether dedupe is a good idea for multi-year data archival in the first place, unless of course such archival is done "as a check box" and the real use case is "write once, read never" :D because using dedupe means that a single data corruption bug or storage-level data loss can render your entire archive useless.

Thanks!
