Discussions related to using object storage as a backup target.
mark49808
Enthusiast
Posts: 83
Liked: 14 times
Joined: Feb 02, 2017 6:31 pm
Contact:

Very long-term immutable backups

Post by mark49808 »

I'm reviewing the documentation but I can't quite wrap my head around this. I'm trying to figure out how to replace WORM tape with Veeam backups to disk and object storage (S3 Glacier or Wasabi).

I have requirements to store some data for up to 7 years due to industry regulations.

One thing I'm struggling with is how this would look from a Veeam perspective, and what it means in terms of requiring a lot of on-prem disk storage.

Can someone enlighten me? Do I need to store it on disk in the Performance Tier for 7 years, since I need "immutability"? Or is there a concept of "transferred immutability", where the data can move from Performance -> Capacity -> Archive tiers, all set with immutability, so I don't need to maintain full copies on the Performance Tier for 7 years?

Some of the data sets are massive, so the change rate is quite high (100% unique data in some cases, daily). I'd run out of room to store it very quickly if that's the case.

Looking for any guidance that can be provided.
Mildur
Product Manager
Posts: 10327
Liked: 2757 times
Joined: May 13, 2017 4:51 pm
Full Name: Fabian K.
Location: Switzerland
Contact:

Re: Very long-term immutable backups

Post by Mildur » 1 person likes this post

Hi Mark

The Archive Tier is the best solution for this. It can do exactly what you need.
You don't need to keep the entire retention on the Performance Tier. Veeam's SOBR takes care of moving the data to the next tier and removing it from the previous one. Configure your job to keep 7 yearly GFS backups (and of course short-term, weekly, and monthly ones), and Veeam will do everything for you :)

SOBR with Capacity and Archive Tier on AWS:

1 - Performance Tier: Local Storage
Keep backups for between one and a few weeks. You configure how long in the next step, when you add the Capacity Tier with the Move offload mode.

2 - Capacity Tier: Amazon S3
- Configure 1-2 months of Object Lock for the Capacity Tier, depending on how long you want to protect data on this tier.
- Use the Move offload mode (Copy can be enabled as well for instant offload; both policies can be used together) to move restore points off the Performance Tier after the operational restore window. Veeam removes the restore points from the Performance Tier in this step.

3 - Archive Tier: Amazon S3 Glacier
- Offload GFS restore points to the Archive Tier (Move) after 1-2 months.
- Enable immutability for their entire retention time.
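
As a rough sketch (this is not Veeam's actual logic, and the thresholds are just the example values from this post), the tier placement of a restore point under a Move policy looks roughly like:

```python
from datetime import timedelta

# Example thresholds from the post above (assumptions; tune to your own policy):
OPERATIONAL_WINDOW = timedelta(days=14)  # kept on the Performance Tier
ARCHIVE_AFTER = timedelta(days=60)       # GFS points move to the Archive Tier

def tier_for(age: timedelta, is_gfs: bool) -> str:
    """Return the tier a restore point of the given age would live on."""
    if age <= OPERATIONAL_WINDOW:
        return "Performance"   # still within the operational restore window
    if is_gfs and age > ARCHIVE_AFTER:
        return "Archive"       # only GFS restore points get archived
    return "Capacity"          # moved off-site, but still instantly accessible

print(tier_for(timedelta(days=7), is_gfs=False))    # Performance
print(tier_for(timedelta(days=30), is_gfs=False))   # Capacity
print(tier_for(timedelta(days=365), is_gfs=True))   # Archive
```

The key point the sketch illustrates: daily restore points never reach the Archive branch, because only GFS points qualify for archiving.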


Additional recommendation for the Performance Tier:
Use ReFS- or XFS-based backup repositories (FastClone), so your full backups don't consume their full size on disk.
Product Management Analyst @ Veeam Software

Re: Very long-term immutable backups

Post by mark49808 »

Thanks @Mildur, that seems to make sense. I'm still a bit lost on how I configure the retention policy in this case. "Configure your Job to keep 7 yearly GFS backups" — will that mean I'll be able to go back to a random Wednesday 3 years ago to restore, since the backup was pushed to object storage (Capacity/Archive)? Or will I be stuck with the yearly full of that year, and that's it (since I have a yearly full, and the 12 monthlies have already rolled off)?

It seems per the instructions I set the backup policy to, say, 14 days retention.
I then set the Capacity Tier in the SOBR to "move backups to object storage as they age out..." (older than 14 days).

So what happens after 14 days? Wouldn't the policy "clean up" things? If my requirement is to keep all the backups, but only keep a certain amount on the Performance Tier (due to cost), how do I do that? It seems GFS somewhat accomplishes that, but from my understanding it does not keep every backup. So going back to a random Tuesday 5 years ago is not possible, as at best I'll have the weekly full. No concept of a daily in GFS?

Or maybe i'm thinking through this wrong.

The context here is that I have a file system that I need to back up for regulatory purposes, and that file system is purged routinely, on the assumption that it has been sent to WORM tape. However, if I can't recover to any random day in the past, then I'll be losing data, since the source is gone. So this is more of an archiving question, I suppose.

Re: Very long-term immutable backups

Post by Mildur »

mark49808 wrote: "Will that mean I'll be able to go back to a random Wednesday 3 years ago to restore, since the backup was pushed to object storage (Capacity/Archive)? Or will I be stuck with the yearly full of that year, and that's it (since I have a yearly full, and the 12 monthlies have already rolled off)?"
GFS restore points are never restore points from random days. Your yearly GFS restore point is taken on the first Sunday of the year; that's the default setting for the yearly GFS restore point.
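
For illustration, the date of that default yearly GFS point (first Sunday of January) can be computed with a small Python sketch — this is just calendar arithmetic, not anything Veeam-specific:

```python
from datetime import date, timedelta

def first_sunday(year: int) -> date:
    """Date of the first Sunday in January of the given year."""
    jan1 = date(year, 1, 1)
    # weekday(): Monday == 0 ... Sunday == 6
    return jan1 + timedelta(days=(6 - jan1.weekday()) % 7)

print(first_sunday(2022))  # 2022-01-02
print(first_sunday(2024))  # 2024-01-07
```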

If you set short-term retention to 14 days, only 14-21 daily backups will be available to restore from the Performance and Capacity Tiers. Only GFS restore points are retained after those 14-21 days.
mark49808 wrote: "So going back to a random Tuesday 5 years ago is not possible, as at best I'll have the weekly full. No concept of a daily in GFS?"
Veeam has two retention policies:
Short-term retention: daily restore points
Long-term retention (GFS): weekly, monthly, and yearly restore points

There is indeed no daily GFS. If you want to keep 7 years of daily restore points, you would need to configure your short-term retention to 2555 days. I'm not sure that's even possible to configure.
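
The 2555-day figure is simply 7 × 365; a quick check (leap days would add one or two more):

```python
years = 7
days = years * 365
print(days)  # 2555

# Covering leap days too (at most 2 fall in any 7-year span):
print(days + 2)  # 2557
```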

On the Performance and Capacity Tiers, you can store daily and GFS restore points.
On the Archive Tier, you can only store GFS restore points.

With the Move policy and an operational restore window of 2-3 weeks, you could move daily restore points off the Performance Tier to the Capacity Tier, but not to the Archive Tier.
Weekly GFS has the shortest interval between two GFS restore points and can be moved to the Archive Tier.
mark49808 wrote: "The context here is that I have a file system that I need to back up for regulatory purposes, but that file system is purged routinely, assuming it has been sent to WORM tape. However, if I can't recover to any random day in the past, then I'll be losing data, since the source is gone. So this is more of an archiving question, I suppose."
It looks like the Veeam NAS backup feature could help you with that. But today it doesn't support immutability or backup copy jobs to object storage; only archived files (deleted files or older file versions) are copied to object storage.
NAS backup jobs (and NAS copy jobs) to object storage are planned for V12.

Re: Very long-term immutable backups

Post by mark49808 »

Thanks for the clarification. That's a disappointing limitation and unfortunately makes my use case a non-starter. WORM tape lives on then, as it seems to be the only way to really achieve this. Or I have to use a different product, as Veeam is not a good archiving solution for this specific use case. It's great for backups, but preserving and offloading data to object storage the way I want isn't quite going to work.

"On archive tier, you can only store GFS restore points." - seems like an odd limitation. Why not let me archive off the data that I want?

Re: Very long-term immutable backups

Post by Mildur »

I don't know exactly how S3 Glacier works.
But it's not a normal bucket where Veeam can access data blocks instantly. It's cheap, but has longer retrieval times. I assume that's the reason Veeam decided to only offload GFS restore points to the Archive Tier. Someone from Veeam probably has a better explanation than me.
It's best to wait for someone from Veeam to give feedback on that use case.

Re: Very long-term immutable backups

Post by mark49808 »

I suppose I could use a long retention time (2555 days) and move to object storage (Wasabi) after 14 days. But the pricing there isn't attractive... I was hoping for Glacier Deep Archive pricing, which would make this workable.

Would definitely appreciate a response from Veeam on how this could be configured, or how to do long-term archives where you need to persist every backup.
Gostev
Chief Product Officer
Posts: 32239
Liked: 7603 times
Joined: Jan 01, 2006 1:01 am
Location: Baar, Switzerland
Contact:

Re: Very long-term immutable backups

Post by Gostev »

Strangely, NAS backup jobs don't support Glacier as an archive repository. @Dima P., is it coming in V12 or later?
Dima P.
Product Manager
Posts: 14820
Liked: 1772 times
Joined: Feb 04, 2013 2:07 pm
Full Name: Dmitry Popov
Location: Prague
Contact:

Re: Very long-term immutable backups

Post by Dima P. »

NAS backup does not have GFS and runs in forever forward incremental mode, so Glacier support is planned for post-V12. Thanks!
dmason
Novice
Posts: 3
Liked: never
Joined: May 26, 2020 6:16 pm
Full Name: Don Mason
Contact:

Re: Very long-term immutable backups

Post by dmason »

I know this is an old thread, but I'm wondering if the OP was able to use Veeam & S3 to keep 7 years of immutable backups? We've got a similar need to keep 8 years of immutable backups. On the Veeam side, we're limited to only 999 days of immutability on the S3 repository. We're backing up 100TB+ from a couple of filers and don't think GFS is an option with that type of backup. Thanks in advance.
sfirmes
Veeam Software
Posts: 321
Liked: 150 times
Joined: Jul 24, 2018 8:38 pm
Full Name: Stephen Firmes
Contact:

Re: Very long-term immutable backups

Post by sfirmes »

@dmason have you looked into using the Archive Tier?
Steve Firmes | Senior Solutions Architect, Product Management - Alliances @ Veeam Software

Re: Very long-term immutable backups

Post by dmason »

Sorry @sfirmes, I did not see your reply. I think even with the Archive Tier, we're still not able to set 8 years of immutability on the S3 bucket properly, because of the 999-day limit in Veeam. Is that incorrect?

Re: Very long-term immutable backups

Post by Gostev »

Archived backups are automatically made immutable for the entire remaining duration of their retention policy. This is not configurable.