grant.gardner
Lurker
Posts: 2
Liked: never
Joined: May 23, 2024 10:22 am
Full Name: Grant Gardner

Retain data after server retired

Post by grant.gardner »

I have a large file server that is being replaced; it currently has about 2 TB of data stored in the Capacity Tier using GFS retention. What is the proper way to retain this old data forever in the Capacity Tier while deleting it from the local repository, as I need the space for the new server? I see that if I delete the backup job, it says it will keep the old data and orphan the disk. Does this orphan both the local and the Capacity Tier copies? If so, can I then delete the local orphaned copy and keep just the Capacity Tier copy? Data is copied/moved to the Capacity Tier immediately after backups.

I just set up a new job with a test server so I can see how things work, but I didn't know the recommended way of keeping these old backups without risking them getting deleted.

Thanks,

Grant
david.domask
Veeam Software
Posts: 2112
Liked: 509 times
Joined: Jun 28, 2016 12:12 pm

Re: Retain data after server retired

Post by david.domask »

Hi Grant, welcome to the forums.

It sounds like you want to enable the "Move backups to object storage as they age out of the operational restore window" option. This would move backups from the Performance Tier extents to the Capacity Tier and clean up the Performance Tier copies that have been moved.
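For reference, a minimal read-only sketch of confirming how the Capacity Tier policy is currently set on the scale-out repository, assuming the Veeam Backup & Replication PowerShell module (VBR v11 or later); "SOBR-01" is a placeholder repository name and nothing is changed:

```powershell
# Read-only check of a scale-out repository's Capacity Tier settings.
# Assumes the Veeam.Backup.PowerShell module on the backup server;
# "SOBR-01" is a placeholder name.
Import-Module Veeam.Backup.PowerShell

# Locate the scale-out backup repository that backs the file server job.
$sobr = Get-VBRScaleOutBackupRepository -Name "SOBR-01"

# Dump every property so the move/copy policy and the operational
# restore window can be confirmed before any manual cleanup.
$sobr | Format-List *
```

This only inspects the repository; it is meant to verify the move policy and restore window before deciding on any deletes.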

Before doing manual deletes, is this feasible for your environment?
David Domask | Product Management: Principal Analyst
grant.gardner
Lurker
Posts: 2
Liked: never
Joined: May 23, 2024 10:22 am
Full Name: Grant Gardner

Re: Retain data after server retired

Post by grant.gardner »

We already have that enabled. I want to know how to disable/delete the job while keeping all of the Capacity Tier data and removing all local data.
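As a non-destructive first step, a minimal sketch of disabling the retired server's job and taking inventory of what it produced, again assuming the Veeam PowerShell module; "Old File Server Backup" is a placeholder job/backup name and nothing is deleted:

```powershell
# Disable the retired server's job and list its restore points.
# Assumes the Veeam.Backup.PowerShell module (VBR v11+);
# "Old File Server Backup" is a placeholder name. Nothing is deleted here.
Import-Module Veeam.Backup.PowerShell

# Stop the job from running again without removing it or its backups.
$job = Get-VBRJob -Name "Old File Server Backup"
Disable-VBRJob -Job $job

# Inventory the restore points the job produced, oldest first, so the
# local vs. Capacity Tier copies can be reviewed before any cleanup.
$backup = Get-VBRBackup -Name "Old File Server Backup"
Get-VBRRestorePoint -Backup $backup | Sort-Object CreationTime
```

The sketch deliberately stops before any deletion, since the safe cleanup path depends on how the move/copy policy is configured, which is the open question in this thread.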