McClane
Expert
Posts: 106
Liked: 11 times
Joined: Jun 20, 2009 12:47 pm

GFS space requirements

Post by McClane »

Hi,

I'm working my way through the backup copy jobs and the GFS rotation schemes at the moment. The idea behind them is great, but as far as I understand it, if I want to keep 12 monthly backups, I will have 12 full backups by the time the year is over. Wouldn't it be better for archiving purposes if the monthly backups were incremental?
In v6.5 I had an additional backup job that ran incrementally once on the last day of the month. After 12 months, only about two full backups' worth of space was required. I could still do it that way, but then I won't get the benefit of backing up from the source only once.

Greetings,

Manuel
Gostev
Chief Product Officer
Posts: 31814
Liked: 7302 times
Joined: Jan 01, 2006 1:01 am
Location: Baar, Switzerland

Re: GFS space requirements

Post by Gostev »

Hi, I suppose you could create a Backup Copy job with a copy interval of 1 month and not enable GFS on it (just set the number of most recent restore points to keep to 12) to achieve what you want.

Generally speaking though, the GFS concept of data archiving specifically requires full backups so that each restore point is self-contained and self-sufficient for recovery. Incremental backups do not provide the same level of reliability, as a single bad incremental will ruin all dependent incrementals. That reliability comes at the cost of storage, of course.
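
To put rough numbers on that trade-off, here is a quick back-of-the-envelope calculation. The 1 TB full size and 10% monthly change rate below are purely illustrative assumptions, not measurements, but they show why an incremental chain ends up at roughly the "two fulls" of space you observed.

Code:

# Illustrative sketch only: compare 12 monthly fulls vs. 1 full + 11 monthly incrementals.
# $fullSizeGB and $changeRate are assumed example values, not measured data.
$fullSizeGB = 1024       # size of one full backup, in GB
$changeRate = 0.10       # fraction of data changed per month

$gfsFulls     = 12 * $fullSizeGB
$incrementals = $fullSizeGB + 11 * ($fullSizeGB * $changeRate)

"12 monthly fulls:         {0:N0} GB" -f $gfsFulls
"1 full + 11 incrementals: {0:N0} GB" -f $incrementals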

Thank you.
McClane
Expert
Posts: 106
Liked: 11 times
Joined: Jun 20, 2009 12:47 pm

Re: GFS space requirements

Post by McClane »

The archive is only for quick access to older backups from the last year. We also have copies for long-term storage.
I already thought about that, but the GUI only gives me the option to copy every minute, hour, or day. Is it possible to disable the job and let it sync every month through a PowerShell command?
Gostev
Chief Product Officer
Posts: 31814
Liked: 7302 times
Joined: Jan 01, 2006 1:01 am
Location: Baar, Switzerland

Re: GFS space requirements

Post by Gostev »

You could set 28 or 30 days, that should do it. Or PowerShell.
McClane
Expert
Posts: 106
Liked: 11 times
Joined: Jun 20, 2009 12:47 pm

Re: GFS space requirements

Post by McClane »

Ah, now I get it. I set it to 30 days; we will see what happens.
It seems I cannot start a disabled job via PowerShell. And I have not tried starting one after deleting everything in the schedule timetable.
tsightler
VP, Product Management
Posts: 6035
Liked: 2860 times
Joined: Jun 05, 2009 12:57 pm
Full Name: Tom Sightler

Re: GFS space requirements

Post by tsightler »

I was thinking about that as well. I think the easiest answer would be to create two scripts, one to disable the job and one to enable it. These are simple one-liners in PowerShell, like the following:
Disable Job

Code:

(Get-VBRJob -Name "<Job_Name>").DisableScheduler()
Enable Job

Code:

(Get-VBRJob -Name "<Job_Name>").EnableScheduler()
So my thought was to call the "Disable Job" script as a post-job activity from the Advanced settings of the Copy Job, then create a Windows Task Scheduler task that calls the enable script on the last day of the month (or the first day, whichever you prefer). The job would run through once, copying the most recent point, then disable itself when complete. This is all theory, I haven't tried it; I'll see about setting up a simple test in my lab.
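
For the Task Scheduler side, something along these lines should do it from an elevated prompt. The task name, start time, and script path are placeholders I made up, and I haven't verified the switches on every Windows version, so treat it as a sketch:

Code:

schtasks /Create /TN "Enable Veeam Copy Job" /SC MONTHLY /MO LASTDAY /M * /ST 21:00 /TR "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\Enable-CopyJob.ps1"

Here Enable-CopyJob.ps1 would simply contain the "Enable Job" one-liner above.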

I also think there may be an even easier way if you want to hit the exact start/end of the month: call a PowerShell script from the post-job activity that updates the next copy interval to fall on the start/end of the next month.
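
The date math for that is the easy part; how to write the value back into the copy job depends on the job object model of your Veeam version, so the sketch below leaves that step as a comment:

Code:

# Sketch only: work out the first and last day of next month.
$firstOfNextMonth = (Get-Date -Day 1).Date.AddMonths(1)
$lastOfNextMonth  = $firstOfNextMonth.AddMonths(1).AddDays(-1)

# e.g. $job = Get-VBRJob -Name "<Copy_Job_Name>"
# ...then set the job's next copy interval to $firstOfNextMonth (or $lastOfNextMonth)
# using whatever your Veeam version exposes for the copy job schedule.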
Bunce
Veteran
Posts: 259
Liked: 8 times
Joined: Sep 18, 2009 9:56 am
Full Name: Andrew
Location: Adelaide, Australia

Re: GFS space requirements

Post by Bunce » 1 person likes this post

McClane wrote:Hi,

I'm working my way through the backup copy jobs and the GFS rotation schemes at the moment. The idea behind them is great, but as far as I understand it, if I want to keep 12 monthly backups, I will have 12 full backups by the time the year is over. Wouldn't it be better for archiving purposes if the monthly backups were incremental?
In v6.5 I had an additional backup job that ran incrementally once on the last day of the month. After 12 months, only about two full backups' worth of space was required. I could still do it that way, but then I won't get the benefit of backing up from the source only once.
We're going to keep using separate jobs so we can use incrementals. Can't see any advantage in using this feature.

Either incrementals are safe or they aren't. We've got SureBackup to test them anyway, and it should be up to the customer to decide.

Maintaining a full backup for every monthly point adds substantial storage requirements and is never going to fly for us.