Comprehensive data protection for all workloads
beckermanex
Novice
Posts: 5
Liked: never
Joined: Mar 12, 2014 7:44 pm
Full Name: Erich Becker
Contact:

Just Added Cloud - Best Plan to Send Data to Glacier

Post by beckermanex »

Hi,

We just added the Cloud Edition to our installation. We tested it and loved the functionality, as our co-location backup repository is not living up to the standard I'd like. We'll keep using that repository, but having another copy of our data in another location is never a bad thing.

We have one main job that runs nightly backing up 16 VMs with a Synthetic Full every Sunday night. Each incremental is between 11 and 30GB and we're keeping 240 restore points.

What I want to do is send a full copy of our data offsite. The first seed of the data, which the incrementals are generated from, was created a long time ago.

What's the best way to send a full copy of everything offsite, then send the incrementals that run nightly, probably with a 20-restore-point purge on the job?

Thanks!

Erich
veremin
Product Manager
Posts: 20415
Liked: 2302 times
Joined: Oct 26, 2012 3:28 pm
Full Name: Vladimir Eremin
Contact:

Re: Just Added Cloud - Best Plan to Send Data to Glacier

Post by veremin »

beckermanex wrote:What I want to do is send a full copy of our data offsite. The first seed of the data, which the incrementals are generated from, was created a long time ago.
Hi, Erich,

With Cloud Edition in place, all you need to do is create a corresponding backup plan and specify the repository that stores the long backup chain as its "backup source".

Thanks.

Re: Just Added Cloud - Best Plan to Send Data to Glacier

Post by beckermanex »

I get that part, but by doing that, will I be capturing ALL files contained within ALL of my VMs? My current backup repository looks like it contains only the incrementals (no single backup file is over 45 GB, while some VMs are 500 GB). I just want to be sure I am uploading everything to the cloud properly, and when doing so, I want to set my purge settings so that only 30-40 restore points are available via Glacier, whereas I'd have 240 locally.

Re: Just Added Cloud - Best Plan to Send Data to Glacier

Post by veremin »

If you choose a repository as the backup source for a backup plan, the restore points present there will be copied to the cloud location. You can also use the purge options to control the number of file versions kept by Cloud Edition.

Thanks.

Re: Just Added Cloud - Best Plan to Send Data to Glacier

Post by beckermanex »

Let me phrase it slightly differently. I'm going to use my repository that has 240 points, set the Cloud Backup Job to start with March 1st, upload all backup files in the repository from then on to the cloud, and set a purge limit of 30 (so I'll have 12 uploaded and 18 more to upload as the month goes on). If a file was last updated on December 20th, how would a copy of that file get uploaded to the cloud if I'm only going to put up my last 12 incrementals moving forward? Sorry for asking the same question so many different ways/times; I just want to be sure I have everything uploaded correctly should we ever need it.

I'm guessing the answer is that it's not possible: with my scenario above, we'd only have the files that were updated over the last 12 days uploaded to the cloud.

Re: Just Added Cloud - Best Plan to Send Data to Glacier

Post by veremin »

beckermanex wrote:I'm going to use my repository that has 240 points, I'm going to set the Cloud Backup Job to start with March 1st,
If in the backup plan settings you choose to copy files modified since March 1st, then only those files will be copied to the chosen cloud. Otherwise, the whole repository will be transferred to the offsite location.
beckermanex wrote:If a file was last updated on December 20th, how would a copy of that file be uploaded to the cloud if I'm only going to put up my last 12 incrementals moving forward?
You can set the backup plan to copy files modified since December 20th.

When was the latest .vbk file created? On December 20th? I would recommend copying this file, as well as all increments depending on it, because copying only increments would leave you with nothing in case of disaster.
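The dependency is easy to see in a minimal sketch (not Veeam code; file names are made up). A forward-incremental chain is a full backup (.vbk) plus increments (.vib) that each depend on everything before them, so a restore point is usable only if the full and every increment up to that point made it offsite:

```python
# Sketch: why uploading only increments is useless in a disaster.

def can_restore(uploaded, point, chain):
    """True if `point` and everything it depends on made it to the cloud."""
    needed = chain[: chain.index(point) + 1]
    return all(f in uploaded for f in needed)

chain = ["sun.vbk", "mon.vib", "tue.vib", "wed.vib"]

print(can_restore(set(chain), "wed.vib", chain))               # True
print(can_restore({"tue.vib", "wed.vib"}, "wed.vib", chain))   # False: no .vbk offsite
```

With only the recent increments uploaded, no restore point in the cloud is usable at all.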

Thanks.

Re: Just Added Cloud - Best Plan to Send Data to Glacier

Post by beckermanex »

v.Eremin wrote:
When was the latest .vbk file created? On December 20th? I would recommend copying this file, as well as all increments depending on it, because copying only increments would leave you with nothing in case of disaster.

Thanks.
My last full backup file (.vbk) was created on 3/10. Would it be safe to upload that file and all incrementals after that date, in order to capture every file and then every change after it? But if I set a purge limit, once I reach that limit the full backup would be the first to drop off the server, correct? How can I stop this without having to keep EVERY restore point indefinitely?

Re: Just Added Cloud - Best Plan to Send Data to Glacier

Post by veremin »

How often is a full backup performed? If I were you, I would match the backup plan's purge settings with the retention settings of the source backup job. This way, you wouldn't be left with just a bunch of increments. Thanks.
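The failure mode being avoided can be sketched in a few lines (assumed behavior for illustration, not Veeam's actual purge logic): a purge that simply keeps the newest N files can drop the .vbk while increments that depend on it remain.

```python
# Sketch: a purge limit smaller than the interval between fulls
# can leave nothing but orphaned increments.

def purge_oldest(files, keep):
    """Naive purge: keep only the newest `keep` files."""
    return files[-keep:]

# A weekly chain: one full plus six daily increments, then a new full.
chain = (["w1.vbk"] + [f"w1_d{i}.vib" for i in range(1, 7)]
         + ["w2.vbk"] + [f"w2_d{i}.vib" for i in range(1, 3)])

# Purge limit smaller than the full-backup interval: no .vbk survives.
print(any(f.endswith(".vbk") for f in purge_oldest(chain, 2)))  # False

# Purge limit matched to the full-backup interval: a .vbk is always kept.
print(any(f.endswith(".vbk") for f in purge_oldest(chain, 7)))  # True
```

Matching the purge count to (or above) the source job's retention per full-backup cycle is what keeps at least one complete chain in the cloud.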

Re: Just Added Cloud - Best Plan to Send Data to Glacier

Post by beckermanex »

v.Eremin wrote:How often is a full backup performed? If I were you, I would match the backup plan's purge settings with the retention settings of the source backup job. This way, you wouldn't be left with just a bunch of increments. Thanks.
We are running this full job as Incremental with Synthetic Fulls (forever-incremental) created on Sunday nights.

Should I be using reversed incremental for this job so that I have a full backup at the most recent state, or just schedule an Active Full Backup for the 1st of the month and match that to my purge settings?
foggy
Veeam Software
Posts: 21139
Liked: 2141 times
Joined: Jul 11, 2011 10:22 am
Full Name: Alexander Fogelson
Contact:

Re: Just Added Cloud - Best Plan to Send Data to Glacier

Post by foggy »

Scheduling at least monthly active fulls is a good idea in itself, regardless of the cloud backups.

Re: Just Added Cloud - Best Plan to Send Data to Glacier

Post by veremin »

beckermanex wrote:We are running this full job as Incremental with Synthetic Fulls (forever-incremental) created on Sunday nights.
So the latest full backup was created on March 9th, right? And the previous one on March 2nd? If so, you can set the backup plan to start copying files modified since March 2nd.

If I got everything correctly, the full backup is created on a weekly basis, so with the described scenario (purge files older than 30 days) you won't face a situation where there is nothing but increments stored in the cloud - there will always be some full backups, and depending increments, that you can restore from.
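A quick arithmetic check confirms this (a toy model of the schedule, not Veeam's actual purge logic): with a full every 7 days and a 30-day purge window, the retained window always spans several fulls.

```python
# Sketch: one backup file per day; every 7th day is a full (.vbk).
# Purge removes files older than 30 days. Count the fulls that remain.

def retained(days_run, purge_age=30, full_every=7):
    files = [("vbk" if d % full_every == 0 else "vib", d)
             for d in range(days_run)]
    return [(kind, d) for kind, d in files if days_run - d <= purge_age]

fulls = [d for kind, d in retained(100) if kind == "vbk"]
print(fulls)  # [70, 77, 84, 91, 98] -- the 30-day window always holds fulls
```

Any 30-day window contains at least four weekly fulls, so the cloud copy can never degrade to increments only under this schedule.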

Thanks.