Comprehensive data protection for all workloads
gkennedy
Influencer
Posts: 10
Liked: 10 times
Joined: Aug 01, 2013 3:48 am
Full Name: Gavin Kennedy

Large Cloud Backup Issue

Post by gkennedy »

Hi all,

I have a case open with support (02057156) about our cloud backup job failing to run the incremental.

The initial full backup works perfectly and is around 8 TB in size. Support are saying the following:

"the full backup is too big to be opened again. That's why Veeam writes it successfully, but fails to use it for incremental job runs (Veeam needs to append metadata to it about incrementals). This is also to do with encryption and how we interact with linux repositories and linux limitations, so there's no easy fix right now".

Support and my cloud provider have asked me to break my job up into 10 Veeam jobs to get the size down.

I'm just not buying it. Does anyone out there have large cloud backup jobs similar to ours that are working fine?

To me, the impact on deduplication would be reasonably significant, considering our VMs contain a fair amount of similar data.

Can anyone chime in on this? What are the actual limitations referred to above for full backup file size? Knowing that, I could plan the backup jobs properly, rather than simply "creating 10 jobs", which seems like an arbitrary number.
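
Just to illustrate what I mean by planning rather than guessing: if support could tell me the real per-file limit, I could group VMs into jobs with something like the rough sketch below. It's Python with completely made-up VM names and sizes, and the 1 TB cap is just a placeholder until the actual limit is confirmed.

    TARGET_FULL_SIZE_TB = 1.0  # hypothetical per-job cap, NOT an official Veeam limit

    vms = {  # hypothetical VM names and provisioned sizes in TB
        "sql01": 1.2, "fileserver01": 2.5, "exch01": 0.9,
        "app01": 0.4, "app02": 0.4, "web01": 0.2, "web02": 0.2,
    }

    def plan_jobs(vms, cap_tb):
        """Greedy first-fit decreasing: put each VM into the first job it fits in."""
        jobs = []  # each job is a list of (vm_name, size_tb)
        for name, size in sorted(vms.items(), key=lambda kv: kv[1], reverse=True):
            for job in jobs:
                if sum(s for _, s in job) + size <= cap_tb:
                    job.append((name, size))
                    break
            else:
                # no existing job has room (or the VM alone exceeds the cap)
                jobs.append([(name, size)])
        return jobs

    for i, job in enumerate(plan_jobs(vms, TARGET_FULL_SIZE_TB), start=1):
        total = sum(s for _, s in job)
        print(f"Job {i}: {total:.1f} TB -> {[n for n, _ in job]}")

That would at least give me a job count driven by the actual limit instead of a round number.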

Shouldn't there be some handling built in, rather than writing a massive file over the cloud that it knows it won't be able to use?

Cheers,
Gavin
jdavidson_waters
Service Provider
Posts: 43
Liked: 15 times
Joined: May 07, 2013 2:50 pm
Full Name: James Davidson
Location: Northeast UK

Re: Large Cloud Backup Issue

Post by jdavidson_waters » 1 person likes this post

Per-VM backup files would help you here. Maybe ask your Service Provider if they can provide you with a cloud repo that has this setting enabled.
@jam_davidson
Gostev
Chief Product Officer
Posts: 31460
Liked: 6648 times
Joined: Jan 01, 2006 1:01 am
Location: Baar, Switzerland

Re: Large Cloud Backup Issue

Post by Gostev »

According to what you've quoted, the issue is not with the backup file size itself (8 TB is pretty small, to be honest), but rather with the metadata size. It looks like a large amount of application metadata is causing issues. Per-VM backup file chains should indeed help.
gkennedy
Influencer
Posts: 10
Liked: 10 times
Joined: Aug 01, 2013 3:48 am
Full Name: Gavin Kennedy

Re: Large Cloud Backup Issue

Post by gkennedy »

Thanks for that.

The cloud provider does not want to enable per-VM backup files because they'd lose dedupe for all of their other customers, and they do not seem prepared to create a repository just for us. Dedupe is also a factor for us, as we obviously pay per GB for the repository and have many GB of similar data across our VMs.
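
For what it's worth, this is the kind of back-of-envelope maths I'm doing on the cost side. Every figure here (source size, dedupe ratios, per-GB price) is a made-up placeholder, not a measurement from our environment:

    source_tb      = 16.0   # assumed raw source data across all VMs
    ratio_shared   = 2.0    # assumed reduction with one shared backup file (~8 TB full)
    ratio_per_vm   = 1.6    # assumed reduction if dedupe only works within each VM/job
    price_gb_month = 0.05   # hypothetical per-GB monthly price

    shared_tb = source_tb / ratio_shared
    per_vm_tb = source_tb / ratio_per_vm
    extra_gb  = (per_vm_tb - shared_tb) * 1024

    print(f"Shared chain : {shared_tb:.1f} TB stored")
    print(f"Per-VM chains: {per_vm_tb:.1f} TB stored")
    print(f"Extra        : {extra_gb:.0f} GB (roughly {extra_gb * price_gb_month:.0f} per month)")

Even with modest differences between those ratios, the gap adds up on a repository we pay for per GB, which is why splitting doesn't feel free to me.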

Is this an "issue" or a "limitation", whereby the metadata cannot be read or written once it reaches a certain size?

How do I proceed? Split my job into 10 jobs? Seems crazy.

Cheers
Gavin