Hi all,
I have a case open with support (02057156) about our cloud backup job failing to run the incremental.
The initial full backup works perfectly, and is around 8TB in size. Support are saying the following:
"the full backup is too big to be opened again. That's why Veeam writes it successfully, but fails to use it for incremental job runs (Veeam needs to append metadata to it about incrementals). This is also to do with encryption and how we interact with linux repositories and linux limitations, so there's no easy fix right now".
Support and my cloud provider have asked me to break my job up into 10 Veeam jobs to get the size down.
I'm just not buying it. Does anyone out there have large cloud backup jobs similar to ours that are working fine?
To me, the implications for deduplication would be reasonably significant, considering our VMs have a fair amount of similar data.
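Just to put rough numbers on that, here's the kind of back-of-envelope maths I'm doing on the dedupe hit from splitting one job into ten. Every figure below is an illustrative assumption for the sake of the sketch, not a measurement from our environment:

```python
# Back-of-envelope estimate of the dedupe penalty when one backup job is
# split into several. Every number here is an illustrative assumption,
# not a measurement from our environment.

TOTAL_VMS = 100        # assumed VM count in the job
VM_SIZE_GB = 80        # assumed average used data per VM
SHARED_FRACTION = 0.7  # assumed fraction of each VM that is common data
                       # (OS, patches, app binaries) dedupable across VMs

def stored_gb(num_jobs):
    """Rough repo usage if VMs are spread evenly across num_jobs jobs,
    assuming dedupe only works within a single job's backup file."""
    vms_per_job = TOTAL_VMS / num_jobs
    unique_per_vm = VM_SIZE_GB * (1 - SHARED_FRACTION)
    shared_per_job = VM_SIZE_GB * SHARED_FRACTION  # stored once per job
    return num_jobs * (vms_per_job * unique_per_vm + shared_per_job)

for jobs in (1, 10):
    print(f"{jobs:>2} job(s): ~{stored_gb(jobs):,.0f} GB on the cloud repo")

# 1 job:   100*24 + 56     = 2,456 GB
# 10 jobs: 10*(10*24 + 56) = 2,960 GB  -> roughly 20% more, purely from
# losing cross-job dedupe under these assumptions.
```

The exact percentage obviously depends on how similar the VMs really are, but since we pay per GB, anything in that ballpark matters to us.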
Can anyone chime in on this? What are the actual limitations referred to above on full backup file size? Knowing that, I could plan the backup jobs properly, rather than simply "creating 10 jobs", which seems like an arbitrary number.
Shouldn't there be some handling built in, rather than Veeam writing a massive file across the cloud link that it already knows it won't be able to use?
Cheers,
Gavin
- Influencer
- Posts: 10
- Liked: 10 times
- Joined: Aug 01, 2013 3:48 am
- Full Name: Gavin Kennedy
- Service Provider
- Posts: 43
- Liked: 15 times
- Joined: May 07, 2013 2:50 pm
- Full Name: James Davidson
- Location: Northeast UK
Re: Large Cloud Backup Issue
Per-VM backup files would help you here. Maybe ask your Service Provider if they can provide you with a cloud repo that has this setting enabled.
@jam_davidson
- Chief Product Officer
- Posts: 31814
- Liked: 7302 times
- Joined: Jan 01, 2006 1:01 am
- Location: Baar, Switzerland
Re: Large Cloud Backup Issue
According to what you've quoted, the issue is not with the backup file size itself (8TB is pretty small to be honest), but rather metadata size. Looks like a large amount of application metadata is causing issues. Per-VM backup file chains should help indeed.
- Influencer
- Posts: 10
- Liked: 10 times
- Joined: Aug 01, 2013 3:48 am
- Full Name: Gavin Kennedy
Re: Large Cloud Backup Issue
Thanks for that.
The cloud provider does not want to enable Per-VM backup files because they'd lose dedupe for all of their other customers. They do not seem prepared to create a repository just for us. Dedupe is also a factor for us, as we obviously pay per GB for the repository, and have many GB of similar VMs.
Is this an "issue" or a "limitation", i.e. the metadata simply cannot be read/written once it reaches a certain size?
How do I proceed? Split my job into 10 jobs? Seems crazy.
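If it really does come to splitting, I'd at least want to plan it rather than carve things into ten arbitrary jobs. Something along these lines is what I have in mind: keep VMs built from the same template in the same job, so intra-job dedupe still helps, and balance the jobs by size. The VM names, sizes and groups below are made up purely for illustration:

```python
# Sketch of a planned split: bundle VMs by template/OS group (so intra-job
# dedupe still helps), then balance the bundles across jobs by size.
# VM names, sizes and groups are made up for illustration.

from collections import defaultdict

# (vm_name, used_gb, template/OS group) -- hypothetical inventory
vms = [
    ("app01",  120,  "win-app"),  ("app02", 110, "win-app"),
    ("sql01",  900,  "win-sql"),  ("sql02", 850, "win-sql"),
    ("file01", 1500, "fileserver"),
    ("web01",  60,   "linux-web"), ("web02", 60, "linux-web"),
    ("dc01",   80,   "win-dc"),
]

def plan_jobs(vms, num_jobs):
    """Greedy: bundle VMs by template group, then drop the largest bundles
    into whichever job is currently smallest."""
    groups = defaultdict(list)
    for vm in vms:
        groups[vm[2]].append(vm)
    bundles = sorted(groups.values(), key=lambda b: -sum(v[1] for v in b))

    jobs = [{"vms": [], "gb": 0} for _ in range(num_jobs)]
    for bundle in bundles:
        target = min(jobs, key=lambda j: j["gb"])
        target["vms"].extend(v[0] for v in bundle)
        target["gb"] += sum(v[1] for v in bundle)
    return jobs

for i, job in enumerate(plan_jobs(vms, 4), 1):
    print(f"Job {i}: {job['gb']:>5} GB  {', '.join(job['vms'])}")
```

Even then, the number of jobs feels like something that should fall out of whatever the real size/metadata limit is, not the other way around.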
Cheers
Gavin