BCJ compression level, and Repository dedupe settings

Hyper-V specific discussions

by graycj » Tue Aug 18, 2015 3:01 pm

Hello Veeam!

We're using a backup copy job (BCJ) to back up to an offsite 2012 R2 server, which is slowly running out of space, and we're considering enabling deduplication on it.
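For reference, deduplication on a Windows Server 2012 R2 repository volume is typically enabled with the built-in Data Deduplication cmdlets. A minimal sketch, assuming the repository lives on an E: volume (the drive letter and the one-day minimum file age are assumptions for illustration):

```powershell
# Install the Data Deduplication role service (one-time, per server)
Install-WindowsFeature -Name FS-Data-Deduplication

# Enable dedupe on the repository volume (E: is an assumed drive letter)
Enable-DedupVolume -Volume "E:"

# Only optimize files older than 1 day, so backup files still being
# written by the BCJ are skipped
Set-DedupVolume -Volume "E:" -MinimumFileAgeDays 1

# Kick off an optimization job manually, then check the savings
Start-DedupJob -Volume "E:" -Type Optimization
Get-DedupStatus -Volume "E:"
```

By default optimization runs on a background schedule; the manual `Start-DedupJob` call above is just a way to process the existing backlog immediately.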

I'm wondering what the implications will be for the current BCJ jobs and the backup files which are already in the repository if I make the following changes:

Change the compression level for the BCJ job from Auto to Dedupe-friendly.

Enable the repository storage compatibility settings to align backup file data blocks and to decompress backup data blocks before storing.

I assume that the BCJ will carry on as normal, all existing backup files will remain as-is, and future files will be more dedupe-friendly, with better storage efficiency?

Thanks for your help,
Carl
graycj
Service Provider
 
Posts: 7
Liked: 1 time
Joined: Thu Jan 07, 2010 10:57 am
Location: UK
Full Name: Carl Gray

Re: BCJ compression level, and Repository dedupe settings

by foggy » Tue Aug 18, 2015 4:28 pm

Carl, there's no sense in changing the compression level in the job if you enable decompression on the repository. However, just FYI: if you change the compression level in the backup copy job (and keep the repository settings intact), new incrementals will contain data blocks compressed according to the new setting, as you've correctly assumed. The full backup will contain a mix of blocks with different compression levels once the merge process occurs.
foggy
Veeam Software
 
Posts: 14762
Liked: 1083 times
Joined: Mon Jul 11, 2011 10:22 am
Full Name: Alexander Fogelson

Re: BCJ compression level, and Repository dedupe settings

by PTide » Wed Aug 19, 2015 11:43 am

Hi,

May I ask what the source of your BCJ is? Please keep in mind that the "Auto" compression level means the BCJ inherits the compression level of the parent backup job. Under some circumstances (e.g. when the source contains backups with different compression ratios), the "Auto" option may generate more traffic between sites than a fixed setting would. A better choice than "dedupe-friendly" would be "optimal", because it can significantly reduce the amount of data transferred over the network while adding little CPU overhead compared to "dedupe-friendly". Of course, everything depends on your environment, so please refer to this explanation.

Thank you.
PTide
Veeam Software
 
Posts: 3022
Liked: 247 times
Joined: Tue May 19, 2015 1:46 pm

