If you had unlimited storage, would disabling compression..

by jbarrow.viracoribt » Fri Nov 20, 2015 1:14 pm

If you had unlimited storage, would disabling compression be a way to speed up your job creation and recovery time? I mean, it's just like creating or extracting a zip file: the higher the compression level, the longer the process takes.
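
To put rough numbers on that zip analogy, here is a quick Python sketch using the standard zlib module (the sample data and sizes are made-up test values, nothing Veeam-specific):

    import os
    import time
    import zlib

    # ~10 MB of mixed sample data: partly random, partly repetitive.
    data = os.urandom(5 * 1024 * 1024) + b"backup block " * 400_000

    for level in (1, 6, 9):
        start = time.perf_counter()
        compressed = zlib.compress(data, level)
        elapsed = time.perf_counter() - start
        print("level %d: %9d bytes in %.3f s" % (level, len(compressed), elapsed))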

Re: If you had unlimited storage, would disabling compressio

by Shestakov » Fri Nov 20, 2015 1:34 pm

Yes, that's why we recommend disabling in-job compression if you use a dedupe repository. But if you want to speed up your jobs, it's better to start with a bottleneck analysis.
Thanks!

Re: If you had unlimited storage, would disabling compressio

by dellock6 » Sun Nov 22, 2015 9:24 am · 2 people like this post

I wouldn't be so sure that disabling compression improves performance... let me explain.

These days CPUs are really powerful, and compression algorithms like LZ4 (the one Veeam uses by default) are light on the CPU. On the other hand, spinning disks are usually the slowest component in the entire datacenter, orders of magnitude slower than CPU, memory or even network.
So, by enabling compression, and assuming a 2x data reduction from it, you write half the data to the slowest component of the infrastructure, thus lowering the load on it.
On restore, the same concept applies: you read half the amount of data from the storage, and a fast CPU can decompress the blocks as you read them, which is usually quicker than reading twice the amount of uncompressed data.
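
A rough back-of-envelope sketch of that reasoning in Python (all the throughput figures below are made-up assumptions to show the arithmetic, not measurements of any particular setup):

    # Illustrative assumptions, not measured values.
    disk_write_mbps = 200.0     # sustained write speed of a spinning-disk repository
    cpu_compress_mbps = 2000.0  # LZ4-class compression throughput of one modern core
    backup_gb = 500.0           # amount of source data to move
    ratio = 2.0                 # ~2x reduction from compression, as assumed above

    def job_seconds(gb, ratio, disk_mbps, cpu_mbps=None):
        # Compression and disk writes overlap, so the slower stage dominates.
        mb = gb * 1024
        write_time = (mb / ratio) / disk_mbps
        compress_time = mb / cpu_mbps if cpu_mbps else 0.0
        return max(write_time, compress_time)

    print("no compression: %.0f s" % job_seconds(backup_gb, 1.0, disk_write_mbps))
    print("2x compression: %.0f s" % job_seconds(backup_gb, ratio, disk_write_mbps, cpu_compress_mbps))

With those assumed numbers the job is bottlenecked by the disk either way, so writing half the data roughly halves the job time; only if the CPU became slower than the (halved) disk writes would compression start to hurt.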

Sending uncompressed data to dedupe devices is a corner case: those devices work better with uncompressed data and can reach better dedupe ratios.
Luca Dell'Oca
EMEA Cloud Architect @ Veeam Software

@dellock6
http://www.virtualtothecore.com
vExpert 2011-2012-2013-2014-2015-2016
Veeam VMCE #1

Re: If you had unlimited storage, would disabling compressio

by Gostev » Sun Nov 22, 2015 1:21 pm · 1 person likes this post

Luca is almost correct, except in the last line.

Shestakov wrote: Yes, that's why we recommend disabling in-job compression if you use a dedupe repository. But if you want to speed up your jobs, it's better to start with a bottleneck analysis. Thanks!

That is not correct. Even with a dedupe repository, our recommendation is to keep the default compression enabled in the job, and use the "decompress before saving" option on the repository instead.

jbarrow.viracoribt wrote: The higher the compression level, the longer the process takes.

That is correct: a higher compression level will slow down the job. However, the default level is specifically optimized for low CPU usage and fast processing, so you essentially get a 2x reduction of the data that needs to be moved around almost "for free". That is why you want to keep it enabled in all scenarios.
What's New in v7 wrote: Hardware-accelerated compression. A new default compression level with a proprietary algorithm implementation leverages advanced CPU instruction sets (SSE extensions). This reduces backup proxy CPU usage up to 10 times compared to the previous default compression level.
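
To illustrate why an LZ4-class default is nearly free on the CPU compared to heavier algorithms, here is a small Python sketch (it assumes the third-party lz4 package is installed; it shows the general trade-off only, not Veeam's actual implementation):

    import os
    import time
    import zlib

    import lz4.frame  # third-party package: pip install lz4

    # Arbitrary ~32 MB test buffer: partly random, partly repetitive.
    data = os.urandom(16 * 1024 * 1024) + b"A" * (16 * 1024 * 1024)

    def timed(label, compress):
        start = time.perf_counter()
        out = compress(data)
        elapsed = time.perf_counter() - start
        print("%-32s %.2fx of original size in %.2f s" % (label, len(out) / len(data), elapsed))

    timed("lz4 (fast, light on CPU):", lz4.frame.compress)
    timed("zlib level 9 (smaller, slower):", lambda d: zlib.compress(d, 9))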

Re: If you had unlimited storage, would disabling compressio

by Delo123 » Mon Nov 23, 2015 7:20 pm · 1 person likes this post

Gostev wrote:
Shestakov wrote: Yes, that's why we recommend disabling in-job compression if you use a dedupe repository. But if you want to speed up your jobs, it's better to start with a bottleneck analysis. Thanks!

That is not correct. Even with a dedupe repository, our recommendation is to keep the default compression enabled in the job, and use the "decompress before saving" option on the repository instead.


I can confirm that default compression plus "decompress before saving" gives the best performance and savings on MS 2012 R2 dedupe repositories.

