jbarrow.viracoribt
Expert
Posts: 184
Liked: 18 times
Joined: Feb 15, 2013 9:31 pm
Full Name: Jonathan Barrow
Contact:

If you had unlimited storage, would disabling compression..

Post by jbarrow.viracoribt »

If you had unlimited storage, would disabling compression be a way to speed up your backup jobs and recovery times? I mean, it's just like creating or extracting a zip file: the higher the compression level, the longer the process takes.
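To make the analogy concrete, here is a minimal sketch in plain Python using the standard zlib module (nothing Veeam-specific; the data is synthetic, so real ratios and timings will differ), showing that higher compression levels take longer in exchange for a better ratio:

Code: Select all

import time
import zlib

# Synthetic, moderately compressible data; a stand-in for real VM blocks,
# which compress differently.
data = b"some repeating virtual machine disk content " * 50000

for level in (1, 6, 9):
    start = time.perf_counter()
    compressed = zlib.compress(data, level)
    elapsed = time.perf_counter() - start
    print(f"level {level}: {elapsed:.3f}s, ratio {len(data) / len(compressed):.1f}x")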
Shestakov
Veteran
Posts: 7328
Liked: 781 times
Joined: May 21, 2014 11:03 am
Full Name: Nikita Shestakov
Location: Prague
Contact:

Re: If you had unlimited storage, would disabling compression..

Post by Shestakov »

Yes, that's why we recommend disabling in-job compression if you use a dedupe repository. But if you want to speed up your job, it's better to start with bottleneck analysis.
Thanks!
dellock6
VeeaMVP
Posts: 6137
Liked: 1928 times
Joined: Jul 26, 2009 3:39 pm
Full Name: Luca Dell'Oca
Location: Varese, Italy
Contact:

Re: If you had unlimited storage, would disabling compression..

Post by dellock6 » 2 people like this post

I wouldn't be so sure that disabling compression improves performance... let me explain.

These days, CPUs are really powerful, and compression algorithms like LZ4 (the one Veeam uses by default) are light on the CPU. On the other hand, spinning disks are usually the slowest component in the entire datacenter, orders of magnitude slower than CPU, memory, or even the network.
So, by enabling compression, and assuming a 2x data reduction from it, you write half as much data to the slowest component of the infrastructure, lowering the load on it.
On restore, the same reasoning applies: you read half as much data from the storage, and a fast CPU can decompress the blocks as you read them, which is usually quicker than reading twice as much uncompressed data.
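As a back-of-envelope model of this point (every figure below is an illustrative assumption, not a measurement from any real proxy or repository):

Code: Select all

# Assumed figures: a slow spinning-disk target, a fast lz4-class compressor.
disk_write_mb_s = 200.0      # assumed repository disk write speed
compress_mb_s = 1500.0       # assumed single-core lz4-class throughput
ratio = 2.0                  # assumed 2x data reduction
source_gb = 500.0            # assumed amount of data to back up

# Uncompressed: every byte hits the slow disk.
t_uncompressed = source_gb * 1024 / disk_write_mb_s

# Compressed: CPU and disk work as a pipeline; the slower stage dominates.
t_compressed = max(source_gb * 1024 / compress_mb_s,
                   (source_gb / ratio) * 1024 / disk_write_mb_s)

print(f"uncompressed: {t_uncompressed / 60:.0f} min")  # ~43 min
print(f"compressed:   {t_compressed / 60:.0f} min")    # ~21 min

With these assumed numbers the disk remains the bottleneck even with compression enabled, and the job finishes in roughly half the time.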

Sending uncompressed data to dedupe devices is a corner case: those devices work better with uncompressed data and can reach better dedupe ratios.
Luca Dell'Oca
Principal EMEA Cloud Architect @ Veeam Software

@dellock6
https://www.virtualtothecore.com/
vExpert 2011 -> 2022
Veeam VMCE #1
Gostev
Chief Product Officer
Posts: 31460
Liked: 6648 times
Joined: Jan 01, 2006 1:01 am
Location: Baar, Switzerland
Contact:

Re: If you had unlimited storage, would disabling compression..

Post by Gostev » 1 person likes this post

Luca is almost correct, except for the last line.
Shestakov wrote: Yes, that's why we recommend disabling in-job compression if you use a dedupe repository. But if you want to speed up your job, it's better to start with bottleneck analysis. Thanks!
That is not correct. Even with a dedupe repository, our recommendation is to keep the default compression enabled in the job, and use the "decompress before saving" option on the repository instead.
jbarrow.viracoribt wrote: The higher the compression level, the longer the process takes.
That is correct: a higher compression level will slow down the job. However, the default level is specifically optimized for low CPU usage and fast processing, so you essentially get a 2x reduction of the data that needs to be moved around almost "for free". You do want to keep it enabled in all scenarios.
What's New in v7 wrote: Hardware-accelerated compression. A new default compression level with a proprietary algorithm implementation leverages advanced CPU instruction sets (SSE extensions). This reduces backup proxy CPU usage by up to 10 times compared to the previous default compression level.
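To get a rough feel for why an lz4-class algorithm is nearly "free" on a modern CPU compared to classic deflate, here is a quick comparison using the third-party lz4 Python package against the standard zlib; Veeam's actual implementation is proprietary, so treat the numbers as indicative only:

Code: Select all

import time
import zlib

import lz4.frame  # third-party package, assumed installed via `pip install lz4`

data = b"block of fairly compressible backup data " * 100000

def mb_per_s(compress):
    # Time one compression pass and return throughput in MB/s.
    start = time.perf_counter()
    compress(data)
    return len(data) / (time.perf_counter() - start) / 1e6

print(f"lz4:  {mb_per_s(lz4.frame.compress):7.0f} MB/s")
print(f"zlib: {mb_per_s(lambda d: zlib.compress(d, 6)):7.0f} MB/s")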
Delo123
Veteran
Posts: 361
Liked: 109 times
Joined: Dec 28, 2012 5:20 pm
Full Name: Guido Meijers
Contact:

Re: If you had unlimited storage, would disabling compression..

Post by Delo123 » 1 person likes this post

Gostev wrote:
Shestakov wrote: Yes, that's why we recommend disabling in-job compression if you use a dedupe repository. But if you want to speed up your job, it's better to start with bottleneck analysis. Thanks!

That is not correct. Even with a dedupe repository, our recommendation is to keep the default compression enabled in the job, and use the "decompress before saving" option on the repository instead.
I can confirm that default compression plus "decompress before saving" gives the best performance and savings on MS 2012 R2 dedupe appliances.