mmonroe
Enthusiast
Posts: 75
Liked: 3 times
Joined: Jun 16, 2010 8:16 pm
Full Name: Monroe

Backup Copy Job - Compression Suggestions

Post by mmonroe »

We currently use the Backup Copy Job function to create an alternate set of backup files on a Synology NAS. The files are then mirrored out to Amazon S3 each night. I have plenty of room on the NAS and the speed has been excellent. The time to copy out to S3 has been pretty good as well.

That being said, I got to thinking (oh no) that maybe I should consider tinkering with the compression settings on the Backup Copy Job to see if a greater reduction in size could be gained without too much loss in backup time. Smaller backup copy job files would mean quicker S3 copies and lower S3 charges due to reduced storage.
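Just to put rough numbers on it, here is a quick back-of-envelope sketch in Python. The set size and S3 rate are made-up placeholders, not our actual figures:

Code: Select all

set_size_gb = 500                    # assumed size of the backup copy set
s3_price_per_gb_month = 0.023        # assumed S3 Standard rate, USD per GB-month

for reduction in (0.05, 0.10, 0.15):     # 5%, 10%, 15% extra compression
    saved_gb = set_size_gb * reduction
    print(f"{reduction:.0%} smaller -> {saved_gb:.0f} GB less stored, "
          f"~${saved_gb * s3_price_per_gb_month:.2f}/month saved")

Not a huge dollar amount at this scale, but the shorter nightly copy window is the part I care about most.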

We currently have the default of "auto" for the backup copy jobs. I am not sure just what "auto" does. Does it go to "extreme" if the server speed is good? Or is "auto" a fixed level of compression and dedup?

I am running some test Backup Copy Jobs to see the size and time differences between auto, optimal, high, and extreme. I plan to watch CPU usage while they run. Since it is copying to a NAS, I suspect the limiting factor will be the network rather than the primary server or the disk arrays.
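In case it is useful to anyone running similar tests, here is a minimal Python sketch I am using to total up the backup files per test folder so the levels can be compared side by side. The share path and folder names are just placeholders for my layout:

Code: Select all

import os

REPO_ROOT = r"\\synology\backups"                      # placeholder NAS share
TEST_FOLDERS = ["auto", "optimal", "high", "extreme"]  # placeholder folder names

def folder_size_gb(path):
    """Total size of the .vbk/.vib files under a repository folder, in GB."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            if name.lower().endswith((".vbk", ".vib")):
                total += os.path.getsize(os.path.join(root, name))
    return total / 1024**3

for folder in TEST_FOLDERS:
    print(f"{folder:>8}: {folder_size_gb(os.path.join(REPO_ROOT, folder)):8.1f} GB")

I will grab the job durations from the session statistics and line them up against these numbers.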

Can someone offer some details about what the different compression levels on a backup copy job actually do, how they differ, and what "auto" specifically does?

Thanks in advance,

MarkM
foggy
Veeam Software
Posts: 21139
Liked: 2141 times
Joined: Jul 11, 2011 10:22 am
Full Name: Alexander Fogelson

Re: Backup Copy Job - Compression Suggestions

Post by foggy »

Mark, the "Auto" compression level means that the backup copy job inherits the compression level of the parent (regular) backup job. You can find a description of all available levels in the corresponding user guide section.

I would add that the "Optimal" compression level in v7 is an implementation of our proprietary algorithm that leverages advanced CPU instruction sets (SSE extensions). Setting the "Extreme" level will put a very high load on the server responsible for compression (the backup repository or its proxying server), yet saves only about 2-3% of space at the cost of roughly double the CPU load.
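Veeam's compression levels themselves are proprietary, but the general shape of the trade-off, where the last few percent of reduction costs disproportionately more CPU time, can be illustrated with a generic compressor. A small Python sketch using zlib as an analogy (the sample file name is a placeholder, and zlib is not Veeam's algorithm):

Code: Select all

import time
import zlib

# Placeholder sample file; any reasonably large, compressible file will do.
with open("sample.bin", "rb") as f:
    data = f.read()

for level in (1, 6, 9):                  # fast, default, maximum
    start = time.perf_counter()
    compressed = zlib.compress(data, level)
    elapsed = time.perf_counter() - start
    print(f"level {level}: {len(compressed) / len(data):.1%} of original size, "
          f"{elapsed:.2f} s to compress")

Running something like this on a sample of your own data is a quick way to sanity-check whether the extra CPU time is worth the marginal space savings before changing the production job.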
