We have some legacy physical servers with scripts we didn't write and don't have access to.
These scripts copy a huge .zip or .tar.gz file to a share on a VM that's backed up by Veeam.
We've noticed this causes huge data transfers in the backup (forward incremental), even though only a small amount of the data inside the archive actually changes that day.
Would it be worth unzipping the files with a script before the backup, or will Veeam dedupe these files anyway?
Of course I could just try it, but maybe one of you has already tried this ...
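For clarity, this is roughly the kind of pre-backup script I had in mind (just a rough sketch; the share paths are made up and it assumes the archives are reachable from wherever the script runs):

```python
import tarfile
import zipfile
from pathlib import Path

# Hypothetical locations -- adjust to the real share layout.
ARCHIVE_DIR = Path(r"\\backup-vm\share\incoming")   # where the legacy scripts drop the archives
EXTRACT_DIR = Path(r"\\backup-vm\share\extracted")  # backed-up folder holding the unpacked files

def extract_archive(archive: Path, target: Path) -> None:
    """Unpack a .zip or .tar.gz archive into the target folder, overwriting what's there."""
    target.mkdir(parents=True, exist_ok=True)
    if archive.suffix == ".zip":
        with zipfile.ZipFile(archive) as zf:
            zf.extractall(target)
    elif archive.name.endswith((".tar.gz", ".tgz")):
        with tarfile.open(archive, "r:gz") as tf:
            tf.extractall(target)

if __name__ == "__main__":
    for archive in ARCHIVE_DIR.iterdir():
        if archive.suffix == ".zip" or archive.name.endswith((".tar.gz", ".tgz")):
            extract_archive(archive, EXTRACT_DIR / archive.name.split(".")[0])
```

The idea was to run this (or something like it) from a scheduled task shortly before the backup window starts.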
It doesn't really matter whether the content of the files copied to the said VM changes only slightly on a daily basis. Veeam Backup & Replication is an image-based backup solution and cares about the changed blocks of the virtual disks, and it seems the copy operation affects those significantly. Thanks.
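If you want to see how much of that archive really differs from one day to the next at the block level, a quick comparison like this rough sketch (nothing Veeam-specific; the 1 MiB block size is arbitrary) can help, and keep in mind that on top of any content differences, the copy operation itself rewrites the blocks the file occupies on the virtual disk, which is what the image-level backup tracks:

```python
import hashlib
import sys

BLOCK_SIZE = 1024 * 1024  # 1 MiB blocks, just for illustration

def block_hashes(path: str) -> list[str]:
    """Hash a file in fixed-size blocks."""
    hashes = []
    with open(path, "rb") as f:
        while chunk := f.read(BLOCK_SIZE):
            hashes.append(hashlib.sha256(chunk).hexdigest())
    return hashes

if __name__ == "__main__":
    # Usage: python blockdiff.py yesterday.zip today.zip
    old, new = block_hashes(sys.argv[1]), block_hashes(sys.argv[2])
    changed = sum(1 for a, b in zip(old, new) if a != b) + abs(len(old) - len(new))
    print(f"{changed} of {max(len(old), len(new))} blocks differ")
```

Because compression tends to disperse even small content changes across the rest of the archive, two consecutive versions often share few identical blocks.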