- Novice
- Posts: 8
- Liked: 1 time
- Joined: Jan 17, 2021 9:16 am
Question about best practice compression level on an external data domain
Hello,
I have a question about the job settings regarding storage optimization.
I would like to set up a job (a volume backup of the entire hard disk) that periodically backs up a Windows 7/10 machine to an external Data Domain via a shared folder (SMB1), letting the Data Domain handle the deduplication.
The Data Domain is located in the local network.
Periodically means an incremental backup once a week and then a new full backup every month.
Retention period approx. 3 months.
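To get a rough feel for the capacity this schedule needs before deduplication, here is a quick back-of-envelope sketch in Python (the sizes are just assumed example numbers, not measurements):

```python
# Back-of-envelope restore-point and capacity estimate for the plan:
# weekly incrementals, a new active full each month, ~3 months retention.
# The sizes below are invented assumptions, not measurements.

weeks_retained = 13                  # ~3 months of weekly restore points
fulls = 3                            # one active full per month
incrementals = weeks_retained - fulls

full_gb = 200                        # assumed size of one full backup
incr_gb = 10                         # assumed size of one weekly incremental

raw_gb = fulls * full_gb + incrementals * incr_gb
print(f"{fulls} fulls + {incrementals} incrementals "
      f"= ~{raw_gb} GB before Data Domain deduplication")
```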
What are the best settings for leaving the deduplication to the Data Domain? As far as I know, there must be no compression in the Veeam image, or am I wrong?
In my opinion, the following settings are necessary for this under "Advanced Settings" -> "Storage":
Compression Level:
- Dedupe-friendly
Storage optimization:
- LAN target
Or am I wrong here?
Veeam version used: Veeam Agent for Microsoft Windows, build 4.0.1.2169
Support Case No. 04589861
BR and many thanks
- Product Manager
- Posts: 2578
- Liked: 707 times
- Joined: Jun 14, 2013 9:30 am
- Full Name: Egor Yakovlev
- Location: Prague, Czech Republic
Re: Question about best practice compression level on an external data domain
You are right, dedupe-friendly compression and LAN target should do the trick.
/Cheers!
- Novice
- Posts: 8
- Liked: 1 time
- Joined: Jan 17, 2021 9:16 am
Re: Question about best practice compression level on an external data domain
Thanks for your answer, Egor.
The only question I still have is whether the compression level "None" wouldn't make better use of the storage capacity than "Dedupe-friendly"?
- VP, Product Management
- Posts: 7076
- Liked: 1510 times
- Joined: May 04, 2011 8:36 am
- Full Name: Andreas Neufert
- Location: Germany
Re: Question about best practice compression level on an external data domain
Hi,
It depends.
The optimal settings are:
Job:
Dedup: Disabled
Compression: Dedupe-friendly (removes whitespace without affecting deduplication)
Block size: Local target (large blocks) (4 MB blocks, for more sequential processing within the dedupe storage)
Backup mode: Active Full + Incremental. As you use an SMB share and not DD Boost, I suggest you switch to an NFS share: Data Domain has some optimizations that are not available with SMB, for example automatic cache adaptation to random/sequential workloads, which can speed up restores several times over.
Repository:
Block alignment: Disabled
Uncompression: Disabled
This is the optimal configuration in terms of speed. As part of the data reduction is done within Veeam, the data reduction values shown at the Data Domain level will be lower. This is expected and will not affect the overall capacity needed on the Data Domain, just the ratio shown there.
If you want to optimize for that ratio instead, switch the following two settings from above:
Job-Compression: Optimal
Repository - Uncompression: Enabled
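To keep the two variants apart, here is a minimal sketch that encodes both as plain data (the key names are informal shorthand for the settings above, not official Veeam identifiers), so a job's configuration can be checked against either one:

```python
# Both recommended variants as plain data. Key names are informal
# shorthand for the settings above, not official Veeam identifiers.

SPEED_OPTIMIZED = {
    "job.dedup": False,
    "job.compression": "Dedupe-friendly",
    "job.block_size": "Local target (large blocks)",  # 4 MB blocks
    "repo.block_alignment": False,
    "repo.uncompression": False,
}

# Variant that maximizes the reduction ratio shown on the Data Domain:
RATIO_OPTIMIZED = {**SPEED_OPTIMIZED,
                   "job.compression": "Optimal",
                   "repo.uncompression": True}

def diff(actual: dict, recommended: dict) -> list[str]:
    """List settings that deviate from a recommendation."""
    return [f"{k}: expected {v!r}, found {actual.get(k)!r}"
            for k, v in recommended.items() if actual.get(k) != v]

# Example: a job still set to LAN target gets flagged.
print(diff({**SPEED_OPTIMIZED, "job.block_size": "LAN target"},
           SPEED_OPTIMIZED))
```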
The "LAN target" setting is contra productive as it will reduce the block size to 512MB which slows down processing for the DataDomain. WAN, LAN, Local naming is still there for historical reasons and has nothing to do anymore with the actual data transfer method.
To know more about dedup settings, please go to: https://www.veeam.com/kb1745
- Novice
- Posts: 8
- Liked: 1 time
- Joined: Jan 17, 2021 9:16 am
Re: Question about best practice compression level on an external data domain
Thank you for the detailed answer.
I have now changed the settings to dedupe-friendly and local target (large blocks).
- VP, Product Management
- Posts: 7076
- Liked: 1510 times
- Joined: May 04, 2011 8:36 am
- Full Name: Andreas Neufert
- Location: Germany
Re: Question about best practice compression level on an external data domain
Important point: the block size change will only take effect after an Active Full (a synthetic full will not apply it).
- Novice
- Posts: 8
- Liked: 1 time
- Joined: Jan 17, 2021 9:16 am
Re: Question about best practice compression level on an external data domain
Thanks for the tip.
I have deleted the backup completely and set up the job again.