PS C:\Users\skrause> Get-DedupStatus J:
FreeSpace SavedSpace OptimizedFiles InPolicyFiles Volume
--------- ---------- -------------- ------------- ------
1.31 TB 4.81 TB 80 79 J:
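For a fuller view of the volume, including the savings rate, the dedup volume object can be queried directly (a minimal sketch; Get-DedupVolume and these property names are standard in the Deduplication module):
Get-DedupVolume -Volume J: | Format-List Volume, Capacity, FreeSpace, SavedSpace, SavingsRate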
Tune dedup processing for backup data files: Run the following PowerShell command to set dedup to start optimization without delay and not to optimize partial file writes (a sketch follows below). Note that by default, Garbage Collection (GC) jobs are scheduled every week, and every fourth week the GC job runs in “deep GC” mode, a more exhaustive and time-intensive search for data to remove. For the DPM workload, this deep GC mode does not yield any appreciable gains and reduces the amount of time in which dedup can optimize data. We therefore disable this deep mode.
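A minimal sketch of the per-volume command, assuming the backup volume is J: as in the output above (MinimumFileAgeDays 0 removes the default age delay before optimization; OptimizePartialFiles:$false stops dedup from optimizing partial file writes):
Set-DedupVolume -Volume J: -MinimumFileAgeDays 0 -OptimizePartialFiles:$false
Deep GC is then disabled with the following registry setting: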
Set-ItemProperty -Path HKLM:\Cluster\Dedup -Name DeepGCInterval -Value 0xFFFFFFFF
Tune performance for large-scale operations: Run the following PowerShell script to:
Disable additional processing and I/O when deep garbage collection runs (this is the DeepGCInterval setting applied above)
Reserve additional memory for hash processing
Enable priority optimization to allow immediate defragmentation of large files
Set-ItemProperty -Path HKLM:\Cluster\Dedup -Name HashIndexFullKeyReservationPercent -Value 70
Set-ItemProperty -Path HKLM:\Cluster\Dedup -Name EnablePriorityOptimization -Value 1
These settings modify the following:
HashIndexFullKeyReservationPercent: This value controls how much of the optimization job memory is used for existing chunk hashes, versus new chunk hashes. At high scale, 70% results in better optimization throughput than the 50% default.
EnablePriorityOptimization: With files approaching 1 TB, fragmentation of a single file can accumulate enough fragments to approach the per-file limit. Optimization processing consolidates these fragments and prevents the limit from being reached. With this registry key set, dedup adds an additional high-priority process to deal with highly fragmented deduped files.
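To confirm the registry values took effect, they can be read back (a sketch; the HKLM:\Cluster\Dedup path follows the commands above and assumes a clustered dedup configuration):
Get-ItemProperty -Path HKLM:\Cluster\Dedup -Name DeepGCInterval, HashIndexFullKeyReservationPercent, EnablePriorityOptimization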
nmdange wrote:By setting the file age to 1 day, dedup can't process the full backup, which is always updated on the current day.
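To check whether the file-age gate is what's blocking optimization, the current setting can be read per volume (a sketch, assuming volume J: as above):
Get-DedupVolume -Volume J: | Select-Object Volume, MinimumFileAgeDays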
nmdange wrote:It's possible you won't see any benefit from deduping the backups, because Veeam is already doing its own compression and dedupe, and your backup job has only a single full backup file. You won't see massive savings unless you have multiple full backup files on disk.
nmdange wrote:Not related to your issue specifically, but you would also want your backup volume to be enabled for large file records. This is done either with "format /L" in cmd or "Format-Volume -UseLargeFRS" in PowerShell.
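A minimal sketch of the PowerShell variant (assuming drive J:; note that formatting erases the volume, so this has to happen before any backup data is written):
Format-Volume -DriveLetter J -FileSystem NTFS -UseLargeFRS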