Comprehensive data protection for all workloads
Scott46550
Novice
Posts: 4
Liked: never
Joined: Dec 02, 2020 1:40 pm
Full Name: Scott Thompson
Contact:

Data Domain Poor Compression

Post by Scott46550 »

I am using a Dell Data Domain DD2200 appliance and am getting very poor compression from the Veeam backup. I have no idea why it changed, and I was wondering whether anyone has had the same problem. Thank you.
Vitaliy S.
VP, Product Management
Posts: 27121
Liked: 2722 times
Joined: Mar 30, 2009 9:13 am
Full Name: Vitaliy Safarov
Contact:

Re: Data Domain Poor Compression

Post by Vitaliy S. »

Hi Scott,

What are the compression settings for your backup job or backup copy jobs? If you see poor compression, what is the dedupe rate? Do you have the decompression option enabled in the repo settings?

Thanks!
foggy
Veeam Software
Posts: 21073
Liked: 2115 times
Joined: Jul 11, 2011 10:22 am
Full Name: Alexander Fogelson
Contact:

Re: Data Domain Poor Compression

Post by foggy »

Also, how do you tell it is poor and what do you mean by "it has changed" - was it ok before?
Scott46550
Novice
Posts: 4
Liked: never
Joined: Dec 02, 2020 1:40 pm
Full Name: Scott Thompson
Contact:

Re: Data Domain Poor Compression

Post by Scott46550 »

I am using Forever Forward Incremental, with "Enable inline data deduplication" unchecked. Compression level: Optimal. Storage optimization: Local target (large blocks). The dedupe rate reported by Veeam is 1x; on the Data Domain the rate is 1.6-2.2x. With the old job it was 37.3x. No changes were made to the Data Domain; it's a different job but the same MTree path. The decompress option in the Veeam repo settings is checked.
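For context on why the repository's "decompress before storing" option matters with dedupe appliances: if a stream is compressed before the appliance segments it, even a tiny change shifts all downstream compressed bytes and repeated data stops matching. A toy illustration (not Veeam's or DD OS's actual algorithms — fixed 128 KiB blocks instead of DD's variable-length segmenting, and zlib standing in for Veeam's compressor):

```python
import hashlib
import random
import zlib

BLOCK = 128 * 1024  # fixed-size dedup segment, for illustration only

def dedup_ratio(streams):
    """Fixed-block dedup across backup streams: total blocks / unique blocks."""
    total, unique = 0, set()
    for data in streams:
        for i in range(0, len(data), BLOCK):
            total += 1
            unique.add(hashlib.sha256(data[i:i + BLOCK]).digest())
    return total / len(unique)

# a compressible 2 MiB "full backup" (small symbol alphabet)
base = bytes(random.Random(1).choices(range(16), k=2 * 1024 * 1024))
# the next backup changes only the first 4 KiB
changed = bytes(random.Random(2).choices(range(16), k=4096)) + base[4096:]

raw = dedup_ratio([base, changed])                                   # dedupes well
packed = dedup_ratio([zlib.compress(base), zlib.compress(changed)])  # barely dedupes
print(f"uncompressed: {raw:.2f}x  compressed first: {packed:.2f}x")
```

With the decompress option checked, as Scott has it, Veeam should be handing the appliance uncompressed data, so this particular failure mode ought to be ruled out rather than assumed.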
foggy
Veeam Software
Posts: 21073
Liked: 2115 times
Joined: Jul 11, 2011 10:22 am
Full Name: Alexander Fogelson
Contact:

Re: Data Domain Poor Compression

Post by foggy »

Are other settings for these jobs also similar? What about encryption?
Scott46550
Novice
Posts: 4
Liked: never
Joined: Dec 02, 2020 1:40 pm
Full Name: Scott Thompson
Contact:

Re: Data Domain Poor Compression

Post by Scott46550 »

Settings are the same. No encryption.
foggy
Veeam Software
Posts: 21073
Liked: 2115 times
Joined: Jul 11, 2011 10:22 am
Full Name: Alexander Fogelson
Contact:

Re: Data Domain Poor Compression

Post by foggy »

Since everything looks good from the Veeam B&R perspective, I'd ask the storage vendor support to take a look.
janezk
Enthusiast
Posts: 55
Liked: 11 times
Joined: Jul 25, 2016 10:42 am
Full Name: Janez K
Location: Slovenija
Contact:

Re: Data Domain Poor Compression

Post by janezk »

Hi,
It's worth mentioning that with a DD and forever forward incremental backup, you should keep in mind that the DD has a limitation of 60 restore points per chain:
"The length of forward incremental and forever forward incremental backup chains (chains that contain one full backup and a set of subsequent incremental backups) cannot be greater than 60 restore points. To overcome this limitation, schedule full backups (active or synthetic) to split the backup chain into shorter series."
https://helpcenter.veeam.com/docs/backu ... ml?ver=100
I had a case where, for some reason (I don't remember why), backup files remained on the DD that were not visible in Veeam, and I had some issues with the backup.
If something changed suddenly, it may be worth looking at the files on the DD directly.
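The arithmetic behind that limit can be sketched as follows (assuming one backup per day; the helper name is made up for illustration):

```python
def max_chain_points(retention_points, full_interval_days=None):
    """Longest single backup chain (one full plus its dependent incrementals),
    assuming one backup per day. full_interval_days=None models forever
    forward incremental, where the entire retention is a single chain."""
    if full_interval_days is None:
        return retention_points
    return min(full_interval_days, retention_points)

# forever forward with 90 points is one 90-point chain: over the 60-point DD limit
print(max_chain_points(90))      # 90
# weekly synthetic fulls split retention into 7-point chains
print(max_chain_points(90, 7))   # 7
```

This is why the help center advises scheduling periodic active or synthetic fulls: they cap the chain length regardless of the retention setting.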
Helge.T
Veeam Software
Posts: 205
Liked: 19 times
Joined: Dec 09, 2019 12:22 pm
Full Name: Helge Tengstedt
Contact:

Re: Data Domain Poor Compression

Post by Helge.T »

I've seen something similar happen (a drastic drop in the dedupe rate on the DD). It came down to a change in one of the MS SQL VMs being backed up: the SQL admin had started using maintenance plans to dump a 500 GB .BAK file to a local directory on the VM every day, and that accounted for it. So it could be a thing to check.
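If you want to check for this pattern inside a guest, a small sketch (a hypothetical helper, not part of any Veeam tooling) that flags large, recently modified dump files:

```python
import time
from pathlib import Path

def find_recent_dumps(root, min_size=10 * 1024**3, days=7, exts=(".bak", ".dmp")):
    """Return (path, size) for large, recently modified database dump files.

    Fresh dumps are mostly unique data every day, so they churn the backup
    stream and drag the appliance's dedupe ratio down.
    """
    cutoff = time.time() - days * 86400
    hits = []
    for p in Path(root).rglob("*"):
        if p.is_file() and p.suffix.lower() in exts:
            st = p.stat()
            if st.st_size >= min_size and st.st_mtime >= cutoff:
                hits.append((p, st.st_size))
    return sorted(hits, key=lambda t: t[1], reverse=True)
```

Run it against the drives the backup job covers; anything large that appears daily is a candidate for exclusion or for pointing the DBA at native application-aware processing instead.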
SE-1
Influencer
Posts: 22
Liked: 5 times
Joined: Apr 07, 2015 1:42 pm
Full Name: Dirk Slechten
Contact:

Re: Data Domain Poor Compression

Post by SE-1 »

Do you use Data Domain replication via MTree?
Do you see physical space increasing and the dedupe rate decreasing?

If that is the case, check for old snapshots, for example sync_reserve. By default they are kept for one year, and they will consume space. When the MTrees are in sync, it is safe to delete them and then run a cleaning job (filesys clean start).

Another possibility is DB dumps; some DB backup jobs (such as Oracle) need to be tuned for DD.

It is a best practice to run an active full once a month and weekly synthetics.
Regarding Veeam settings: uncheck deduplication and compression in Veeam and let the hardware handle it.
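For reference, the snapshot and space checks described above can be done from the DD OS CLI. Treat this as a sketch: the MTree path is an example, and exact syntax may vary by DD OS release:

```
# overall compression/dedupe statistics
filesys show compression
# snapshots held on the replication MTree (example path)
snapshot list mtree /data/col1/veeam
# reclaim space after removing stale snapshots
filesys clean start
```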
stevekarra
Technology Partner
Posts: 42
Liked: 6 times
Joined: May 02, 2019 9:19 pm
Full Name: Steve Karra
Contact:

Re: Data Domain Poor Compression

Post by stevekarra »

The recommended best practice with Data Domain is to perform an active full periodically, say once every 1-2 weeks. Did you change any of the retention parameters, including the length of the backup chain?

Check out the size of the VBK file (if you're using DD-Boost you can temporarily access the MTree via a CIFS share). If it's growing every day to some unexpected size, chances are the merge process is responsible and performing an Active Full (or Synthetic) should fix it.
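A minimal sketch of that check, assuming the MTree is reachable as a mounted path (the helper name and state-file approach are made up for illustration):

```python
import json
from pathlib import Path

def vbk_growth(share_root, state_file):
    """Record current .vbk sizes under share_root and report the byte growth
    of each file since the previous run (0 for files seen for the first time)."""
    sizes = {p.name: p.stat().st_size for p in Path(share_root).rglob("*.vbk")}
    state = Path(state_file)
    previous = json.loads(state.read_text()) if state.exists() else {}
    state.write_text(json.dumps(sizes))
    return {name: size - previous.get(name, size) for name, size in sizes.items()}
```

Running it once a day and watching for a VBK that grows steadily would point at the forever-forward merge rather than new source data.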