filippoAdb
Enthusiast
Posts: 30
Liked: never
Joined: Apr 08, 2021 12:59 pm

File server backup best practices

Post by filippoAdb »

All my VM backups run once a day with a retention of 14 points.
Some of these VMs are file servers, and I don't think this backup strategy is adequate for them.
For example, if a user accidentally deletes a folder and only notices a month later, those documents are lost.
A backup expert once told me that VM backups are general purpose and need to be complemented with an application-specific backup.
Should I add a file share backup? Is that the right way, or should I configure GFS retention instead?
In the past I was wary of GFS retention because I would end up with n copies of every file on the share, which is a waste of space. Ideally I need a single copy of every file version, including deleted files, going back a reasonable amount of time.
Now I have a deduplicating repository, so perhaps GFS could be a good compromise.
Another option is to add a dedicated job for just the file share disks, excluding the OS disk. That way the incremental changes would be small and I could increase the number of retention points.
What I'm looking for is some best practices to follow.
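
To put a number on the space worry: here is a rough back-of-the-envelope sketch (all figures are made-up assumptions, not measurements) of what n weekly GFS fulls would cost on a plain repository versus a deduplicating one:

```python
# Rough sketch comparing GFS retention cost with and without deduplication.
# All numbers below are made-up assumptions for illustration only.

SHARE_TB = 1.0        # assumed size of the file share
DAILY_CHANGE = 0.02   # assumed fraction of data that changes per day
GFS_POINTS = 8        # e.g. eight weekly GFS restore points

# Plain repository: every GFS point is a full copy of the whole share.
naive_gfs_tb = GFS_POINTS * SHARE_TB

# Deduplicating repository: unchanged blocks are stored once, so each
# additional weekly point costs roughly one week's worth of changed data.
dedup_gfs_tb = SHARE_TB + (GFS_POINTS - 1) * DAILY_CHANGE * 7 * SHARE_TB

print(f"plain repo : ~{naive_gfs_tb:.1f} TB")  # ~8.0 TB
print(f"dedup repo : ~{dedup_gfs_tb:.1f} TB")  # ~2.0 TB
```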
Mildur
Product Manager
Posts: 11549
Liked: 3240 times
Joined: May 13, 2017 4:51 pm
Full Name: Fabian K.
Location: Switzerland

Re: File server backup best practices

Post by Mildur »

Hi Filippo

If you need to protect individual file versions, a NAS backup job is more suitable.
GFS volume-level backups only store a weekly, monthly, or yearly snapshot of your file server. If a file is modified every hour, you won't have the versions created between two GFS backups in your repository.

NAS backup jobs are forever incremental. Even so, a NAS backup job won't necessarily protect all file versions; it depends on how often you run the job. We only back up the file version that exists when the job processes the share/managed file server. With a NAS backup job you can also leverage archive storage to keep older (or all) file versions on cheaper storage.
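
A minimal sketch (plain Python, not Veeam code) of the point above, that the job schedule rather than the job type bounds which versions you can restore:

```python
# Sketch: only the file version that exists at run time is captured,
# so the backup interval determines how many versions are recoverable.

# Hypothetical file that is modified once per hour for two days:
versions = [f"v{hour:02d}" for hour in range(48)]

def captured(run_every_hours: int) -> list[str]:
    """Versions captured by a job that runs every `run_every_hours` hours."""
    return [versions[h] for h in range(0, 48, run_every_hours)]

print(captured(24))  # daily job    -> ['v00', 'v24']: 2 of 48 versions
print(captured(4))   # 4-hourly job -> 12 of 48 versions
# A single weekly GFS point would capture just 1 of these 48 versions.
```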

Best,
Fabian
Product Management Analyst @ Veeam Software
pfeifix
Novice
Posts: 9
Liked: never
Joined: Oct 22, 2023 3:32 pm

Re: File server backup best practices

Post by pfeifix »

A follow-up question: what happens if the directory structure on the NAS changes frequently and files are just moved between folders? Is there any sort of deduplication running on the Veeam repo side after the moved files are transferred by a job run?

With a volume-based backup this should be the case, but does a file-level/NAS backup also have some handling to ensure backup file sizes don't grow constantly when files are only moved on the source?
david.domask
Product Manager
Posts: 3406
Liked: 807 times
Joined: Jun 28, 2016 12:12 pm

Re: File server backup best practices

Post by david.domask »

Hi pfeifix,
> What happens if the directory structure on the NAS changes frequently and files are just moved between folders?
The backup job will detect these as new files and process them again. There is no deduplication across Unstructured Data backups at this time.
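
A small sketch (hypothetical, not how the product is implemented) of why a moved file looks new to path-based change detection, while a content-hash index would recognize the same bytes:

```python
# Sketch: path-keyed tracking stores a moved file again; a hypothetical
# content-hash index would store the identical bytes only once.
import hashlib

stored_by_path: dict[str, bytes] = {}  # index keyed by full path
stored_by_hash: dict[str, bytes] = {}  # index keyed by content hash

def back_up(path: str, data: bytes) -> None:
    digest = hashlib.sha256(data).hexdigest()
    if path not in stored_by_path:
        stored_by_path[path] = data    # new path -> file is read and stored again
    if digest not in stored_by_hash:
        stored_by_hash[digest] = data  # same bytes -> stored only once

payload = b"quarterly-report.xlsx contents"
back_up("/share/2023/report.xlsx", payload)
back_up("/share/archive/report.xlsx", payload)  # same file, moved folder

print(len(stored_by_path))  # 2 copies under path-based tracking
print(len(stored_by_hash))  # 1 copy under content-hash dedup
```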

Out of curiosity, how much data is involved, and how often are these file-only moves happening?
David Domask | Product Management: Principal Analyst