Maintain control of your Microsoft 365 data
Arichardson
Novice
Posts: 3
Liked: never
Joined: Nov 17, 2025 11:19 am
Full Name: Alex Richardson

Questions surrounding backup of large SharePoint libraries

Post by Arichardson »

We are currently in the process of migrating our NTFS data into SharePoint. On the existing NTFS drive each department has its own folder, and these have been migrated into the document library of a Team site, one Team site/library per department. We now have 19 Team sites, each with a document library containing all of the files from the respective NTFS folder. Some of these libraries contain more than 300,000 items, and over 1,000,000 items in one case.
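
In case it helps anyone doing the same exercise, a rough sketch like this (the share path is hypothetical) is enough to tally the file count per department folder before migration and spot which libraries will land over the guideline discussed below:

import os
from pathlib import Path

# Hypothetical mount point for the departmental NTFS share.
SHARE_ROOT = Path(r"\\fileserver\departments")

# Each top-level folder becomes one team site document library,
# so count the files underneath each of them.
for dept in sorted(SHARE_ROOT.iterdir()):
    if dept.is_dir():
        total = sum(len(files) for _, _, files in os.walk(dept))
        over = "  <-- over the 300,000-file guideline" if total > 300_000 else ""
        print(f"{dept.name}: {total:,} files{over}")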

I have been reading the following article regarding configuration maximums and am now concerned about how these backups will perform:

https://bp.veeam.com/vb365/guide/design/maxconfig

There are a few points here. The first: "First and foremost, we recommend adhering to the limits specified by Microsoft for SharePoint, paying special attention to the guideline that suggests, 'For optimum performance, we recommend storing no more than 300,000 files in a single OneDrive or team site library.'" As previously mentioned, some of the site libraries are well over the recommended 300,000-file limit. The point that follows suggests counting every 5 GB of SharePoint data as one object in your sizing, to accurately reflect the impact of larger SharePoint sites on the Veeam Backup for Microsoft 365 (VB365) infrastructure. By that logic (in terms of size) the sites fall well below the 300,000-object limit.
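
To make that sizing rule concrete, here is the arithmetic in a few lines of Python. The library size is a made-up example, not one of our actual figures:

import math

# Illustrative figure only; substitute your own library size.
library_size_gb = 1_500        # e.g. a 1.5 TB department library
gb_per_object = 5              # sizing rule from the VB365 best-practice guide

effective_objects = math.ceil(library_size_gb / gb_per_object)
print(effective_objects)       # 300, nowhere near the 300,000-object limit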

What are the implications of backing up a SharePoint site containing over 300,000 items with VB365? Is it even possible? I have also read elsewhere that once a site library exceeds 300,000 items, incremental backups are no longer supported and a full backup is required for every run. Is that true? If so, this simply isn't a viable solution for us: we need daily backups of this data, and some of the jobs will take several days. Is anyone else out there backing up sites exceeding the 300,000-file limit with VB365, and what has your experience been like?

Are there any workable solutions that would not require us to completely restructure our sites? We are a week out from going live with the SharePoint sites, and this is currently a major blocker for us!
Mike Resseler
Product Manager
Posts: 8293
Liked: 1361 times
Joined: Feb 08, 2013 3:08 pm
Full Name: Mike Resseler
Location: Belgium

Re: Questions surrounding backup of large SharePoint libraries

Post by Mike Resseler »

Hi @Arichardson

First, no... It will not be a full backup every run.

Second, as for what the consequences will be... Since you are a week away, I assume you have already migrated at least one big site, which means you should start backing it up now. What I fear is that it will take months before such sites are fully protected: you will hit throttling and limits on how much data you can protect per day. We have customers who have succeeded at this (though not with 19 different large libraries...).

The problem is the list view limit in SharePoint Online, which we have to work within. It is set at 5,000 (be aware that your users will suffer from that as well while viewing the team site). That means we need to back up a 300k-item list in 60 batches, so it will take a very long time. Because of those 60 batches (even in incremental mode we need to verify the changes), the 24-hour window is potentially going to be difficult to maintain.
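
As a back-of-the-envelope illustration of how the batch count scales (the per-batch time below is an assumption purely for illustration, not a measured VB365 figure):

import math

ITEMS_PER_BATCH = 5_000  # SharePoint Online list view threshold

def batches(item_count: int) -> int:
    """Number of 5,000-item batches needed to enumerate a library."""
    return math.ceil(item_count / ITEMS_PER_BATCH)

# Even an incremental run must touch every batch to verify changes,
# so enumeration time grows linearly with library size.
for items in (294_000, 300_000, 1_000_000):
    n = batches(items)
    hours = n * 5 / 60  # assumed 5 minutes per batch, purely illustrative
    print(f"{items:>9,} items -> {n:>3} batches (~{hours:.1f} h at 5 min/batch)")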

But this will depend on many different things, and it is as good as impossible to guess now whether it is feasible. My advice on this matter: create a separate backup job for one of these large sites and watch the statistics to see how it performs, as a test, for a week or longer.
Arichardson
Novice
Posts: 3
Liked: never
Joined: Nov 17, 2025 11:19 am
Full Name: Alex Richardson

Re: Questions surrounding backup of large SharePoint libraries

Post by Arichardson »

Hi Mike

Thanks for your response. All 19 of the sites already exist in SharePoint and contain all of the data as of last Monday; the plan is to run a delta over the coming weekend to bring over any changes that have occurred on the NTFS drive since the last delta completed. To be clear, not all of the sites exceed the 300k-item limit: 5 of the 19 do. We kicked off a job late last week which includes all 19 sites, and the only sites yet to complete on that job are the 5 which exceed 300k items. One of the sites that has already completed contained 294k items.

Regarding your point on the list view limit, I would expect then that any site larger than 5k files needs to be done in batches, and the site mentioned above would have completed in 59 batches? Why does it become more of an issue once we exceed 60 batches / 300k+ items?

I am struggling to understand why the initial runs would take months to complete when we have already managed to back up a site containing close to 300k files in a matter of days; are you able to clarify this? I have thought about setting up a separate job that covers just one of the large sites and seeing how it performs, so perhaps we will go ahead and do that. Appreciate your input!
mrowell
Novice
Posts: 8
Liked: 5 times
Joined: Nov 25, 2015 2:19 am
Full Name: Marcus Rowell

Re: Questions surrounding backup of large SharePoint libraries

Post by mrowell »

@Arichardson - I would strongly push back on whoever decided your SharePoint architecture. Treating SharePoint like a file share is well known to be a recipe for disaster. This will most likely be a terrible experience for your users: any document library over 5,000 items will impact the user experience (e.g. views breaking), and once more than 300,000 items are synced via OneDrive to client machines, syncing becomes very slow and produces far more sync errors. You and your users will be running into issues forever. I'd stop the rollout before it goes live. I'm sure less than an hour with a good SharePoint consultant would have your org changing its plans.
Arichardson
Novice
Posts: 3
Liked: never
Joined: Nov 17, 2025 11:19 am
Full Name: Alex Richardson

Re: Questions surrounding backup of large SharePoint libraries

Post by Arichardson »

Just to clarify, the document library does not simply have 300k files at the root; it contains a copy of the folder structure from the NTFS drive, and very few (if any) folders exceed 5,000 items in a single folder. Surely the list view limit is only relevant if we want to list more than 5,000 files in a single view, which is unlikely? We have already run a pilot using the entire dataset and users did not report any performance issues; for the most part they were able to run through all of their processes in the same way they would on the existing NTFS drive. Maybe I'm missing something here, as I'm no expert on SharePoint, but we needed to maintain the existing structure and this is how we were advised to achieve it. If we cannot reliably back up the data on a daily basis, though, then that is definitely an issue; we will discuss it internally tomorrow. Thanks for your response.
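
For what it's worth, this is roughly how we could verify that per-folder claim against the source share before go-live (hypothetical path again; it counts the direct children of each folder, which is what a folder view has to render):

import os
from pathlib import Path

# Hypothetical mount point; adjust to your own share.
SHARE_ROOT = Path(r"\\fileserver\departments")

# Flag any folder whose direct children (files + subfolders) exceed
# 5,000, since that is where folder views start to hit the threshold.
for dirpath, dirnames, filenames in os.walk(SHARE_ROOT):
    direct = len(dirnames) + len(filenames)
    if direct > 5_000:
        print(f"{dirpath}: {direct:,} direct items")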