nroyepic
Service Provider
Posts: 21
Liked: 2 times
Joined: Apr 25, 2023 7:03 pm
Full Name: Nick Roy
Contact:

Migrating VB365 SMB data to a new repository fails

Post by nroyepic »

T#07846743

We have a new storage cluster and are trying to decommission the old one, but our clients have roughly 5 of 7 years left on retention, so we can't simply wait it out. We tried the Windows file move feature over SMB but received:
Error when using SMB copy with Windows -> Files in use

I know I could stop the Veeam Backup for Microsoft 365 services, but we have critical activity running and stopping them could potentially interrupt 100 clients. That's not counting the time it would take to copy ~20 TB per client to the new location.

We created a Veeam case and received a script: https://www.veeam.com/kb3067 https://www.veeam.com/download_add_pack ... 365/kb3067
But when running it, I get this error:

Get-VBOEntityData : JetError -529, JET_errLogDiskFull, Log disk full
At C:\Scripts\kb3067.ps1:12 char:80
+ ... Null) -and (Get-VBOEntityData -Repository $_ -Type Organization -Name ...
+                 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Get-VBOEntityData], FaultException
    + FullyQualifiedErrorId : System.ServiceModel.FaultException,Veeam.Archiver.PowerShell.Cmdlets.DataManagement.GetVBOEntityData

(the same error is printed a second time)

We have 26 GB free on C: of the VB365 server itself, where we're running the script, and 34 TB free on the caching drive on the same server. The drive hosting the target repository has 40 TB free.
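Since JET_errLogDiskFull complains about the volume holding the Jet/ESE log files rather than necessarily C:, it may be worth enumerating free space on every local volume of the VB365 server before digging further. A minimal sketch (plain Windows PowerShell, nothing Veeam-specific):

```powershell
# List free space on all local fixed disks of the VB365 server.
# JET_errLogDiskFull refers to the volume holding the ESE log files,
# which may be the repository or cache volume rather than C:.
Get-CimInstance -ClassName Win32_LogicalDisk -Filter "DriveType=3" |
    Select-Object DeviceID,
        @{Name = 'FreeGB'; Expression = { [math]::Round($_.FreeSpace / 1GB, 1) }},
        @{Name = 'SizeGB'; Expression = { [math]::Round($_.Size / 1GB, 1) }}
```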

Veeam support advised trying to move one user and one file at a time, but that would be far too tedious for ~100 TB of data spanning roughly 7,000 SharePoint objects and users combined.
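The one-user-at-a-time approach could at least be scripted with the VB365 cmdlets rather than done by hand. A minimal sketch, assuming both repositories are registered on the same VB365 server; "OldRepo" and "NewRepo" are placeholder names:

```powershell
# Hypothetical sketch: iterate user data on the old repository and move it
# to the new one with Move-VBOEntityData instead of copying files over SMB.
Import-Module Veeam.Archiver.PowerShell

$source = Get-VBORepository -Name "OldRepo"
$target = Get-VBORepository -Name "NewRepo"

foreach ($user in Get-VBOEntityData -Type User -Repository $source) {
    # Moves mailbox, archive mailbox, OneDrive, and personal site data
    # for this user from the old repository to the new one.
    Move-VBOEntityData -From $source -To $target -User $user `
        -Mailbox -ArchiveMailbox -OneDrive -Sites
}
```

SharePoint site data could be iterated the same way via `Get-VBOEntityData -Type Site` and the `-Site` parameter. Whether this sidesteps the JetDB log-disk issue is unclear, though, since it reads the same repository metadata.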
