I ended up doing something similar, but thank you for the tip on Invoke-Command -Session; that helps with monitoring the jobs in the queue.
Can I ask what you ended up doing?
I am in a very similar situation: I want to export very particular GFS restore points from one repository (ReFS, so block-cloned) to another repository (NTFS with DeDup, so it needs uncompressed data), and Export-VBRRestorePoint seems to be the best way of achieving this.
The Restore Points only need to be kept for a finite period of time, so "exporting" them is acceptable (they show differently in the GUI, etc., but this is fine for us).
VeeaMover would be ideal; however, that is for moving entire backup chains around, not cherry-picking individual GFS restore points.
Using the File Copy job is also simple to set up, but it doesn't take the source/target repositories into consideration, so it doesn't "inflate" the backups into the DeDup repository, which hurts the effectiveness of the deduplication process.
As the DeDup is a background task, I need to stagger the data moves, so I need a way to trigger moves of particular GFS Restore Points between the repositories every day or so over the next week or two (or three).
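One option I'm considering for the staggering is registering a one-shot scheduled task per day. This is just a rough sketch; the script path and its -Year/-Month parameters are hypothetical placeholders for the export logic further down:
Code:
# Sketch only: fire one month's export as a one-shot scheduled task.
# C:\Scripts\Export-GfsPoint.ps1 and its parameters are hypothetical
# stand-ins for the export pipeline shown below.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-File C:\Scripts\Export-GfsPoint.ps1 -Year 2024 -Month 2"
$trigger = New-ScheduledTaskTrigger -Once -At (Get-Date).Date.AddDays(1).AddHours(22)
Register-ScheduledTask -TaskName "Veeam GFS Export 2024-02" -Action $action -Trigger $trigger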
Using the Export function in the GUI is great, as you can bulk-select all the VMs in the repository; however, selecting the required Restore Point per VM is a bit tedious. On the plus side, it does schedule all the VMs to export and only runs them based on Repository Resource throttling, which is nice.
So I've dropped to PowerShell and constructed a method to select the required Restore Points, but I've hit the same issue you did: it runs "interactively". I can cancel the PowerShell command while the first export in the pipeline is running, but that cancels the whole thing; the export that was in flight when I cancelled, however, keeps running in the background in the Backup Console.
Other commands have the "RunAsync" parameter that Export-VBRRestorePoint is missing.
Here is the crude code as it stands:
Code:
$mySourceBackupJob = "My Archive"
$myDestinationRepositoryName = "My Repository"
$myYear = 2024
$myMonth = 2
# GFS points land in the first few days of the month; a 10-day window catches them
$myStartDate = Get-Date -Year $myYear -Month $myMonth -Day 1 -Hour 0 -Minute 0 -Second 0
$myEndDate = Get-Date -Year $myYear -Month $myMonth -Day 10 -Hour 0 -Minute 0 -Second 0
# Veeam remembers deleted jobs of the same name, so drop entries with an all-zero JobId and keep the newest
$mySourceBackupJobId = (Get-VBRBackup | Where-Object {($_.Name -eq $mySourceBackupJob) -and ($_.JobId -ne "00000000-0000-0000-0000-000000000000")} | Sort-Object -Property CreationDate | Select-Object -Last 1).Id
$myDestinationRepository = Get-VBRBackupRepository -Name $myDestinationRepositoryName
# Select the full (GFS) restore points in the window and export them to the target repository
Get-VBRBackup -Id $mySourceBackupJobId | Get-VBRRestorePoint | Where-Object {($_.Type -eq "Full") -and ($_.CreationTime -ge $myStartDate) -and ($_.CreationTime -le $myEndDate)} | Export-VBRRestorePoint -Repository $myDestinationRepository
Notes:
- Selection of the correct BackupJob Id is convoluted because Veeam appears to remember deleted backup jobs of the same name (they don't show in the GUI), so I filter them out by the JobId and then select the most recently created one.
- I want the GFS restore points that were created at the beginning of the month, so I have some crude checks to see whether a point falls in the first 10 days of the month, because it might have been created on the 2nd day of the month due to processing time, other backups scheduled beforehand, etc. Ten days is wide enough to catch them all, but not so wide that it grabs the next month's restore points, and I don't have to fiddle around with the varying lengths of months.
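Before committing to the export, the same pipeline can be pointed at Format-Table as a dry run to check what the filter actually catches (same variables as the script above):
Code:
# Dry run: list the restore points the filter would export
Get-VBRBackup -Id $mySourceBackupJobId | Get-VBRRestorePoint |
    Where-Object {($_.Type -eq "Full") -and ($_.CreationTime -ge $myStartDate) -and ($_.CreationTime -le $myEndDate)} |
    Select-Object Name, Type, CreationTime | Format-Table -AutoSize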

I have also noticed that Export-VBRRestorePoint ignores any Traffic Throttling rules between agents when going from Repository to Repository [even with "Never throttle any restore activity" unticked], whereas File Copy jobs do respect the Traffic Throttling rules.