We purchased two NAS devices for a remote site so we can start keeping offline backups. On Mondays, Wednesdays, and Fridays one device is plugged into the network; on Tuesdays and Thursdays the other is plugged in.
I created two backup copy jobs, identical except for the target repository and the days each is allowed to transfer data, but I'm running into some issues.
First, the jobs run continuously: even though I disabled transfers on certain days, each job still seeks out its target and sends failure notifications.
Second, the second job fails with: Error: The network path was not found ..... Agent failed to process method
I'm assuming this has something to do with cataloging and that I can only have one copy job?
Does anyone have suggestions to work around these issues? We'd really like to get this implemented as a last resort for ransomware protection.
— John Sheehan
Hannes Kasparick — Product Manager
Re: Staggering Copy Jobs
Hello,
Are you using GFS on the backup copy job (BCJ)? If not, configuring the repository with "rotated drives" could help: https://helpcenter.veeam.com/docs/backu ... l?ver=95u4
To avoid the continuous run, you could script a "disable job" step.
In the future (v10), the "immediate" backup copy job mode might also help; it does not run continuously.
Best regards,
Hannes
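
To illustrate the "script a disable job" idea: a minimal sketch that enables whichever copy job matches the day's NAS and disables the other, using the Veeam Backup & Replication PowerShell cmdlets (Get-VBRJob, Enable-VBRJob, Disable-VBRJob). The job names "Copy-NAS1" and "Copy-NAS2" are placeholders for your own job names; run this as a daily scheduled task on the backup server before the copy window.

```powershell
# Sketch: toggle the two backup copy jobs around the NAS rotation schedule.
# Assumes the Veeam snap-in is available on the backup server (newer versions
# ship a module instead: Import-Module Veeam.Backup.PowerShell).
Add-PSSnapin VeeamPSSnapin

$day = (Get-Date).DayOfWeek
if ($day -in 'Monday','Wednesday','Friday') {
    # NAS 1 is on the network today
    Get-VBRJob -Name 'Copy-NAS1' | Enable-VBRJob
    Get-VBRJob -Name 'Copy-NAS2' | Disable-VBRJob
}
elseif ($day -in 'Tuesday','Thursday') {
    # NAS 2 is on the network today
    Get-VBRJob -Name 'Copy-NAS1' | Disable-VBRJob
    Get-VBRJob -Name 'Copy-NAS2' | Enable-VBRJob
}
```

A disabled job stops polling its repository entirely, so the "network path was not found" failure notifications for the unplugged NAS should stop as well.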