-
- Influencer
- Posts: 11
- Liked: 1 time
- Joined: Nov 12, 2019 10:45 pm
- Full Name: Jon Pollock
- Contact:
Break up a large NAS job into multiple pieces
I'm wondering if anyone has an idea for breaking large NAS jobs into smaller pieces for easier management. An example: a single NetApp NAS share with 8k subfolders in the root; the share totals 110TB and 12M files. I would love a way to break that up, say job 1 with all subfolders starting with the letters A - L, and job 2 with all subfolders M - Z. Is there a way to use expressions in the filters? I assume adding a filter of A* wouldn't apply to just the first-level folders.
Any idea is welcome. I'm in the testing stage right now, so I can try and test anything.
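One way to prototype the split outside Veeam, just to see how the folders would divide, is a short script that lists the first-level subfolders of the share and buckets them by leading letter. This is only a sketch of the grouping idea (nothing Veeam-specific); the UNC path is a placeholder:

```python
from pathlib import Path

def split_by_prefix(folder_names, boundary="M"):
    """Split folder names into two groups: names starting with a letter
    before `boundary` (A-L) and names from `boundary` onward (M-Z).
    Names that don't start with a letter go into the first group."""
    job1, job2 = [], []
    for name in sorted(folder_names, key=str.casefold):
        first = name[:1].upper()
        if first.isalpha() and first >= boundary.upper():
            job2.append(name)
        else:
            job1.append(name)
    return job1, job2

# To run against the real share, enumerate its first-level subfolders,
# e.g. (hypothetical path):
#   share = Path(r"\\netapp01\bigshare")
#   names = [p.name for p in share.iterdir() if p.is_dir()]
names = ["Accounts", "Zebra", "legal", "Marketing", "3rd-party"]
job1, job2 = split_by_prefix(names)
print(job1)  # first-level folders for job 1 (A-L, plus non-letter names)
print(job2)  # first-level folders for job 2 (M-Z)
```

The two resulting lists could then feed whatever scripted job creation ends up being used, one job per list.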
-
- Product Manager
- Posts: 15127
- Liked: 3232 times
- Joined: Sep 01, 2014 11:46 am
- Full Name: Hannes Kasparick
- Location: Austria
- Contact:
Re: Break up a large NAS job into multiple pieces
Hello,
For only 100TB / 12M files I would not do it, because it's too much work for probably little benefit. But if you really want to, customers do that with PowerShell; there is no option in the UI.
You might see somewhat better performance if you split jobs, but we are working on fixing that in a future version (meaning one large job should have the same performance as multiple smaller ones).
Best regards
Hannes
-
- Influencer
- Posts: 11
- Liked: 1 time
- Joined: Nov 12, 2019 10:45 pm
- Full Name: Jon Pollock
- Contact:
Re: Break up a large NAS job into multiple pieces
Oops... I forgot the point of my question. We run copy jobs that copy the repos to tape on a weekly basis. It takes ~130-138 hours (5-6 days) to write a tape set. I have two repos over 100TB and a few smaller ones. The tape library has 6 drives, and 4 of them sit idle most of the time. My thinking is that if I can split the repos into smaller pieces, I can divide them between the tape drives and get the jobs done faster. Exacerbating the problem: while the tape jobs are running, new source backup jobs can't run, so I end up with only 1-2 new restore points a week when I would like to see at least 5 dailies.
The data is also growing fast, so it won't be long until the tape jobs run longer than a week.
-
- Product Manager
- Posts: 15127
- Liked: 3232 times
- Joined: Sep 01, 2014 11:46 am
- Full Name: Hannes Kasparick
- Location: Austria
- Contact:
Re: Break up a large NAS job into multiple pieces
Okay, if you want to split the workload across multiple tape jobs, then creating multiple NAS jobs can be done either manually or with scripting. Regular expressions are not supported.
-
- Influencer
- Posts: 11
- Liked: 1 time
- Joined: Nov 12, 2019 10:45 pm
- Full Name: Jon Pollock
- Contact:
Re: Break up a large NAS job into multiple pieces
Can you give me a hint about how to get started with that idea?
-
- Influencer
- Posts: 11
- Liked: 1 time
- Joined: Nov 12, 2019 10:45 pm
- Full Name: Jon Pollock
- Contact:
Re: Break up a large NAS job into multiple pieces
I tried adding the exclusion \A* (and also /A*), but that doesn't work. Is there another way to write this so that it works?
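For comparison, note that a glob tested against a bare first-level folder name is implicitly anchored, whereas a mask tested against a full path is not. The snippet below uses Python's `fnmatch` as a stand-in; Veeam's actual filter semantics may well differ, so this only illustrates the anchoring idea:

```python
from fnmatch import fnmatchcase

# Hypothetical first-level folder names from the share root.
first_level = ["Accounts", "Apps", "Marketing", "Zebra"]

# Matched against the bare name, "A*" selects exactly the A-folders:
a_folders = [n for n in first_level if fnmatchcase(n, "A*")]
print(a_folders)  # ['Accounts', 'Apps']

# Matched against a full path, the same pattern never matches, because
# every path starts with a separator rather than "A":
print(fnmatchcase(r"\Accounts\file.txt", "A*"))  # False
```

This is why scripting against an enumerated folder list tends to be more predictable than trying to express the split in the job's include/exclude masks.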