Discussions related to exporting backups to tape and backing up directly to tape.
sbishop
Influencer
Posts: 17
Liked: never
Joined: Nov 01, 2015 1:20 pm
Full Name: Steve Bishop
Contact:

Duplicate job to tape - not same amount of processed data

Post by sbishop »

Hi,

We are almost done testing but have run into an issue with duplicating a job to tape. We are currently backing up 18 VMs (VMware) to a StoreOnce. The processing rate is approx. 70 MB/s and the amount of data processed is 3.3 TB. Is 70 MB/s reasonable?

Now, the issue I had over the weekend: I've got a tape job set up to duplicate the above job once it's complete. It processed 3.7 TB and asked for a second tape. The processing rate was 77 MB/s.

How can it process 3.7 TB and ask for a second tape when the previous job to disk was 3.3 TB?
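As a rough sanity check on the reported rate, the implied job duration can be worked out from the figures above (a back-of-the-envelope sketch, assuming binary units):

```python
# Back-of-the-envelope check: duration implied by the reported figures.
MB_PER_TB = 1024 ** 2  # assuming binary units (1 TB = 1,048,576 MB)

processed_mb = 3.3 * MB_PER_TB  # 3.3 TB processed
rate_mb_s = 70                  # ~70 MB/s processing rate

hours = processed_mb / rate_mb_s / 3600
print(f"~{hours:.1f} hours")    # ~13.7 hours for the full run
```

So a 3.3 TB full at that rate is roughly a 14-hour window, which is worth comparing against the actual job duration.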

Should I have the tape job set up to duplicate from the backup job or from the StoreOnce repository?

Thanks for any help in advance
Sbishop
PTide
Product Manager
Posts: 6551
Liked: 765 times
Joined: May 19, 2015 1:46 pm
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by PTide »

Hi,
Is 70 MB/s reasonable?
That depends on the backup mode you use. You can review your bottleneck statistics to determine the slowest component.
How can it process 3.7 TB and ask for a second tape when the previous job to disk was 3.3 TB?
If you specified a repository as the source and some other job has put data there, then your backup-to-tape job will pull all that new data as well. Also, full backups of those 18 VMs could grow in size.
Should I have the tape job set up to duplicate from the backup job or from the StoreOnce repository?
That depends on how many other backup jobs are pointing to that repository. If none, then you can use the whole repository as the source.
sbishop
Influencer
Posts: 17
Liked: never
Joined: Nov 01, 2015 1:20 pm
Full Name: Steve Bishop
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by sbishop »

There is only one backup job to that repository. But the job that ran over the weekend wasn't sourced from the repository but from the backup job that completed with 3.3 TB... shouldn't the duplicate to tape be the same size? That's what is confusing: the tape job ran until 3.7 TB and asked for a second tape.

I'm testing it again right now, but I switched the backup files source to "backup repositories" instead of the backup job... not sure if that would make a difference or not.

Thanks
PTide
Product Manager
Posts: 6551
Liked: 765 times
Joined: May 19, 2015 1:46 pm
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by PTide »

I'm testing it again right now, but I switched the backup files source to "backup repositories" instead of the backup job... not sure if that would make a difference or not.
Switching to the "Backup repository" source will make your job pull the whole repo, so your backup to tape may only get bigger, not smaller.
There is only one backup job to that repository. But the job that ran over the weekend wasn't sourced from the repository but from the backup job that completed with 3.3 TB... shouldn't the duplicate to tape be the same size? That's what is confusing: the tape job ran until 3.7 TB and asked for a second tape.
In other words, you had 18 VMs backed up to the repository and the resulting .vbk file was 3.3 TB in size, whereas the backup-to-tape job that used the backup job as a source demanded 3.7 TB, and no other backups were made since the single backup job run that produced the 3.3 TB file, is that correct?
sbishop
Influencer
Posts: 17
Liked: never
Joined: Nov 01, 2015 1:20 pm
Full Name: Steve Bishop
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by sbishop »

Correct.
veremin
Product Manager
Posts: 20400
Liked: 2298 times
Joined: Oct 26, 2012 3:28 pm
Full Name: Vladimir Eremin
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by veremin »

How did you determine the resulting .vbk size? By looking at how much space it occupies on dedupe storage?
sbishop
Influencer
Posts: 17
Liked: never
Joined: Nov 01, 2015 1:20 pm
Full Name: Steve Bishop
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by sbishop »

The job history.

Job 1 - full VM backup to StoreOnce - processed 3.3 TB, read 3.2 TB, transferred 3.0 TB

Job 2 - duplicate of the above job to tape - processed 3.7 TB, read 3.7 TB, transferred 3.7 TB... tape 2 is full, insert a valid tape.
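The puzzling part of those two summaries is just the ratio between them; a trivial check of the numbers from the job history above:

```python
# Ratio of data moved by the tape job vs. data produced by the disk job.
disk_tb = 3.3   # processed by the backup job to StoreOnce
tape_tb = 3.7   # processed/read/transferred by the tape job

ratio = tape_tb / disk_tb
print(f"tape/disk = {ratio:.2f}")  # tape/disk = 1.12, i.e. ~12% more data
```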
veremin
Product Manager
Posts: 20400
Liked: 2298 times
Joined: Oct 26, 2012 3:28 pm
Full Name: Vladimir Eremin
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by veremin »

If that's the only backup job selected as a source for the backup-to-tape job, and the .vbk is the only file created by the source job, then the behaviour looks unexpected and should be investigated by the support team. Thanks.
Dave338
Enthusiast
Posts: 40
Liked: 5 times
Joined: Jan 27, 2015 12:21 pm
Full Name: David
Contact:

[MERGED]: Copying 2.1 TB requires 3 LTO5 tapes

Post by Dave338 »

Hello, I'm writing here to get your opinions.

I have a monthly backup to tape and a weekly one. My weekly one is about 1.6-1.7 TB and requires two tapes, all normal here... but my monthly one, which is 2.1 TB, is asking for a third tape.

I have tested with and without hardware compression and the result is the same.
Two tapes show full usage of their 1.4 TB (4.6 and 4.7 GB left), and the third shows 1.2 TB left, so the total usage on tape is about 3 TB, while the actual size on disk is 2.1 TB (the size that the job reports copying to tape).

This is very inconvenient. If this behaviour continues when I add new virtual machines (soon I'll add a file server), I'll need four tapes and it will break my backup strategy...
The tape backup job is not configured to process any incrementals; it simply takes the full backup of my jobs (I use reverse incremental, so it always copies the latest full).

Does anyone know why this is happening?

I also use the same library (an MSL4048 with 2 LTO5 drives) in a Backup Exec environment, and it has always worked flawlessly, with high hardware compression. I read that if the backup files are already compressed, hardware compression does nothing or even yields negative compression; that's why I was not using hardware compression in Veeam, but turning it on gives the same result.
Tapes are natively 1.5 TB (3 TB compressed).
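The tape-count arithmetic behind this can be sketched as follows (a hypothetical helper, assuming native LTO-5 capacity and that pre-compressed backup files gain nothing from hardware compression):

```python
import math

def tapes_needed(data_tb: float, capacity_tb: float = 1.5) -> int:
    """Tapes required to hold a given amount of data at a given per-tape capacity."""
    return math.ceil(data_tb / capacity_tb)

print(tapes_needed(2.1))       # 2 - expected at the native 1.5 TB per tape
print(tapes_needed(3.0, 1.4))  # 3 - observed: ~3 TB written, ~1.4 TB usable per tape
```

So the third tape only comes into play because roughly 3 TB ends up on tape instead of the 2.1 TB reported on disk.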

Any info would be appreciated. :)

Kind regards.
PTide
Product Manager
Posts: 6551
Liked: 765 times
Joined: May 19, 2015 1:46 pm
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by PTide »

Hi,

You've been merged into a topic on a similar issue. Please review the thread, and if the situation described matches yours, kindly open a support case (don't forget to post your case ID here).

Thank you.
Dave338
Enthusiast
Posts: 40
Liked: 5 times
Joined: Jan 27, 2015 12:21 pm
Full Name: David
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by Dave338 »

It's not the same issue.

In my job, the processed, read, and transferred data are all 2.1 TB, but on tape it is filling more than two tapes, about 3 TB.

I've checked previous jobs, and last month, for the same amount of data processed (2.1 TB), it fit on two tapes by a 52 GB margin, which was also bad (it should have left about 800-900 GB free on the second tape), but this month the same amount of data occupied about 250 GB more tape... the only difference is that I updated Veeam to the latest version.

I don't know if these things are possible with tape, but I've never faced this kind of problem writing to tape with other backup suites ("negative" compression).

Regards.
PTide
Product Manager
Posts: 6551
Liked: 765 times
Joined: May 19, 2015 1:46 pm
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by PTide »

OK, that does seem unexpected - please open a case with support so they can take a closer look at your environment, and post your case ID here.

Thank you.
Dave338
Enthusiast
Posts: 40
Liked: 5 times
Joined: Jan 27, 2015 12:21 pm
Full Name: David
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by Dave338 »

I've just opened a case with this issue ;)

01204821

Let's see if anything can be done about it :)

Regards.
alanbolte
Veteran
Posts: 635
Liked: 174 times
Joined: Jun 18, 2012 8:58 pm
Full Name: Alan Bolte
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by alanbolte » 2 people like this post

Because many tape devices do a poor job of properly reporting the end of a tape (hardware EOM), by default we repeatedly try to check the capacity of the tape while writing it. Unfortunately, sometimes this also has unexpected results, and we end up detecting that the tape has little free space even though it is only partially written. Erasing the tapes or cleaning the tape drives can sometimes alleviate the issue. You can also request a registry value from Support that will disable this behavior (but on some devices will produce corrupt tapes without any warning).

At least, this is how I recall it working; I don't have specific internal documentation handy.

Also, you mentioned that you have multiple jobs, and only one job is giving you trouble. I recommend sending Support logs from a known-good job so they can compare it against the job that's using too many tapes.
Dave338
Enthusiast
Posts: 40
Liked: 5 times
Joined: Jan 27, 2015 12:21 pm
Full Name: David
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by Dave338 »

If your answer is for me: all my tape jobs (I have two of them) have the same issue.

I detected yesterday (and sent it by email to support) that this started happening in July. The two previous monthly backups reported correct tape usage, 1.8 TB of data on disk to 1.8 TB on tape, but it suddenly started to grow on tape. July's backup processed 1.9 TB and used 2.4 TB on tape.

I've been using new tapes every month because the environment is new this year and we have renewed all tapes, so it's not an overwriting problem.

I'm trying new firmware on my library (MSL4048) to see if anything changes.
Regards.
Dave338
Enthusiast
Posts: 40
Liked: 5 times
Joined: Jan 27, 2015 12:21 pm
Full Name: David
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by Dave338 »

Good morning.

After running some tests with the support team, updating drivers and firmware, etc., it seemed my job was working well, and the tape usage corresponded to the backup usage on disk. I deleted and recreated my weekly backup to clear any leftover weird data from older updates or something...

Well, the first iteration worked well, but one week later, on the second iteration, my job processed 1.6 TB of data and occupied 2 TB on tape again. My job always copies only the last full backup (reverse incremental backups and no incremental processing on the tape job), so it's not an incremental or synthetic issue.

This week I'll try manually erasing the tapes before the tape job starts, to see if it makes any difference, but it is an annoying problem.
I've also manually set the block size to 256 KB (the default for my tapes is 128 KB).
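For what it's worth, larger tape blocks do reduce per-block overhead, though the effect is far too small to explain a 25% inflation on its own. A hypothetical illustration (the fixed per-block overhead figure is an assumption for the sketch, not an LTO specification):

```python
def on_tape_gb(data_gb: float, block_kb: int, overhead_bytes_per_block: int = 512) -> float:
    """Approximate on-tape footprint: payload plus a fixed overhead per written block."""
    blocks = data_gb * 1024 ** 2 / block_kb          # payload size in KB / block size
    return data_gb + blocks * overhead_bytes_per_block / 1024 ** 3

# Doubling the block size halves the block count, hence the overhead:
print(on_tape_gb(1600, 128))  # 1606.25 GB with 128 KB blocks
print(on_tape_gb(1600, 256))  # 1603.125 GB with 256 KB blocks
```

Under these assumptions the gain from 256 KB blocks is only a few GB per 1.6 TB, so the 400 GB discrepancy reported above must come from somewhere else (e.g. the early end-of-media detection mentioned earlier in the thread).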

Regards.
veremin
Product Manager
Posts: 20400
Liked: 2298 times
Joined: Oct 26, 2012 3:28 pm
Full Name: Vladimir Eremin
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by veremin »

It might make sense to reach out to the device manufacturer as well, just to get the issue (a file occupying ~125% of its on-disk size on tape) covered from different angles. Thanks.
Dave338
Enthusiast
Posts: 40
Liked: 5 times
Joined: Jan 27, 2015 12:21 pm
Full Name: David
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by Dave338 »

But my library works perfectly with Backup Exec. It compresses, and the amount of data written is correct.

Now it's running a monthly backup and it is doing worse than ever, occupying two entire tapes for a bit more than 1.6 TB, and it left 480 GB free on the first tape :shock: :shock:

Regards.
veremin
Product Manager
Posts: 20400
Liked: 2298 times
Joined: Oct 26, 2012 3:28 pm
Full Name: Vladimir Eremin
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by veremin »

As has been mentioned, the information regarding free/occupied space is taken directly from the device, hence my recommendation to contact the device manufacturer. Also, it's worth double-checking whether the behaviour you experienced wasn't the result of a firmware or driver update.

Anyway, did you keep working with the support team? What was their final answer?

Thanks.
Dave338
Enthusiast
Posts: 40
Liked: 5 times
Joined: Jan 27, 2015 12:21 pm
Full Name: David
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by Dave338 » 2 people like this post

We are in talks...

He detected some errors (Veeam-related) about the tape size not being detected properly and applied a hotfix on my system. We will wait until the next monthly run to see if the errors persist or the situation improves.

Regards.
David.
Dima P.
Product Manager
Posts: 14720
Liked: 1703 times
Joined: Feb 04, 2013 2:07 pm
Full Name: Dmitry Popov
Location: Prague
Contact:

Re: Duplicate job to tape - not same amount of processed data

Post by Dima P. »

Hello and Happy New Year David!

Thanks for the heads up - do not forget to post the testing results.
