PowerShell script exchange
Ve2Luk

Missing sFull size

Post by Ve2Luk »

Hello,
recently we've come across a slight issue where the sFull size is missing from the monthly reports and only the incremental job size is covered. Support will address this as a feature request in the future.

In the meantime, is there a way to pull the sFull size together with the incremental size using a PowerShell script, from VeeamOne or at least from VBR? We want to be able to create a monthly report with the real size of all backup jobs per client.

regards,
HannesK
Product Manager

Re: Missing sFull size

Post by HannesK »

Hello,
just to be sure... what does "sFull" size mean? I cannot find that term anywhere on the forums or in the user guide, and an internet search turns up nothing useful either. Can you please provide the support case number so we can get the details of what the request is about?

When you talk about clients... is it about Cloud Connect?

What is "real size" of a backup job? The amount of data that was stored on a repository during a month (sum up every day "transferred" value)? Or is the the space that a job occupies at the end of the month (imagine a backup job with 7 restore points that has less than 30 restore points)? Or something else?

For anyone to help you, more details would be useful.

Best regards,
Hannes
david.domask
Veeam Software

Re: Missing sFull size

Post by david.domask »

Do you maybe mean Synthetic Full size? Which report are you discussing? Perhaps you can share a case number as I'm not aware of any known issues with any reports :(
David Domask | Product Management: Principal Analyst
HannesK
Product Manager

Re: Missing sFull size

Post by HannesK » 1 person likes this post

If it is about synthetic full, then the next question would be which file system is used... XFS / ReFS? Because in that case this value is not available in VBR.
Ve2Luk

Re: Missing sFull size

Post by Ve2Luk »

Hello,
my apologies for the confusion and not being clear.

Yes, it's the Synthetic Full job. Whenever there's one job protecting multiple VMs with per-machine backup files and a forever incremental configuration, the sFulls are omitted from all the reports.

Yes, formally there's no sFull job with that configuration, but in practice there's something which looks like an sFull, created to close the cycle and meet the retention. At the backup repo we can see a single huge file per job, created periodically, at near full size. It's not covered in any report since formally it's not a scheduled job, but it is physically consuming storage space. "Backups on Repository" doesn't show any values for these jobs/VMs, and "Job History" shows only the runs triggered by the schedule, i.e. the incrementals, and omits the sFulls.

So is there a way to pull, with PowerShell from VeeamOne or VBR, the actual backup size consumption per VM, i.e. to list all the files at the backup repository?
david.domask
Veeam Software

Re: Missing sFull size

Post by david.domask »

Hey Lukasz,

Got it.

As Hannes mentioned, this can be a bit complex if XFS/ReFS is involved, since the "actual" space usage isn't tracked: the block cloning magic hides it from the OS. Basically, when we poll for it, it will show as a "full" file size, meaning the full backup size, even if physically it uses less space. You'll see similar behavior with dedupe appliances (Data Domain, StoreOnce, etc.).

So first question: Do you use XFS/ReFS or dedupe?

If not, then you can just use the GetAllChildrenStorages() method on a Backup object to get a list of the storages (backup files); their size data is under the Stats property.
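
A minimal sketch of that approach (the PartialPath and Stats.BackupSize property names are assumptions on my part; verify them with Get-Member against your VBR version):

# Run in a PowerShell session on the VBR server with the Veeam module loaded.
$backup = Get-VBRBackup -Name "My Backup Job"    # hypothetical job name
$backup.GetAllChildrenStorages() |
    Select-Object @{ N = 'File';   E = { $_.PartialPath } },
                  @{ N = 'SizeGB'; E = { [math]::Round($_.Stats.BackupSize / 1GB, 2) } }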
David Domask | Product Management: Principal Analyst
Ve2Luk

Re: Missing sFull size

Post by Ve2Luk »

Hello David,
thank you for the reply. It's not about ReFS but an SMB share in this case; it's rather about the whole approach based on:
* Forever incremental
* No synthetic full in the schedule (but something similar is forced by the system according to retention)
* Per-VM configuration
and the missing information about the full backup size in the total repo consumption. There are actually more files on the backup repo than are covered by the reports.

I'll set up a lab and give GetAllChildrenStorages() or other commands a try.
david.domask
Veeam Software

Re: Missing sFull size

Post by david.domask » 1 person likes this post

Heya Lukasz,

Ah, thanks for clarifying, but in this case I'm actually not quite sure what you're trying to track.

With Forever Forward Incremental, there is no full generation process; the oldest increment in the backup chain is merged live into the full backup file. No "extra" full is made.

Perhaps you are discussing Compact Fulls?
There are actually more files on the backup repo than are covered by the reports.
Can you maybe show me what you mean? Do you mean that the number of backup files on disk is not the same as what is set as retention? If so, I really recommend opening a support case so we can review why there is a discrepancy: with forever forward incremental, what you set as retention should be what you see on disk, and if not, then something is not configured right or not behaving quite as expected.
David Domask | Product Management: Principal Analyst
Ve2Luk

Re: Missing sFull size

Post by Ve2Luk »

Hello David,
I'll set up a lab in the next few days and upload some screenshots to show where the inconsistency is.
Ve2Luk

Re: Missing sFull size

Post by Ve2Luk »

Hello,
so the whole issue is about forever incremental. Regardless of the configuration in use, as long as there's no synthetic full in place, the "hidden" full of forever incremental is not shown in any backup capacity report. This leads to a situation where you cannot accurately measure the capacity consumed by each protected server. Unless you check it at the storage level, but there you cannot get the values for each protected server, only for the whole backup job.

So is there a way to get, with PowerShell or any other tool, accurate information about the size consumed by each server in the forever incremental scenario?

regards,
Łukasz.
david.domask
Veeam Software

Re: Missing sFull size

Post by david.domask »

Hi Lukasz,

Can you explain, though, which hidden full are you referring to? Forever forward incremental never produces a full backup, not even a hidden one. The only thing I can think of is the Compact Full, but I don't think that is what you mean to track.

To be clear, when the merge happens, no extra file is created -- the data is injected into the existing full backup, so it's an "in-place" update. It's just not quite clear what you are seeing on disk/in the UI that you don't see in PowerShell.
David Domask | Product Management: Principal Analyst
Ve2Luk

Re: Missing sFull size

Post by Ve2Luk »

Hi David,
I wasn't familiar with that term, but yes, you are right, it's about the Compact Full.
And again, yes, it's already in place, but you won't see the compact full size in any report; that's the issue.


So let's consider a test scenario to help you understand what's missing.

There's a job for 3 VMs, configured as forever incremental with 30 days of retention.

For the first 30 days, from the reports it's possible to get the size of 1x initial full and 29x incrementals. Let's say it's 20 GB.
So you know the real backup storage consumption for each of the protected VMs at this point.

From day 31 to 61 (and onwards), with the reports you can only get information about the incrementals, because the Compact Full is not a scheduled job. So you would be able to see 30x incremental runs. Let's say that's 5 GB for each VM.

The actual size taken at the backup repository for each VM would be the size of the Compact Full plus the incrementals; let's assume the real backup consumption is 25 GB for each VM, so 75 GB in total. From the reports we won't be able to see that; only the size of the incrementals is reported, which is 15 GB.

So, the use case is to get the real backup repository consumption per server, VM, etc. on the 1st of each month. Ideally covering only the runs from the last 30 days, but it must cover the compact full size as well.

"Job History" report is not showing that data and "Backups on Repository" is mostly broken, not showing Full Backups Size, Increments Size and Total Backup Size, so there's no gain in using that.

The question is: is it possible, with scripts or other tools, to get the real backup repository consumption per server/VM for the scope of the retention or the last 30 days?

regards,
Łukasz.
david.domask
Veeam Software

Re: Missing sFull size

Post by david.domask »

Hi Lukasz,

Okay, makes sense now:

Compact full will always be the actual space used by the backup on disk after it runs -- compact just defragments the backup file and removes the "empty" or no longer needed VM data (e.g., for VMs no longer backed up when you have non-per-VM backups).

So in fact, nothing changes; after a job session when a compact is run, simply fetch information about the Restore Points and the Full backup data as you usually would -- this will reflect the actual state of the chain on disk.

If you don't want it through the RestorePoints data, fetch the Backup with Get-VBRBackup and use the GetAllChildrenStorages() method as mentioned before.
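
A rough sketch of a per-VM rollup for the last 30 days along those lines (it assumes the restore point objects expose VmName, CreationTime and a GetStorage() method whose Stats includes BackupSize; verify the names with Get-Member on your version):

# Sum the on-disk size of each VM's restore points created in the last 30 days.
$since  = (Get-Date).AddDays(-30)
$backup = Get-VBRBackup -Name "My Backup Job"    # hypothetical job name
$backup | Get-VBRRestorePoint |
    Where-Object { $_.CreationTime -ge $since } |
    Group-Object -Property VmName |
    ForEach-Object {
        $bytes = ($_.Group | ForEach-Object { $_.GetStorage().Stats.BackupSize } |
                  Measure-Object -Sum).Sum
        [pscustomobject]@{
            VM            = $_.Name
            RestorePoints = $_.Count
            OnDiskGB      = [math]::Round($bytes / 1GB, 2)
        }
    }

Keep in mind the XFS/ReFS/dedupe caveat above: on such repositories the reported sizes are the logical file sizes, not the physical space actually used.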

Compact full is nothing special here: it just creates a second, compacted full backup, then links the increments to it, and the original full is cleared after the compacted full is successfully created. So there isn't a special property for it; the Compacted Full _becomes_ the full backup at the start of the Forever Forward chain.
David Domask | Product Management: Principal Analyst