That tidied things up a bit, but the email repository is heading north of 2TB and growing at a steady clip. Just as I was starting to think about splitting the email repository by creating a new repo for each year's incoming data, I got asked to carve it up by company within the group of companies that share this infrastructure.
So, I now need to:
a) dust off the trusty old Move-VBOEntityData cmdlet so that I can migrate all the data for {someuser@company_a.com} into their own repo.
b) iterate over the other ~50 users at 'Company A'.
c) figure out whether it's more effective to migrate the remaining users into a new repository of their own too, or whether there's a utility to compact the JET database that VBO uses under the hood.
d) if possible, calculate ahead of time how much data will be moved into the 'Company A' repository, so I can allocate the right amount of storage (and the right amount of associated cost) for Company A's usage.
I think I can handle a) and b) already, but if someone has a script that accomplishes something similar and is happy to share it, I'm always glad not to re-invent the wheel.
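For reference, here's the rough shape of what I have in mind for a) and b). This is an untested sketch: the repository names and the user-list file path are placeholders for our environment, and the property I'm matching on (`Email`) may differ on the objects Get-VBOEntityData returns, so check with Get-Member first.

```powershell
# Untested sketch - repo names and file path are placeholders.
Import-Module Veeam.Archiver.PowerShell

$source = Get-VBORepository -Name "Main Repo"
$target = Get-VBORepository -Name "Company A Repo"

# One UPN per line, e.g. someuser@company_a.com
$users = Get-Content "C:\temp\company_a_users.txt"

foreach ($upn in $users) {
    # Find the user's backed-up data in the source repository.
    # NB: the matching property may be named differently - verify with Get-Member.
    $entity = Get-VBOEntityData -Type User -Repository $source |
        Where-Object { $_.Email -eq $upn }
    if ($null -eq $entity) {
        Write-Warning "No backed-up data found for $upn - skipping"
        continue
    }
    Write-Host "Moving $upn ..."
    Move-VBOEntityData -Repository $source -To $target -User $entity
}
```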

For d) I see there's a cmdlet that will measure the size of the data 'in cloud' that would need to be protected when backing up a given organisation, but is there anything along the lines of a 'Measure-VBOEntityData' that could achieve this without having to go through the whole data migration exercise first? I'm wincing at the idea of doing this on a 2TB dataset, let alone some of the far larger datasets I know some of y'all have out there.
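In the meantime, the closest workaround I can think of for d) is summing the cloud-side mailbox sizes for the Company A users and treating that as a rough upper bound on what would land in the new repo (it ignores whatever compression the repository gets, and any retention-window differences). Untested sketch using the Exchange Online module; the user-list file is the same placeholder as above:

```powershell
# Untested sketch - estimates from cloud-side mailbox sizes, not repo sizes.
Import-Module ExchangeOnlineManagement
Connect-ExchangeOnline

$users = Get-Content "C:\temp\company_a_users.txt"
$totalBytes = 0

foreach ($upn in $users) {
    $stats = Get-EXOMailboxStatistics -Identity $upn
    # TotalItemSize renders like "1.23 GB (1,321,702,400 bytes)" - pull the byte count
    if ($stats.TotalItemSize -match '\(([\d,]+) bytes\)') {
        $totalBytes += [int64]($Matches[1] -replace ',', '')
    }
}

"{0:N2} GB across {1} mailboxes" -f ($totalBytes / 1GB), $users.Count
```

If anyone knows how the repository size actually compares to raw mailbox size in practice, I'd take that ratio over this guesswork.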
Thanks,
Nathan