Discussions related to using object storage as a backup target.
Post Reply
xorman
Novice
Posts: 4
Liked: never
Joined: Feb 09, 2019 12:41 am
Contact:

Capacity Tier for DAS

Post by xorman »

We have some backups that are a few years old and would like to migrate them to a separate repository.
Say I have cheaper storage for less critical data, and I'm looking for a way to continuously migrate full backups older than X days to a dedicated DAS. A kind of Capacity Tier for DAS.

For internal reasons, all cloud solutions are out of scope at this point.

Is there a built-in way to do that?


Thanks.
nielsengelen
Product Manager
Posts: 5618
Liked: 1177 times
Joined: Jul 15, 2013 11:09 am
Full Name: Niels Engelen
Contact:

Re: Capacity Tier for DAS

Post by nielsengelen »

Currently the cloud tier only supports object storage so to achieve this you would need to leverage backup copy jobs.
Personal blog: https://foonet.be
GitHub: https://github.com/nielsengelen
xorman
Novice
Posts: 4
Liked: never
Joined: Feb 09, 2019 12:41 am
Contact:

Re: Capacity Tier for DAS

Post by xorman »

A backup copy job wouldn't help in my case, as I'm looking for a way to move existing backups from a number of repositories to a new one.
Are there any PowerShell scripts available for this type of task?
veremin
Product Manager
Posts: 20270
Liked: 2252 times
Joined: Oct 26, 2012 3:28 pm
Full Name: Vladimir Eremin
Contact:

Re: Capacity Tier for DAS

Post by veremin »

You should be able to script a file copy utility (robocopy or similar) to move data that falls outside the specified timeframe, but that will most likely confuse the backup server: its configuration database will still hold references to restore points that have already been moved to a different location. Thanks!
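For illustration, a minimal sketch of such a move-by-age script in Python (the paths, the age threshold, and the `.vbk` filter are placeholders; it carries the same caveat, since VBR's configuration database will not know about files moved this way):

```python
import os
import shutil
import time

def move_old_fulls(src_dir, dst_dir, min_age_days):
    """Move full backup files (*.vbk) older than min_age_days from
    src_dir to dst_dir. Caveat: the backup server keeps references
    to restore points moved this way and will be confused by it."""
    cutoff = time.time() - min_age_days * 86400
    os.makedirs(dst_dir, exist_ok=True)
    moved = []
    for name in os.listdir(src_dir):
        path = os.path.join(src_dir, name)
        # Only plain files matching the full-backup extension and
        # last modified before the cutoff are moved.
        if (name.lower().endswith(".vbk")
                and os.path.isfile(path)
                and os.path.getmtime(path) < cutoff):
            shutil.move(path, os.path.join(dst_dir, name))
            moved.append(name)
    return sorted(moved)
```

Usage would be something like `move_old_fulls(r"D:\Backups\Repo1", r"E:\DAS\Archive", 365)`, run on a schedule; robocopy with `/MOV` and `/MINAGE` achieves the same effect without scripting.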
xorman
Novice
Posts: 4
Liked: never
Joined: Feb 09, 2019 12:41 am
Contact:

Re: Capacity Tier for DAS

Post by xorman »

That's unfortunate.

I was hoping there would be something more advanced than just a file copy utility.


Thanks for your response.
Gostev
Chief Product Officer
Posts: 31456
Liked: 6647 times
Joined: Jan 01, 2006 1:01 am
Location: Baar, Switzerland
Contact:

Re: Capacity Tier for DAS

Post by Gostev »

Regular file systems will not scale for our needs, so DAS is not really an option outside of small environments (we actually do have code that enables leveraging regular file systems for the Capacity Tier; we used it internally for dev labs while the S3 client was being implemented).

Did you consider putting object storage on top of your DAS? There are plenty of free and commercial solutions for this.
xorman
Novice
Posts: 4
Liked: never
Joined: Feb 09, 2019 12:41 am
Contact:

Re: Capacity Tier for DAS

Post by xorman »

Thanks, Gostev.
That's actually a fresh idea.

I'll take a look at object storage solutions for local storage.
jmmarton
Veeam Software
Posts: 2092
Liked: 309 times
Joined: Nov 17, 2015 2:38 am
Full Name: Joe Marton
Location: Chicago, IL
Contact:

Re: Capacity Tier for DAS

Post by jmmarton »

Take a look at Minio, an open source S3 object storage solution.

https://www.minio.io/

You can run it on either Windows or Linux and present any on-premises storage to VBR as S3 object storage for use as the Capacity Tier. I haven't set it up in my lab yet, but a few colleagues have, in order to demonstrate the Capacity Tier without an Azure Blob or Amazon S3 account.

Joe
Gostev
Chief Product Officer
Posts: 31456
Liked: 6647 times
Joined: Jan 01, 2006 1:01 am
Location: Baar, Switzerland
Contact:

Re: Capacity Tier for DAS

Post by Gostev »

Linux only with Minio, please; see the sticky compatibility list ;)
