I currently have a SOBR set up, and we want to offload its capacity tier to Azure Blob Storage along with the GFS archive tier capability. My question: is it best practice, when setting up the repositories, to have a separate storage account for capacity objects and for archive objects, or to use the same storage account with separate containers?
I do not currently have a support ticket, however after some discussions with our account rep I was asked to see if I could get an answer here.
- Hannes Kasparick, Product Manager (Austria)
Re: Storage account questions for SOBR using Azure blob storage
Hello,
Yes, that's a "how" / best-practice question, so it belongs here on the forums (support is for technical problems only).
As far as I know, there are no official best practices written down anywhere, which suggests the choice is relatively unimportant (you don't mention the amount of data in your question, so at your scale it probably doesn't matter). I would personally go with multiple storage accounts, simply because that raises the effective ceiling on the maximum values that exist per storage account. That makes the setup future-proof, and I don't need to think about it again.
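For what it's worth, the "one storage account per tier" layout described above can be provisioned with a few Azure CLI commands. This is just a sketch; the resource group, region, and account/container names below are hypothetical placeholders, and you would adjust SKU and redundancy to your own requirements.

```shell
# Hypothetical names -- substitute your own resource group, region, and naming scheme.
RG=veeam-sobr-rg
LOCATION=westeurope

# One storage account per tier keeps each account's scalability targets
# (total capacity, request rate, bandwidth) independent of the other tier.
az storage account create --name veeamcapacity01 --resource-group "$RG" \
    --location "$LOCATION" --sku Standard_LRS --kind StorageV2

az storage account create --name veeamarchive01 --resource-group "$RG" \
    --location "$LOCATION" --sku Standard_LRS --kind StorageV2

# One blob container in each account for Veeam to target.
az storage container create --account-name veeamcapacity01 \
    --name capacity-tier --auth-mode login
az storage container create --account-name veeamarchive01 \
    --name archive-tier --auth-mode login
```

With this split, hitting a per-account limit on the capacity tier account can never affect the archive tier account, which is the "future proof" property mentioned above.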
Best regards,
Hannes