Hi
I was wondering if there is some way of configuring the DTU of the database that is copied over to the staging server during an Azure SQL DB backup.
The Azure SQL backup process is quite intensive and currently maxes out the DTU at 100% consistently. Because of this, the backup takes well over an hour to complete, but our requirement is to have a backup every hour. The limiting factor seems to be the DTU, because if I manually increase it (double or triple), I can bring the runtime down to a suitable timeframe.
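For reference, the manual scale-up I mentioned can be scripted around the backup window. A minimal sketch using the azure-mgmt-sql Python SDK; the resource names and the S3/S6 service objectives are placeholders, not our actual values:

```python
# Sketch: scale an Azure SQL DB up before the backup window and back down after.
# Resource names and service objectives below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.sql import SqlManagementClient
from azure.mgmt.sql.models import DatabaseUpdate, Sku

SUBSCRIPTION_ID = "<subscription-id>"
RG, SERVER, DB = "backup-rg", "sql-server-name", "database-name"

client = SqlManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

def scale_to(service_objective: str, tier: str = "Standard") -> None:
    """Change the database's service objective (DTU level) and wait for it to apply."""
    poller = client.databases.begin_update(
        RG, SERVER, DB,
        DatabaseUpdate(sku=Sku(name=service_objective, tier=tier)),
    )
    poller.result()  # block until the scale operation completes

scale_to("S6")   # e.g. 400 DTU for the backup window
# ... backup runs here ...
scale_to("S3")   # back to 100 DTU afterwards
```

The obvious downside is that this scales the whole transactional DB rather than just the staged copy, which is exactly what we are trying to avoid paying for.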
Thanks
Chen Ko (Novice) · Posts: 3 · Liked: 1 time · Joined: May 09, 2024
Lyudmila Ezerskaya (Veeam Software) · Posts: 170 · Liked: 54 times · Joined: Oct 04, 2021
Re: Configuring DTU or DB Capacity for Azure SQL Backup using Staging Server
Hi! Currently, there is no option to configure DTU for the Azure SQL backup process. We will explore adding this functionality, but I cannot provide any ETA at this time.
If you could share some more details about your infrastructure setup, we could use them to shape future testing and plans for this integration.
Chen Ko (Novice) · Posts: 3 · Liked: 1 time · Joined: May 09, 2024
Re: Configuring DTU or DB Capacity for Azure SQL Backup using Staging Server
Hi
Thanks for confirming.
Our infrastructure is relatively simple. Using a hub-and-spoke model, we have a number of subscriptions in Azure with resources we need to back up. A separate subscription is used to manage all backups, with repositories normally stored elsewhere. Currently we are using Veeam to back up some of our Azure SQL databases; if this is successful, we will move on to other resource types, and then to setting this up in the other cloud providers we use.
For our databases, we use a staging server for stability, as we tend to find that backups done in situ max out the DTUs of our transactional DB. The backup process is generally quite intensive, in fact many times more so than our day-to-day transactional needs, so we have found that we either need to increase the DTU (3 to 4 times) on the original server prior to the backup, or find some way of increasing it when the DB is copied over to the staging server so the backup can run efficiently. Obviously it would be more cost effective if there were a method of controlling the DTU on the copied DB in the staging server.
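In case it is useful for your testing: the DTU saturation we see during a backup run can be confirmed from the monitor metrics API. A minimal sketch using the azure-monitor-query Python package, with the resource ID as a placeholder:

```python
# Sketch: check peak DTU utilisation over the last backup window.
# The resource ID below is a placeholder for the transactional database.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import MetricsQueryClient, MetricAggregationType

RESOURCE_ID = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Sql/servers/<server>/databases/<database>"
)

client = MetricsQueryClient(DefaultAzureCredential())
response = client.query_resource(
    RESOURCE_ID,
    metric_names=["dtu_consumption_percent"],
    timespan=timedelta(hours=1),           # one backup cycle
    granularity=timedelta(minutes=5),
    aggregations=[MetricAggregationType.MAXIMUM],
)

for point in response.metrics[0].timeseries[0].data:
    print(point.timestamp, point.maximum)  # 100.0 means the DB is pegged
```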
Thanks