The following is the second in a series of posts about the Azure SQL Database DTU Calculator. In this post, I’ll cover some recent updates I’ve made as well as an overview of using the calculator for either a single database or a pool of databases.
Running a single SQL Server database in Azure has been supported for a few years, but the concept of a Database Throughput Unit (DTU) is relatively new. The Azure SQL team first introduced the concept after some customers complained about performance when using the Azure SQL Web and Business editions while other customers raved about it. You can guess which neighbors were the noisy ones. To give all Azure customers a more predictable level of performance, the Azure SQL team introduced a new set of Service Tiers and Performance Levels. I won’t go into all the features of Service Tiers and Performance Levels, but it is worth mentioning that each tier and level has an associated cost. If you want more performance, you pay more. If you want less… you get the point.
Introducing tiers and levels to provide more reliable performance was a great first step, and it works really well for a single database that requires consistent performance. However, many customers have more than a single database, and in many cases the performance demands of those databases aren’t consistent: periods of high intensity are followed by periods of low intensity. Initially, developers would need to create a separate Azure SQL Database for each workload and manage the peaks and lulls of each database separately. This type of database management was complicated and not cost effective when working with several databases or more.
At Build 2015, Microsoft announced the preview of Azure SQL Elastic Databases to support unpredictable database demands and reduce the complexity of managing multiple databases. The on-demand performance scaling of individual databases within an elastic database pool is possible because each database in a pool draws eDTUs from a shared set associated with the pool, allowing databases that require more performance to consume more eDTUs while databases requiring less consume fewer.
Regardless of whether they’re running a single database or a pool of databases, developers migrating to Azure still need to know how their databases will perform after they migrate. The good news is that Azure SQL Database allows you to scale your database up or down depending on your needs. The bad news is that all of this happens post-migration, so it’s difficult to get a sense of cost without knowing the Service Tier and Performance Level your workload requires. That’s where the DTU Calculator can help.
Single Database or Multiple Databases
The instructions for using the DTU Calculator are straightforward, so I’ll focus on some of the things I don’t mention on the main page of the calculator. First, you’ll need to decide whether you’re going to run the calculator for a single database or multiple databases. Running the calculator for a single database means your SQL Server has only one database, and that database is what you plan to migrate to Azure. If your SQL Server has more than one database but you only plan to migrate one of them, you’ll either need to modify the PowerShell script I provide on the calculator’s main page, or accept that the calculator will likely overestimate your workload, because the script captures totals for CPU, IOPS, and Log for the whole server, not for individual databases.
If your SQL Server has multiple databases and you plan to migrate all of them to Azure, you can run the PowerShell script as provided. The calculator also supports multiple databases on multiple servers; however, you will need to run the script on each server and then upload all of the results to the calculator.
As part of the calculator, I’ve provided a PowerShell script that captures the correct performance counters for the correct period of time. By default, the script samples server performance at one-second intervals for one hour. You don’t have to use the script as provided, or at all; feel free to modify it as needed or roll your own. The only requirement for uploading results to the calculator is a CSV file with these headers:
- Processor – % Processor Time
- Logical Disk – Disk Reads/sec
- Logical Disk – Disk Writes/sec
- Database – Log Bytes Flushed/sec
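If you’d rather capture these counters yourself, the collection boils down to a single `Get-Counter` pipeline. Below is a minimal sketch, not the script from the calculator’s page; the counter paths and the output path are my assumptions based on the headers above, and note that a named SQL Server instance exposes its counters under `MSSQL$<InstanceName>` rather than `SQLServer`:

```powershell
# Counter paths assumed from the CSV headers above. For a named SQL Server
# instance, replace SQLServer:Databases with MSSQL$<InstanceName>:Databases.
$counters = @(
    '\Processor(_Total)\% Processor Time',
    '\LogicalDisk(_Total)\Disk Reads/sec',
    '\LogicalDisk(_Total)\Disk Writes/sec',
    '\SQLServer:Databases(_Total)\Log Bytes Flushed/sec'
)

# Sample once per second for one hour (3600 samples), matching the defaults
# described above, then write the samples to a CSV file for upload.
Get-Counter -Counter $counters -SampleInterval 1 -MaxSamples 3600 |
    Export-Counter -Path 'C:\sql-perfmon-log.csv' -FileFormat CSV
```

Adjust `-SampleInterval` and `-MaxSamples` if you want a longer or shorter capture window; an hour that covers your peak load will give the calculator the most representative data.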
Note: Make sure you run the PowerShell script as an administrator. If you don’t, you may see errors about missing counters.
Once you have the performance CSV file(s), you’re ready to calculate the results. First, enter the number of cores for your SQL Server; the calculator uses the core count to normalize your results against the servers running in Azure. After specifying the number of cores, click the browse button to upload your CSV file. If you’re running the calculator for multiple databases, use the add and remove icons to add files to or remove files from the table. Once you have entered the number of cores for each server and uploaded all of your files, click the calculate button.
After clicking the calculate button, you’ll see several charts that analyze your database resource consumption. The charts depict the percentage of time (based on your measurements) that your database’s resource consumption fits within the limits of each Service Tier and Performance Level. You can review CPU, IOPS, and Log individually as well as collectively to better understand which metrics affect the performance of your database.
In this post, I’ve given a little more background regarding database throughput units, Azure SQL Database, and Elastic Databases. I’ve also provided some instructions for using the DTU Calculator for either a single database or a pool of databases. In my next post, I’ll cover analyzing the results.