Introducing the Azure SQL Database DTU Calculator

The following is the first in a series of posts about the Azure SQL Database DTU Calculator.  This post describes the need for the calculator and the process of creating it.

As part of my job at Microsoft, I get the opportunity to work closely with companies across the US. My particular area of focus is Azure, so I spend most of my days discussing a company’s business model, architecture, IT costs, and other technical issues to help them identify areas where Azure can help, as well as provide them with a roadmap for migration. During one such meeting about six months ago, I was talking to a company about Azure SQL Database as a potential option for migrating their Rackspace-hosted SQL Server to Azure. If you’re familiar with Azure SQL Database, then you’re probably familiar with the term database throughput unit (DTU). If not, a DTU is a measure the Azure SQL team created to describe the relative capacity of a performance level as a blended measure of CPU, memory, reads, and writes. The article Azure SQL Database Service Tiers and Performance Levels contains a more thorough explanation. Anyway, during that discussion one of the company’s architects asked, “How do I know how many DTUs my database needs?”

That seems like a reasonable question, and at the time I didn’t have a great answer. I’d read the guidance in Azure SQL Database introduces new near real-time performance metrics, so I was familiar with the recommended process: start with feature capability, pick the lowest performance level, migrate the database, and then adjust based on utilization. However, I could see the concern this caused for the engineers. Migrating a database from location A to location B isn’t a trivial task, and doing so with any level of ambiguity isn’t something any engineer would recommend. It was at that point I reached out to a friend on the Azure SQL Database team for help. The conversation went something like this (names have been changed to protect the innocent):

Justin: 

Hey John.  Quick question… Do you have anything to do with or know anything about SQL Database DTUs?

John: 

You bet.

Justin: 

Cool. In your opinion, would it be possible to write a DTU calculator that a developer could run against their on-premises database to see which tier of SQL Database it would map to?

John: 

We’ve discussed doing such a thing, and I think it’s a good idea, just haven’t had the time to do it yet.  Feel free to send me email, and I’ll get the right conversation going.

A calculator is born…

One of my main objectives when developing the calculator was to keep it simple – something engineers at any level of expertise could easily use. With that in mind, I spent the last several months working with the Azure SQL team to understand the complexities of converting SQL Server performance to database throughput units, and developing a website that wraps those complexities in a simple, intuitive interface. The site analyzes your production database workload and calculates the service tier and performance level you’d need in Azure SQL Database – all before migration. I’m happy to announce that the Azure SQL Database DTU Calculator does just that! The calculator is available now, and I’d love to hear your feedback and results. As I mentioned earlier, this post is just an introduction. Look for my next post to go into more detail about using the calculator and interpreting the results.


150 thoughts on “Introducing the Azure SQL Database DTU Calculator”

  1. Hi Justin. Great job on getting this calculator going. Just checking to see when you’re going to post about details on using the calculator and interpreting the results. Thank you for all your hard work with this.

  2. Hi Justin,

    I tried running the script on a Windows Server 2008 with SQL 2008 and I get the following error. What can I do to resolve it?

    Export-Counter : This cmdlet can be run only on Microsoft Windows 7 and above
    At C:\sql-perfmon.ps1:56 char:19

    Thank you.

  3. Hi Justin – cool util… is it correct that when it prompts for SqlInstance I should put in _total? If I put in the actual instance name, it doesn’t match any of the performance counters for % Processor Time on the server.

    1. _total will work but it will capture CPU for everything running on the server. That may not be a problem if the other processes are using little to no CPU. When developing the tool and running tests, I used ‘sqlserver’ for the SqlInstance to limit what the counter collected.

      1. Thanks for getting back to me… I’ve tried ‘sqlserver’, ‘mssqlserver’ and other variations and nothing seems to pick up/filter the results as you suggest… ??

  4. Looks like you used the wrong performance counter – “\Processor($SqlInstance)\% Processor Time” instead of “\Process($SqlInstance)\% Processor Time” – changed that and it worked. Does this seem right?

    1. Unfortunately, no. The process object is different from the processor object. The process object’s maximum is the # of CPUs x 100, whereas the processor object uses 100% as the maximum, which is what the DTU Calculator assumes. Here’s a link which explains processor vs. process objects:
      http://bit.ly/1L47g7z
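To make the arithmetic above concrete – a sketch only, in Python, assuming the calculator expects Processor-style values in the 0–100 range; the helper name is hypothetical:

```python
# Hypothetical helper: a \Process(...)\% Processor Time sample can reach
# (number of logical CPUs) x 100, while \Processor(_Total)\% Processor Time
# tops out at 100, which is what the DTU Calculator assumes.

def process_to_processor_pct(process_pct, logical_cpus):
    """Scale a Process-object CPU sample down to the 0-100 Processor-style range."""
    if logical_cpus < 1:
        raise ValueError("logical_cpus must be >= 1")
    return process_pct / logical_cpus

# A process burning 2 full cores on a 4-core box reports 200,
# i.e. 50% of total machine CPU:
print(process_to_processor_pct(200, 4))  # 50.0
```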

  5. Hi Justin,
    I got the error:
    Get-Counter : The \Processor(SRUDB)\% Processor Time performance counter path is not valid.
    At C:\SQLPerformance.ps1:55 char:1
    + Get-Counter -Counter $counters -SampleInterval 1 -MaxSamples 3600 |
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : InvalidResult: (:) [Get-Counter], Exception
    + FullyQualifiedErrorId : CounterPathIsInvalid,Microsoft.PowerShell.Commands.GetCounterCommand

    I’m using an Azure VM (DS3) with the SQL Server 2014 Standard image. For the drive input value I tried F: (the disk where I save the data files), and also tried C: and D:, but I still get the same error.

  6. Hi Justin,

    I am also getting the \Processor(sqlserver)\% Processor Time performance counter path is not valid error, and if I use _Total I then get: Get-Counter : The \Database(svchost)\Log Bytes Write/sec performance counter path is not valid.

    I am running this on Windows Server 2008 R2 SP1 with SQL Server 2008 R2

      1. Hi Justin,

        Still getting Get-Counter : The \Database(svchost)\Log Bytes Write/sec performance counter path is not valid.

  7. Hi Justin,

    I got the same error as Luke. (SQL server 2008 r2). Can it be because my database files are on another logical drive?

    1. I located the counter in perfmon but it seems inactive. Isn’t there an equivalent counter on the sqlserver object? Because those counters do work.

      1. Yes, there is an equivalent counter, and you are welcome to use those counters instead. The script is provided as a simple way to get started, but you can customize it as needed. The important part is that you capture the correct columns with headers that match what I’ve outlined in the blog post.

  8. After gathering several hours of performance logs on several databases from Windows SQL servers, I went into the Elastic Databases section and added two log files.
    When I click Calculate, I receive an error on the page: “Error! An error occurred while processing your request.”
    If I select only one log file and click Calculate, it works.
    If I test each file individually in the Elastic Databases section, it also works.
    It looks like the issue is related to having more than one file.

  9. Hi Justin! Thanks a lot for putting up this page and the powershell script. I ran into an issue and I’m posting it here with a simple fix. The script assumes the default SQL Server instance. If anyone runs the script on a named instance they might get an error like I did
    “get-counter the specified object was not found on the computer”.
    All you need to do is change the SQLServer string in the script’s $counter variable to MSSQL`$. Note that the character before the dollar symbol is a backtick, not an apostrophe. Credit: http://sqlblog.com/blogs/aaron_bertrand/archive/2011/05/03/a-little-powershell-syntax-tip-dealing-with-sql-server-named-instances.aspx
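To illustrate the shape of the fix above – a hedged sketch in Python (the function is hypothetical, not part of the tool) showing how the perfmon object prefix differs between a default and a named instance:

```python
# For a default instance the SQL Server perfmon objects are prefixed
# "SQLServer:"; for a named instance they become "MSSQL$<InstanceName>:".
# (Inside the PowerShell script itself the $ must be escaped as `$.)

def log_flush_counter_path(instance_name=None):
    """Build the Log Bytes Flushed counter path for a default or named instance."""
    prefix = "SQLServer" if instance_name is None else "MSSQL$" + instance_name
    return "\\" + prefix + ":Databases(_Total)\\Log Bytes Flushed/sec"

print(log_flush_counter_path())
# \SQLServer:Databases(_Total)\Log Bytes Flushed/sec
print(log_flush_counter_path("MYINSTANCE"))
# \MSSQL$MYINSTANCE:Databases(_Total)\Log Bytes Flushed/sec
```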

  10. Is there any way to email the results report? Right now the only option seems to be taking a screenshot of the results.

  11. Hi Justin, thanks for the script. I see you are using LogicalDisk (C:) Reads/Writes (i.e. “\LogicalDisk(C:)\Disk Reads/sec”) …is this universal or I need to customize it depending where are the data/log SQL files located?

  12. Hello,
    I obtained these results after running the PowerShell utility on my 6-core server with SQL Server 2012.

    On the “Azure SQL Database” calculator tab:

    Based on your database utilization, we recommend you migrate your SQL Server workload to Standard – S2. This Service Tier/Performance Level should cover approximately 91.55 % of your utilization.
    NOTE: There is approximately 8.45 % of your workload that falls into a higher Service Tier/Performance Level.

    My question is: if I choose an S2, what happens during a workload peak to the 8.45% that the S2 can’t serve? Is there an automatic on-demand upgrade to P1, P2, or P4 for the time needed, followed by an automatic downgrade back to S2?
    Or will I always have an S2 that queues the queries and serves them serially if the workload is too heavy?

    On “Elastic Database” calculator tab:
    I load the same result csv on Elastic Database calculator tab and I have this result.

    Based on your database utilization, we recommend you migrate your SQL Server workload to the Standard elastic database pool. This Service Tier should cover approximately 95.72 % of your utilization and require 398 total eDTUs for the pool.

    NOTE: There is approximately 4.28 % of your workload that falls into a higher Service Tier.

    After all, I understand I need an option capable of serving 398 eDTUs!
    That would be an Elastic Premium 500 pool, or P4 for a single database. However, the calculator recommends a “Standard” elastic database pool, or S2, which is only 100 DTUs.

    I would appreciate some help with these results.
    Thank you.

    1. Regarding your first question, please read the following about Azure SQL DB resource limits:
      https://azure.microsoft.com/en-us/documentation/articles/sql-database-resource-limits/

      Regarding your second question, the recommendation is to use Standard elastic database pool with ~398 eDTUs – not DTUs. Pools are not the same as individual performance levels in a single Azure SQL DB.

      If you follow the recommendation you’d pick a Standard Elastic Pool with 400 eDTUs which costs ~$900/month retail.
      https://azure.microsoft.com/en-us/pricing/details/sql-database/

      1. Justin, I love this tool but I’m running into a similar issue and I don’t really understand your reply.

        Sure we can buy a Standard Elastic Pool and allocate 400 eDTUs to it, but what about the limitation of the Standard pool that specifies a 100 eDTU maximum per database?

        Following the recommendation, wouldn’t we be paying for 400 eDTUs but only able to use 100? From my understanding we’d need a Premium Elastic Pool in order to allocate 400 eDTUs to a single database.

        Can you elaborate further?

        Kind regards,
        Sean

      2. Correct. If a single database in your pool requires more than 100 eDTUs then you’ll need Premium for that database. The maximum eDTU recommendation from the DTU Calculator indicates maximum eDTUs for the pool. Databases within the pool are under the single database eDTU constraint and all databases will share eDTUs up to the maximum pool size.

  13. I ran the cmdlet on my local laptop (2 cores), executed 4 simple selects against a single table, uploaded the file, and it calculated that I needed a Premium P2. Huh? How is this calculated?
    “(PDH-CSV 4.0) (Central Standard Time)(360)”,”\\tx60lp519681\processor(_total)\% processor time”,”\\tx60lp519681\logicaldisk(c:)\disk reads/sec”,”\\tx60lp519681\logicaldisk(c:)\disk writes/sec”,”\\tx60lp519681\logicaldisk(c:)\disk read bytes/sec”,”\\tx60lp519681\logicaldisk(c:)\disk write bytes/sec”,”\\tx60lp519681\sqlserver:databases(mdscommondb)\log bytes flushed/sec”
    “02/16/2016 19:54:53.028″,” “,” “,” “,” “,” “,” ”
    “02/16/2016 19:54:54.077″,”27.055663114874907″,”4.7638609867434019″,”12.386038565532845″,”124881.75745088623″,”78051.098406803896″,”0”
    “02/16/2016 19:54:55.117″,”28.603105172706535″,”0″,”9.6190525633798671″,”0″,”110318.99003889102″,”0”
    “02/16/2016 19:54:56.153″,”28.017186485758462″,”0″,”3.8594018124018108″,”0″,”24700.171599371592″,”0”
    “02/16/2016 19:54:57.189″,”29.859825358748438″,”0″,”1.9307824381190164″,”0″,”7908.4848665354912″,”0”
    “02/16/2016 19:54:58.225″,”34.384326476078122″,”0.96535077652280576″,”2.8960523295684171″,”7908.1535612748248″,”39540.767806374126″,”0”
    “02/16/2016 19:54:59.266″,”33.180929393990851″,”0″,”2.8830085117597419″,”0″,”39362.676213893013″,”0”
    “02/16/2016 19:55:00.305″,”41.719372211553733″,”4.8127794977640175″,”3.8502235982112141″,”120250.18341933264″,”102508.35307877536″,”0”
    “02/16/2016 19:55:01.340″,”37.021503448620749″,”135.16167033053696″,”1.9308810047219565″,”8094747.4773316504″,”11863.3328930117″,”0”
    “02/16/2016 19:55:02.344″,”54.464913575922139″,”0.99635168307276867″,”1.9927033661455373″,”16324.225975464242″,”8162.1129877321209″,”0”
    “02/16/2016 19:55:03.381″,”28.774307239513387″,”0″,”2.8940137120318523″,”0″,”43464.227269769042″,”0”
    “02/16/2016 19:55:04.385″,”26.883903735957482″,”0″,”1.9913734819717466″,”0″,”8156.665782156274″,”0”
    “02/16/2016 19:55:05.426″,”32.45285317226859″,”0″,”1.9213437839273246″,”0″,”11804.736208449483″,”0”
    “02/16/2016 19:55:06.462″,”38.930615915227953″,”121.59164993317161″,”2.8950392841231336″,”2512430.89233342″,”43479.62999515063″,”0”
    “02/16/2016 19:55:07.466″,”28.817344802320889″,”15.932775477284306″,”1.9915969346605382″,”176407.69008449183″,”12236.371566554346″,”0”
    “02/16/2016 19:55:08.502″,”48.327477215210649″,”325.3996183024671″,”2.8967325071436245″,”12774667.602703573″,”39550.054497534285″,”0”
    “02/16/2016 19:55:09.538″,”62.284479637336254″,”73.363120782582342″,”14.479563312351779″,”4159488.1650944878″,”126524.35483177149″,”0”
    “02/16/2016 19:55:10.548″,”70.224886443111671″,”156.44258393073659″,”3.9605717450819391″,”10882763.987414271″,”93279.385740169833″,”0”
    “02/16/2016 19:55:11.589″,”58.730730883018055″,”305.41769366798394″,”21.129525977030337″,”19744412.556328077″,”449451.90965540602″,”0”
    “02/16/2016 19:55:12.593″,”67.713426571922383″,”943.06532654374519″,”14.937676766796386″,”61714991.64511691″,”126448.42967604919″,”0”
    “02/16/2016 19:55:13.598″,”73.160873820621674″,”231.01668749198851″,”11.949139008206302″,”15029786.366375329″,”174361.83640774636″,”0”
    “02/16/2016 19:55:14.634″,”65.716173866950299″,”75.228133817009279″,”6.7512427784495506″,”4321782.9885798739″,”185670.74996641141″,”0”
    “02/16/2016 19:55:15.639″,”45.577705284378688″,”0″,”13.927376377024951″,”0″,”110018.31487771025″,”0”
    “02/16/2016 19:55:16.677″,”45.060246001012075″,”1.9272109319242647″,”4.8180273298106622″,”67097.775805875208″,”51310.063851551626″,”0”
    “02/16/2016 19:55:17.686″,”27.598358944756118″,”0″,”2.9735607534097288″,”0″,”64958.425845153331″,”0”
    “02/16/2016 19:55:18.691″,”27.272394189508063″,”0″,”3.9825531914893615″,”0″,”32625.075744680849″,”0”
    “02/16/2016 19:55:19.727″,”43.436975698473411″,”15.445128508773012″,”3.8612821271932529″,”96871.846007024331″,”15815.811592983564″,”0”
    “02/16/2016 19:55:20.763″,”49.142819393371276″,”0″,”15.430435189346859″,”0″,”223679.58850477208″,”0”
    “02/16/2016 19:55:21.768″,”54.102914421993198″,”0″,”3.9831678332470877″,”0″,”36708.874751205163″,”0”
    “02/16/2016 19:55:22.808″,”45.152170779105816″,”0″,”3.8456043266409141″,”0″,”102385.3695924877″,”0”
    “02/16/2016 19:55:23.853″,”41.698926992370701″,”0.95702844435199785″,”11.484341332223973″,”3919.9885080657832″,”133279.60927423663″,”0”
    “02/16/2016 19:55:24.889″,”32.532779869990151″,”0″,”4.8243031890656587″,”0″,”142274.49020937315″,”0”
    “02/16/2016 19:55:25.926″,”29.14860428418552″,”0″,”2.8944179471229488″,”0″,”15807.381215220797″,”0”

    1. I ran your results through the calculator. It looks like you have high CPU which is pushing you up to Premium P2. Since you’re running on your laptop, there may be other CPU intensive processes running on your machine that will skew the results.

  14. Hi Justin, I’m trying to use your powershell performance utility and get the following error

    Get-Counter : Internal performance counter API call failed. Error: c0000bb8.
    At C:\Users\administrator.ALL-PIPE\Desktop\sql-perfmon.ps1:47 char:1
    + Get-Counter -Counter $counters -SampleInterval 1 -MaxSamples 3600 |
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : InvalidResult: (:) [Get-Counter], Exception
    + FullyQualifiedErrorId : CounterApiError,Microsoft.PowerShell.Commands.GetCounterCommand

    I can’t seem to shed any light from the error code.

    Cheers,
    Shaun

      1. Thanks for the reply Justin. I’m definitely running as an admin.

        The server is running on a VM using Windows Server 2012 R2 Standard (Build 9600).

        Using Windows PowerShell V 4.0

        I saw a few remedies of restarting the server, etc. and tried these with no success 😦

        Cheers,
        Shaun

      2. You need to run the PowerShell script as admin. Just being an admin on the machine doesn’t work. The easiest way to do this is run a PowerShell cmd window as admin and then run the script from that window.

  15. Hi Justin, thanks for this tool.

    I currently have a VM with 24 virtual cores (the host has 2 CPUs and is running two VMs). On it are 20 databases. I want to see if I can move these to an elastic database pool.

    In order to do this, do I need to run the script for each database and then upload the CSVs? Can this happen concurrently, or does it need to be done one database at a time?

  16. Hi Justin, I have been doing this but the same error occurs.

    Any other recommendations?

    Again thanks for the help,
    Shaun

  17. Hi Justin

    Is it possible to use the script on a Server 2003 machine? I’ve checked the PowerShell version, but I always receive the message that it needs to be run on Windows 7 or above.
    I also tried to generate the CSV with the counters myself, but when I upload it to the site I receive an unknown error.

    Name Value
    —- —–
    CLRVersion 2.0.50727.3662
    BuildVersion 6.0.6002.18111
    PSVersion 2.0
    WSManStackVersion 2.0
    PSCompatibleVersions {1.0, 2.0}
    SerializationVersion 1.1.0.1
    PSRemotingProtocolVersion 2.1

    I need to make an offer to move an existing SAP running under SBS2003 to Windows Azure.

    Thanks a lot for your Help.

    Greetings
    Dominik

    1. The calculator works fine with 2003 but the PowerShell script may need to be modified or you may need to install PowerShell 2.0+. Apologies but I don’t have access to a machine running Server 2003 to test it for you.

  18. Hi Justin

    I was now able to generate a CSV with your defined counters. First I created one on my Server 2008 R2 machine with the PowerShell script, to see how I must structure the CSV on the old 2003 server. Because that server is installed in German, I also needed to change the headers to English, matching the CSV generated on my English Server 2008 R2 installation.
    My question now: can I send you the file for analysis? I’m not sure about the result from the Azure SQL Database DTU Calculator.

    “Based on your database utilization, we recommend you migrate your SQL Server workload to Standard – S2. This Service Tier/Performance Level should cover approximately 99.12 % of your utilization.

    NOTE: There is approximately 0.88 % of your workload that falls into a higher Service Tier/Performance Level. After migrating your database to Azure, you should evaluate your database’s performance using the guidance mentioned in the  information section above”

    Thanks and Greetings
    Dominik


  19. Justin,

    Have you ever figured out a solution for the “Export -Counter : This cmdlet can be run only on Microsoft Windows 7 and above” error? I’d like to run the DTU Calculator on my Windows 2003 Server but cannot get around this error message…

    Thanks – SV

    1. The calculator works fine with 2003 but the PowerShell script may need to be modified or you may need to install PowerShell 2.0+. Apologies but I don’t have access to a machine running Server 2003 to test it for you.

  20. Hello,

    I have an AlwaysOn setup running SQL Server 2012 on Windows Server 2008. I have upgraded PowerShell to v4 but am getting the following. What am I missing?

    Get-Counter : Internal performance counter API call failed. Error: c0000bb8.
    At C:\Users\Administrator\Downloads\sql-perfmon\sql-perfmon.ps1:47 char:1
    + Get-Counter -Counter $counters -SampleInterval 1 -MaxSamples 3600 |
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : InvalidResult: (:) [Get-Counter], Exception
    + FullyQualifiedErrorId : CounterApiError,Microsoft.PowerShell.Commands.GetCounterCommand

    1. Try opening PowerShell ISE. Before opening ISE, select ‘run as administrator’ regardless of you being an admin on the machine. Once you have PowerShell ISE open, browse to the DTU Calculator script, and run it from ISE.

  21. I’ve gathered stats over a 5 day period and the file is 17MB. I keep getting script time outs (which, I assume are related to the charts). Is there another way to generate the data?

    I’m actually running it for 7 days, but I wanted a sneak peek.

      1. I was trying to include all tasks that run on the server, including nightly batch jobs.

  22. Unfortunately we still have SQL Server 2005 running on Windows 2003, and we are going to move into Azure, but your PowerShell script requires Windows 7 / Server 2008 R2 or higher. Is there a lower-level PowerShell script that can do the same thing for us?

  23. Hi Justin – is there a way to find out what is currently spiking my DTUs? I have been seeing DTU spikes for the last two days and have no idea how to methodically identify the culprit query. /Thanks Shiju

  24. Good afternoon! The script gives an error message that the database is not found, although the database is on the server. Please tell me what to try. I am sorry for my English; I am using Google Translate.
    I run as administrator.

  25. Hi Justin,

    I have followed the instructions at http://dtucalculator.azurewebsites.net/ but when I run the script, the following error is displayed.

    Collecting counters…
    Press Ctrl+C to exit.
    Get-Counter : The \SQLServer:Databases(_Total)\Log Bytes Flushed/sec performance counter path is not valid.
    At C:\Users\Administrator\desktop\sql-perfmon.ps1:41 char:1
    + Get-Counter -Counter $counters -SampleInterval 1 -MaxSamples 3600 |
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : InvalidResult: (:) [Get-Counter], Exception
    + FullyQualifiedErrorId : CounterPathIsInvalid,Microsoft.PowerShell.Commands.GetCounterCommand

    Are you able to provide me with details on how this can be resolved? Thanks!

    Regards,
    Eric

    1. Try opening PowerShell ISE. Before opening ISE, select ‘run as administrator’ regardless of you being an admin on the machine. Once you have PowerShell ISE open, browse to the DTU Calculator script, and run it from ISE.

      1. I’ve got a similar problem that’s become a show stopper. I’ve had to change that counter format to:
        “\MSSQL$BISGSQL66:Databases(*)\Log Bytes Flushed/sec”
        which I found when running (get-counter -listset *).paths. I still get the same error. If I omit that counter from the list, the script appears to run fine.

        Your thoughts?

  26. Hello! I’m trying to run the script (right click) but get the following error: Get-Counter : The specified object was not found on the computer. At C:\Users\administrateur.XXX\Downloads\sql-perfmon\sql-perfmon.ps1:41 char:12
    + Get-Counter <<<< -Counter $counters -SampleInterval 1 -MaxSamples 3600 + CategoryInfo : InvalidResult: [Get-Counter], Exception + FullyQualifiedErrorId : CounterApiError, Microsoft.PowerShell.Commands.GetCounterCommand
    Sorry – this is literally translated from a French screenshot of the error (apparently run in administrator mode).

    1. Try opening PowerShell ISE. Before opening ISE, select ‘run as administrator’ regardless of you being an admin on the machine. Once you have PowerShell ISE open, browse to the DTU Calculator script, and run it from ISE.

  27. Hello Justin,

    Thanks for taking the time to create this.

    Like others have mentioned, I am having a problem when running the script, where it throws the error below (a direct copy and paste from the shell).

    Collecting counters…
    Press Ctrl+C to exit.
    Get-Counter : Internal performance counter API call failed. Error: c0000bb8.
    At C:\sql-perfmon\sql-perfmon.ps1:43 char:1
    + Get-Counter -Counter $counters -SampleInterval 1 -MaxSamples 3600 | Export-Count …
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : InvalidResult: (:) [Get-Counter], Exception
    + FullyQualifiedErrorId : CounterApiError,Microsoft.PowerShell.Commands.GetCounterCommand

    I can assure you that the PowerShell console is being run in an administrator’s context.

    The server OS is 2012 R2, and it is running SQL 2012.

    Would greatly appreciate any feedback on what I can do to fix this. Please let me know if there is any other info I can provide that may be relevant.

    Thanks,

    Ben

  28. Hi,

    I believe I have rectified my API error; I think it was having a hard time due to my SQL server having multiple instances.

    I changed the line “\SQLServer:Databases(_Total)\Log Bytes Flushed/sec”) to “\*:*(_Total)\Log Bytes Flushed/sec”)

  29. Hi Justin,

    I’ve tried running this script in all the ways mentioned – using PowerShell ISE, using run as administrator, etc. – but I still get the ‘CounterApiError’.

    Do you have any suggestions on what I could do to make it work?

    Thanks,

  30. Hi Justin,

    I ran your script as administrator but I got this error.

    Collecting counters…
    Press Ctrl+C to exit.
    Get-Counter : The specified object was not found on the computer.
    At C:\Users\c3dev\Downloads\sql-perfmon\sql-perfmon.ps1:43 char:1
    + Get-Counter -Counter $counters -SampleInterval 1 -MaxSamples 3600 |
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : InvalidResult: (:) [Get-Counter], Exception
    + FullyQualifiedErrorId : CounterApiError,Microsoft.PowerShell.Commands.GetCounterCommand

    Can you help me with this?

  31. Thanks for providing this tool.

    I have a couple of questions I hope you can help with.

    1) The script and the instructions seem to imply that this should be run for a specific DB, but it currently captures counters for all DBs, and all the counters except log bytes flushed record metrics for the server rather than a specific DB. Is this correct?

    (The instructions state “you’ll need to capture several performance metrics for each database on your SQL server(s)”, and the script contains a comment mentioning a parameter for the database name, but this parameter is never used – “.PARAMETER DatabaseName The name of the SQL Server database to monitor.”)

    2) The instructions indicate that the number of cores and the number of databases should be entered when uploading the results, but there’s only a field for cores:

    “enter the number of cores for your server, the number of databases on that server”

    3) The script linked to for both single and elastic calculations is the same, is this correct? (I would have thought you’d want to target a single DB for the former and multiple for the latter).

    1. 1) That’s correct. The calculator assumes a single DB per server, but since it collects at the server level, calculations will be based on all DBs on the server, so you can think of it as a “pool”.
      2) Cores only. If you have databases on different servers with different files that you collected, you can use the “Pools” tab to enter multiple DBs.
      3) It’s very difficult to target a single DB. If you are familiar enough with SQL Server and DMVs, you can isolate a single DB and perform the same collection. The calculator will work in both scenarios.

  32. Hi,

    Whatever I do, I always get the error below. I tried all the above solutions but none of them helped. Server 2008 R2 with SQL Server 2005 installed:
    Get-Counter : Internal performance counter API call failed. Error: c0000bb8.
    At C:\Users\administrator\Desktop\sql-perfmon\sql-perfmon.ps1:43 char:1
    + Get-Counter -Counter $counters -SampleInterval 1 -MaxSamples 3600 |
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : InvalidResult: (:) [Get-Counter], Exception
    + FullyQualifiedErrorId : CounterApiError,Microsoft.PowerShell.Commands.GetCounterCommand

  33. Hi, I’m getting the “Get-Counter : Internal performance counter API call failed. Error: c0000bb8” error that you have previously suggested is caused by not running with admin rights. I can 100% confirm I’ve tested running as elevated admin via the ISE window and the PowerShell window directly, and am still seeing the message 😦 It’s running on Server 2012 Standard x64 with SQL 2012 Standard – any ideas? 🙂

    Kind Regards

    Joe

  34. Hi Justin,

    I have a customer running the script, they are running PowerShell as Admin but they still get the following error:

    Collecting counters…
    Press Ctrl+C to exit.
    Internal performance counter API call failed. Error: c0000bb8.
    At D:\Users\Administrator\Desktop\sql-perfmon.ps1:43 char:12
    + Get-Counter <<<< -Counter $counters -SampleInterval 1 -MaxSamples 3600 |
    + CategoryInfo : InvalidResult: (:) [Get-Counter], Exception
    + FullyQualifiedErrorId : CounterApiError,Microsoft.PowerShell.Commands.GetCounterCommand

    Any thoughts?

  35. Justin, one more question please: the old server has 2 CPUs with 4 cores each (8 total) with hyperthreading. At the same time, the server-wide max degree of parallelism setting is set to 1. Do you think this may affect the estimation? Also, what value should I enter on the DTU Calculator website for the number of cores – 8, or 16 counting HT?
    Thank you

  36. Is it possible to build a SqlDtuPerfmon.exe that hard-codes the CultureInfo used for output?

    I live in Sweden, where we use a comma as the decimal separator, which corrupts the .csv file since the field separator is also a comma.

    Your page cannot recommend a tier when I upload the csv file, since there are a lot of peaks of 1000000 DTUs :-).
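Until the tool pins the culture, one workaround is to rewrite the log before uploading – a sketch only, assuming the counter log is fully quoted CSV (as perfmon’s CSV output is), so decimal commas inside quoted fields can safely become periods; the function and file names are examples:

```python
import csv

def fix_decimal_commas(src_path, dst_path):
    """Rewrite a comma-decimal counter log so values use '.' as the decimal mark.
    Relies on every field being quoted, so column-separator commas stay intact."""
    with open(src_path, newline="", encoding="utf-8") as src, \
         open(dst_path, "w", newline="", encoding="utf-8") as dst:
        writer = csv.writer(dst, quoting=csv.QUOTE_ALL)
        for row in csv.reader(src):
            # Swap commas inside each quoted field; plain numbers and
            # comma-millisecond timestamps both end up period-formatted.
            writer.writerow([cell.replace(",", ".") for cell in row])
```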

  37. Is there a version for SQL Server 2016 on a Windows 2012 server? I get the following error. I then don’t find the Database – Log Bytes Flushed/sec counter in Perfmon as the SQL 2016 counters are different. Can I use a different counter or do I have to wait for a new version to work with SQL 2016?

    Get-Counter : The specified object was not found on the computer.
    At L:\sql-perfmon.ps1:43 char:1
    + Get-Counter -Counter $counters -SampleInterval 1 -MaxSamples 3600 |
    + ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : InvalidResult: (:) [Get-Counter], Exception
    + FullyQualifiedErrorId : CounterApiError,Microsoft.PowerShell.Commands.GetCounterCommand

      1. Thank you. I found the counter and had to change it to MSSQL$instance with my instance name, but it still gave the same message. The script does run without that counter, but won’t omitting it affect the results when I upload to the calculator? I also tried the command-line utility and I get the following message.

        The following feature couldn’t be installed:
        .NET Framework 3.5 (includes .NET 2.0 and 3.0)

  38. When I try to calculate DTUs using the API with a core count of more than 30, I get the error: “Message”: “An error has occurred.”
    Justin, could you please describe why this happens?
