Identifying performance issues using SQL Server Profiler
Last updated: May 22nd '03
How to identify SQL Server performance issues by analyzing Profiler output?
(See also: Automating server side tracing in SQL Server)

It is always better to be proactive than reactive when it comes to identifying and eliminating SQL Server performance issues. In this article, I am going to explain the method I follow to identify performance issues in my database applications before they go into the production environment.

Once the application is completely built and tested, I conduct a "load test" or "stress test". There are several ways (and tools) with which one can stress test a database application. I use Mercury LoadRunner, though it is a very expensive tool to work with. Cheaper alternatives include tools like "Database Hammer" from the SQL Server 2000 Resource Kit. The idea behind stress testing is to simulate real production loads and user connections, and see how the application and database behave under extreme stress. This enables us to fix potential performance, blocking and deadlocking issues before we put the application into production. So, it is very important to have a separate load testing environment that mirrors your production environment in terms of hardware and software resources (processing power, available memory, disk layouts and RAID arrays, versions and editions of software), network bandwidth etc. Without this, your load test may not be a valid one.

In this article, I will explain how to use Profiler to identify SQL Server specific performance issues. At the end of this article, you'll find books related to SQL Server performance tuning, and links to load testing tools.

The idea here is to run Profiler during the load test and analyze the Profiler output at the end of it, to identify the longest running queries, the most CPU intensive queries, stored procedure recompilations, errors such as deadlocks, etc. For this purpose, I use a specific trace template that captures all the data we need for this analysis. (This trace template is only for use with SQL Server 2000 and is not compatible with SQL Server 7.0.)

Click here to download the trace definition file LoadTest.tdf

This trace template captures the following event classes:

Database:
  Data File Auto Grow
  Log File Auto Grow

Errors and Warnings:
  ErrorLog
  EventLog
  Exception

Locks:
  Lock:Deadlock
  Lock:Deadlock Chain

Stored Procedures:
  RPC:Completed
  RPC:Starting
  SP:CacheHit
  SP:CacheInsert
  SP:CacheMiss
  SP:CacheRemove
  SP:Completed
  SP:ExecContextHit
  SP:Recompile
  SP:Starting
  SP:StmtCompleted

T-SQL:
  SQL:BatchCompleted
  SQL:BatchStarting
  SQL:StmtCompleted


This trace template also captures the following data columns:

EventClass
ApplicationName
CPU
SPID
DatabaseName
Duration
EndTime
Error
EventSubClass
HostName
IntegerData
LoginName
NTUserName
ObjectName
StartTime
TextData

Download the trace template LoadTest.tdf from the above link and save it to a location on your PC, say C:\LoadTest. Follow these steps to create and run a new trace based on this definition file.

  1. Open Profiler.
  2. From the File menu, select New > Trace...
  3. In the 'Connect to SQL Server' dialog box, connect to the SQL Server that you are going to trace by providing the server name and login information (make sure you connect as a sysadmin).
  4. In the General tab of the 'Trace Properties' dialog box, click the folder icon next to the 'Template file name:' text box and select the trace template file that you've just downloaded.
  5. Now we need to save the Profiler output to a table for later analysis, so check the 'Save to table:' check box. In the 'Connect to SQL Server' dialog box, specify the SQL Server (and login and password) on which you'd like to store the Profiler output. In the 'Destination Table' dialog box, select a database and table name, and click OK. It's a good idea to save the Profiler output to a different SQL Server than the one on which you are conducting the load test.
  6. Check 'Enable trace stop time:' and select a date and time at which you want the trace to stop itself. (I typically conduct the load test for about 2 hours).
  7. Click "Run" to start the trace.

It's also a good idea to run Profiler on a client machine instead of on the SQL Server itself. On the client machine, make sure you have enough space on the system drive.

As soon as Profiler starts tracing, kick off your load test process as well. At the end of the load test, you'll see that Profiler has logged a massive amount of information into the table you selected in Step 5. Now we need to analyze this information to find out which are the long running queries and stored procedures, which are the most CPU intensive stored procedures, which stored procedures are being recompiled too many times, and whether any errors are being generated due to the stress.
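Even before using the stored procedures described below, you can get a quick feel for the trace data with a simple ad-hoc query. The following is a minimal sketch; it assumes the output was saved to a table named ProfilerData (your table name may differ), and that EventClass is stored as the numeric event ID Profiler uses when saving to a table (10 = RPC:Completed, 12 = SQL:BatchCompleted):

--Top 20 statements by duration, straight from the trace table
SELECT TOP 20 TextData, Duration, CPU, StartTime, EndTime
FROM dbo.ProfilerData
WHERE EventClass IN (10, 12)   --RPC:Completed, SQL:BatchCompleted
ORDER BY Duration DESC
GO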

I wrote some stored procedures that can be used to analyze the Profiler output and generate various reports based on your requirements. I'll list some of those stored procedures here and provide you the code. Once you get the idea, you should be able to write your own stored procedures to get the information you need. You need to create these stored procedures in the same database as the Profiler output table. Note that in all these stored procedures I used dynamic SQL, so that we can reuse them against multiple 'Profiler output' tables by passing the name of the table as an input parameter. Dynamic SQL also gives more flexibility when it comes to reporting, but it has its own disadvantages and should be used with caution in production code.
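To give you an idea of the dynamic SQL approach, here is a simplified sketch (this is not one of the downloadable procedures; the procedure name and the hard-coded 'dbo' owner are illustrative only):

CREATE PROCEDURE dbo.TopByDuration
    @Table sysname,     --Name of the 'Profiler output' table
    @Top int = 150      --Number of rows to return
AS
BEGIN
    DECLARE @sql nvarchar(4000)
    --Build the statement dynamically, so the same procedure works against any 'Profiler output' table
    SET @sql = N'SELECT TOP ' + CAST(@Top AS nvarchar(10)) +
               N' TextData, Duration, CPU, StartTime, EndTime' +
               N' FROM dbo.' + QUOTENAME(@Table) +
               N' WHERE EventClass IN (10, 12)' +   --RPC:Completed, SQL:BatchCompleted
               N' ORDER BY Duration DESC'
    EXEC sp_executesql @sql
END
GO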

Note: Consider creating a clustered index on the EventClass column of the Profiler output table before you start executing the following procedures; this will improve their performance. For more information on EventClass values and their descriptions, and for programmatically extending the scope of the following stored procedures, download the Events table from here!
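For example, assuming the Profiler output was saved to a table named ProfilerData (an illustrative name), the index could be created like this:

CREATE CLUSTERED INDEX IX_ProfilerData_EventClass ON dbo.ProfilerData (EventClass)
GO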

Downloadable stored procedures and their usage information:

LongRunningProcs
This procedure identifies the top n long running stored procedures and queries.

Following is the description of the input parameters:

@Table - Name of the 'Profiler output' table. Mandatory.

@Top - Number of long running queries that'll be returned. By default, the top 150 longest running queries will be returned. Optional.

@Owner - Name of the 'Profiler output' table's owner. Default is 'dbo'. Optional.

@LoginName - Returns only those queries, executed by this login name. Optional.

@HostName - Returns only those queries executed from this workstation. Optional.

@ApplicationName - Returns only those queries executed by this application. Optional.

@NTUserName - Returns only those queries executed by this Windows account. Optional.

@StartTime - Returns only those queries executed after this date and time. Optional.

@EndTime - Returns only those queries executed before this date and time. Optional.

@MinDuration - Returns only those queries whose duration (in Seconds) is greater than or equal to @MinDuration. Optional.

@MaxDuration - Returns only those queries whose duration (in Seconds) is less than or equal to @MaxDuration. Optional.

@Debug - If 1 is passed then the dynamic SQL statement is printed before execution. If 0, no debug information will be printed. 0 is the default. Optional.

Example:

--To see the top 10 long running procedures executed by the login 'BusinessObjects'
EXEC dbo.LongRunningProcs @Table = 'ProfilerData', @Top = 10, @LoginName = 'BusinessObjects'
GO

--To see the top 150 long running procedures executed between '2002/11/24 22:40' and '2002/11/24 22:47'
EXEC dbo.LongRunningProcs @Table = 'ProfilerData', @StartTime = '2002/11/24 22:40', @EndTime = '2002/11/24 22:47'
GO
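One more example, combining some of the optional filters (the workstation name 'WEBCLIENT01' used here is purely hypothetical):

--To see queries that ran for 30 seconds or more, executed from the workstation 'WEBCLIENT01'
EXEC dbo.LongRunningProcs @Table = 'ProfilerData', @MinDuration = 30, @HostName = 'WEBCLIENT01'
GO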
CPUIntensive
This procedure identifies the top n most CPU intensive stored procedures and queries.

Following is the description of the input parameters:

@Table - Name of the 'Profiler output' table. Mandatory.

@Top - Number of most CPU intensive queries that'll be returned. By default, the top 150 most CPU intensive queries will be returned. Optional.

@Owner - Name of the 'Profiler output' table's owner. Default is 'dbo'. Optional.

@LoginName - Returns only those queries, executed by this login name. Optional.

@HostName - Returns only those queries executed from this workstation. Optional.

@ApplicationName - Returns only those queries executed by this application. Optional.

@NTUserName - Returns only those queries executed by this Windows account. Optional.

@StartTime - Returns only those queries executed after this date and time. Optional.

@EndTime - Returns only those queries executed before this date and time. Optional.

@MinDuration - Returns only those queries whose CPU utilization (in Seconds) is greater than or equal to @MinDuration. Optional.

@MaxDuration - Returns only those queries whose CPU utilization (in Seconds) is less than or equal to @MaxDuration. Optional.

@Debug - If 1 is passed then the dynamic SQL statement is printed before execution. If 0, no debug information will be printed. 0 is the default. Optional.

Example:

--To see the top 150 most CPU intensive queries executed by the application 'MTS Objects'
EXEC dbo.CPUIntensive @Table = 'ProfilerData', @ApplicationName = 'MTS Objects'
GO

--To see the top 150 CPU intensive queries executed since '2002/11/24 11:30'
EXEC dbo.CPUIntensive @Table = 'ProfilerData', @StartTime = '2002/11/24 11:30'
GO
ProcedureCacheUsage
This procedure displays the procedure cache usage information.

Following is the description of the input parameters:

@Table - Name of the 'Profiler output' table. Mandatory.

@Owner - Name of the 'Profiler output' table's owner. Default is 'dbo'. Optional.

@LoginName - Considers only those procedures, executed by this login name. Optional.

@HostName - Considers only those procedures, executed from this workstation. Optional.

@ApplicationName - Considers only those procedures, executed by this application. Optional.

@NTUserName - Considers only those procedures, executed by this Windows account. Optional.

@StartTime - Considers only those procedures, executed after this date and time. Optional.

@EndTime - Considers only those procedures, executed before this date and time. Optional.

@Debug - If 1 is passed then the dynamic SQL statement is printed before execution. If 0, no debug information will be printed. 0 is the default. Optional.

Example:

--To see the procedure cache usage statistics:
EXEC dbo.ProcedureCacheUsage 'ProfilerData'
GO
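If you prefer, a similar summary can be produced with an ad-hoc query as well. The following is a minimal sketch that counts cache hits, cache misses and recompiles per object, using the numeric IDs under which Profiler saves these event classes (38 = SP:CacheHit, 34 = SP:CacheMiss, 37 = SP:Recompile); the table name ProfilerData is illustrative:

--Procedure cache behaviour per object, straight from the trace table
SELECT ObjectName,
       SUM(CASE EventClass WHEN 38 THEN 1 ELSE 0 END) AS CacheHits,   --SP:CacheHit
       SUM(CASE EventClass WHEN 34 THEN 1 ELSE 0 END) AS CacheMisses, --SP:CacheMiss
       SUM(CASE EventClass WHEN 37 THEN 1 ELSE 0 END) AS Recompiles   --SP:Recompile
FROM dbo.ProfilerData
WHERE EventClass IN (34, 37, 38)
GROUP BY ObjectName
ORDER BY Recompiles DESC, CacheMisses DESC
GO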
ShowErrors
This procedure displays the error messages raised during the stress test.

Following is the description of the input parameters:

@Table - Name of the 'Profiler output' table. Mandatory.

@Top - Number of error messages that'll be returned. By default, the top 150 error messages will be returned. Optional.

@Owner - Name of the 'Profiler output' table's owner. Default is 'dbo'. Optional.

@LoginName - Displays only those errors, generated by this login. Optional.

@HostName - Displays only those errors, generated by this workstation. Optional.

@ApplicationName - Displays only those errors, generated by this application. Optional.

@NTUserName - Displays only those errors, generated by this Windows account. Optional.

@StartTime - Displays only those errors, generated after this date and time. Optional.

@Debug - If 1 is passed then the dynamic SQL statement is printed before execution. If 0, no debug information will be printed. 0 is the default. Optional.

Example:

--To see all the errors generated by the login 'BusinessObjects'
EXEC dbo.ShowErrors @Table = 'ProfilerData', @LoginName = 'BusinessObjects'


Once you identify the problematic queries and stored procedures using the above methods, you need to work on those individual procedures/queries to address the problems. Some of the tools that will help in fine tuning performance include the Index Tuning Wizard and the graphical execution plan option in Query Analyzer.
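For example, while fine tuning an individual query in Query Analyzer, you can switch on I/O and timing statistics to measure the effect of adding an index or rewriting the query (shown here only as a general illustration of the approach; substitute the query you are tuning):

--Measure logical reads and CPU/elapsed time for the query being tuned
SET STATISTICS IO ON
SET STATISTICS TIME ON
GO
--...run the query you are tuning here...
GO
SET STATISTICS IO OFF
SET STATISTICS TIME OFF
GO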

As promised, here are links to some of the best SQL Server performance tuning books. I have personally read some of these books and found them extremely useful and enlightening. I hope you find them useful too:

Book Title - Review available?

The SQL Server 2000 Performance Optimization and Tuning Handbook - No
Microsoft SQL Server 2000 Performance Tuning Technical Reference - Yes. Click here to read
SQL Performance Tuning - No
The Guru's Guide to Stored Procedures, XML and HTML - No
Inside Microsoft SQL Server 2000 - Yes. Click here to read
Microsoft SQL Server 2000 Resource Kit - Yes. Click here to read


Here are some links to web and application stress testing tools:

LoadRunner from Mercury Interactive

Database Hammer from SQL Server 2000 Resource Kit

A more comprehensive list of stress testing tools

A much more comprehensive list of stress testing tools can be found here