
Interpret SQL Profiler Results

  • 07-11-2012 4:33pm
    #1
    Registered Users, Registered Users 2 Posts: 94 ✭✭


    Hi all,



    I'm new enough to SQL and learning the ropes, so hopefully someone can help me a bit. Our company has a SQL Server hosting 5 databases, ranging in size from 500MB to 45GB. There are also some applications hosted on this server, and some on a separate app server.



    One of our software programs is reporting slow end-user interaction. It's server based with a web UI, connecting to a 2GB DB. Users are reporting the software to be extremely sluggish: slow opening pages, querying jobs, etc.



    I ran the SQL Profiler tool (monitoring SQL:BatchCompleted and RPC:Completed events) and got some quite unusual results.



    Queries for all the other applications had durations of 20-100, reads of 100-500 and CPU of 100-200 (broad figures I know, but they all seem relatively low).



    For some queries against the DB of the application that is running slow, the reads are up around 6 million, durations are 5-6 seconds for some, and CPU is 4-5K.



    There are numerous entries like this; I'll post some examples below. What I want to know is, why would these figures be so high? Is it the actual setup/config of the database itself, or something going wrong in the software, or is there any way of telling?

    Also, what ARE "normal" read figures? Does it vary from DB to DB?



    Any help would be appreciated. Thanks in advance (see below)





    (Duration and CPU are in milliseconds; Reads are logical reads.)

    EventClass: SQL:BatchCompleted
    TextData: Select wfo_BatchTable.*, job_Specifics.*, prd_ProductTable.Product_Name from wfo_BatchTable, job_Specifics, prd_ProductTable where wfo_BatchTable.ParentId <> 0 AND wfo_BatchTable.Programid in ( 4) and wfo_BatchTable.ProductId = prd_ProductTable.ProductID and wfo_BatchTable.JobId = job_Specifics.JobId and DateDueToStart >= '2012-10-13' and DateDueToStart < '2012-11-07' order by wfo_BatchTable.JobID
    Duration: 4793   CPU: 3750   Reads: 3110332

    EventClass: SQL:BatchCompleted
    TextData: Select wfo_BatchTable.*, job_Specifics.*, prd_ProductTable.Product_Name from wfo_BatchTable, job_Specifics, prd_ProductTable where wfo_BatchTable.ParentId = 0 AND wfo_BatchTable.Programid in ( 4) and wfo_BatchTable.ProductId = prd_ProductTable.ProductID and wfo_BatchTable.JobId = job_Specifics.JobId and DateDueToStart >= '2012-10-02' and DateDueToStart < '2012-11-13' order by wfo_BatchTable.JobID desc
    Duration: 4698   CPU: 4703   Reads: 4271979

    EventClass: SQL:BatchCompleted
    TextData: Select wfo_BatchTable.*, job_Specifics.*, prd_ProductTable.Product_Name from wfo_BatchTable, job_Specifics, prd_ProductTable where wfo_BatchTable.ParentId = 0 AND wfo_BatchTable.Programid in ( 4) and wfo_BatchTable.ProductId = prd_ProductTable.ProductID and wfo_BatchTable.JobId = job_Specifics.JobId and DateDueToStart >= '2012-10-13' and DateDueToStart < '2012-11-07' order by wfo_BatchTable.JobID desc
    Duration: 2346   CPU: 2312   Reads: 2083176

    EventClass: SQL:BatchCompleted
    TextData: Select wfo_BatchTable.*, job_Specifics.*, prd_ProductTable.Product_Name from wfo_BatchTable, job_Specifics, prd_ProductTable where wfo_BatchTable.ParentId <> 0 AND wfo_BatchTable.Programid in ( 4) and wfo_BatchTable.ProductId = prd_ProductTable.ProductID and wfo_BatchTable.JobId = job_Specifics.JobId and DateDueToStart >= '2012-10-13' and DateDueToStart < '2012-11-07' AND ExtraData1 LIKE '%pdi1%' order by wfo_BatchTable.JobID
    Duration: 708   CPU: 515   Reads: 395244

    EventClass: RPC:Completed
    TextData: exec vtm_insFormDataItem @TableName=N'pdn_FormData',@FormId=332834,@DataItemName='ItemId',@DataItemValue=N'332141'
    Duration: 122   CPU: 437   Reads: 157909

    EventClass: RPC:Completed
    TextData: exec vtm_GetSimpleMatrix2Data @TableName=N'pdn_FormData',@CriteriaName=N'ParentId',@CriteriaValue=N'332533'
    Duration: 434   CPU: 1704   Reads: 315806

    EventClass: RPC:Completed
    TextData: exec vtm_getComplexMatrix2Data @TableName=N'pps_FormData',@CriteriaName=N'ParentId,VersionStatus',@CriteriaValue=N'70,Current'
    Duration: 3591   CPU: 3594   Reads: 81188

    EventClass: SQL:BatchCompleted
    TextData: Select wfo_BatchTable.*, job_Specifics.*, prd_ProductTable.Product_Name from wfo_BatchTable, job_Specifics, prd_ProductTable where wfo_BatchTable.ParentId = 0 AND wfo_BatchTable.Programid in ( 4) and wfo_BatchTable.ProductId = prd_ProductTable.ProductID and wfo_BatchTable.JobId = job_Specifics.JobId and DateDueToStart >= '2012-09-20' and DateDueToStart < '2012-11-05' order by wfo_BatchTable.JobID desc
    Duration: 11010   CPU: 7625   Reads: 5908159

    EventClass: SQL:BatchCompleted
    TextData: Select wfo_BatchTable.*, job_Specifics.*, prd_ProductTable.Product_Name from wfo_BatchTable, job_Specifics, prd_ProductTable where wfo_BatchTable.ParentId = 0 AND wfo_BatchTable.Programid in ( 4) and wfo_BatchTable.ProductId = prd_ProductTable.ProductID and wfo_BatchTable.JobId = job_Specifics.JobId and DateDueToStart >= '2012-09-20' and DateDueToStart < '2012-11-05' order by wfo_BatchTable.JobID desc
    Duration: 6470   CPU: 6468   Reads: 5894163

    EventClass: SQL:BatchCompleted
    TextData: Select wfo_BatchTable.*, job_Specifics.*, prd_ProductTable.Product_Name from wfo_BatchTable, job_Specifics, prd_ProductTable where wfo_BatchTable.ParentId = 0 AND wfo_BatchTable.Programid in ( 4) and wfo_BatchTable.ProductId = prd_ProductTable.ProductID and wfo_BatchTable.JobId = job_Specifics.JobId and DateDueToStart >= '2012-09-27' and DateDueToStart < '2012-11-01' order by wfo_BatchTable.JobID desc
    Duration: 6250   CPU: 5344   Reads: 4750021

    EventClass: SQL:BatchCompleted
    TextData: Select wfo_BatchTable.*, job_Specifics.*, prd_ProductTable.Product_Name from wfo_BatchTable, job_Specifics, prd_ProductTable where wfo_BatchTable.ParentId = 0 AND wfo_BatchTable.Programid in ( 4) and wfo_BatchTable.ProductId = prd_ProductTable.ProductID and wfo_BatchTable.JobId = job_Specifics.JobId and DateDueToStart >= '2012-10-11' and DateDueToStart < '2012-11-05' order by wfo_BatchTable.JobID desc
    Duration: 4904   CPU: 2750   Reads: 2118114

    EventClass: SQL:BatchCompleted
    TextData: Select wfo_BatchTable.*, job_Specifics.*, prd_ProductTable.Product_Name from wfo_BatchTable, job_Specifics, prd_ProductTable where wfo_BatchTable.ParentId <> 0 AND wfo_BatchTable.Programid in ( 4) and wfo_BatchTable.ProductId = prd_ProductTable.ProductID and wfo_BatchTable.JobId = job_Specifics.JobId and DateDueToStart >= '2012-10-11' and DateDueToStart < '2012-11-05' order by wfo_BatchTable.JobID
    Duration: 6814   CPU: 4266   Reads: 3125306


Comments

  • Registered Users, Registered Users 2 Posts: 230 ✭✭bellylint


    Hi Joe,
    You're asking quite a broad question there overall. What indexes do you have on the tables? This can make a major difference.
    You can have a look at the query execution plan and see where the majority of the time is being taken.
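
    For example, in Management Studio you could turn on IO/time statistics (and Include Actual Execution Plan, Ctrl+M) and re-run one of the statements from your trace above - a rough sketch:

        SET STATISTICS IO ON;    -- reports logical/physical reads per table
        SET STATISTICS TIME ON;  -- reports CPU and elapsed time

        SELECT wfo_BatchTable.*, job_Specifics.*, prd_ProductTable.Product_Name
        FROM wfo_BatchTable, job_Specifics, prd_ProductTable
        WHERE wfo_BatchTable.ParentId = 0
          AND wfo_BatchTable.Programid IN (4)
          AND wfo_BatchTable.ProductId = prd_ProductTable.ProductID
          AND wfo_BatchTable.JobId = job_Specifics.JobId
          AND DateDueToStart >= '2012-10-13' AND DateDueToStart < '2012-11-07'
        ORDER BY wfo_BatchTable.JobID DESC;

        SET STATISTICS IO OFF;
        SET STATISTICS TIME OFF;

    The Messages tab will then show which of the three tables accounts for most of the reads.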


  • Registered Users, Registered Users 2 Posts: 2,781 ✭✭✭amen


    Joe,

    Database tuning and performance monitoring is a massive area. Unfortunately most developers ignore the DB until the application is running very slowly.

    As a start check the tables involved in your query and ensure that you have appropriate Primary Keys and Indexes.

    Appropriate indexes are on columns which are used in the SQL statements and are sufficiently selective to be worth indexing, e.g. UserId, OrderId. A column such as Sex (Male/Female) is not a good idea for an index.
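
    As a rough illustration only (the index name and column choices are just a guess based on the WHERE clause in your trace, and since the DB is vendor-supported you should confirm with them before adding anything), an index supporting those batch queries might look something like:

        CREATE NONCLUSTERED INDEX IX_wfo_BatchTable_Programid_DateDueToStart  -- hypothetical name
            ON wfo_BatchTable (Programid, DateDueToStart)                     -- columns the query filters on
            INCLUDE (ParentId, ProductId, JobId);                             -- columns used in the joins/filters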

    Although you may have indexes, they may be stale/out of date. For instance, an index could have been added when there were 10 rows in the table and the table now has 6 million rows, so the index is no longer much use. You can drop and rebuild the index (though this can take time and should be performed during the night).

    Once the indexes have been rebuilt you would need to recompile your SPs to pick up the new indexes. Generally you should recompile the SPs after the indexes have been rebuilt (again at night). Depending on your system you may need to recompile SPs more regularly, but as your DB is only 2GB you should be fine.
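
    A sketch of those two steps, using one of the tables from your trace as the example:

        ALTER INDEX ALL ON wfo_BatchTable REBUILD;   -- rebuild every index on the table
        EXEC sp_recompile N'wfo_BatchTable';         -- mark SPs/triggers that reference it for recompilation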

    Before doing anything though, MS SQL Server has several performance reports/views that tell you which tables are missing indexes, which SPs are performing slowly, etc. You should Google these. (Too many indexes can be bad.)
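
    For example, one way of querying the built-in missing-index information - treat the suggestions as hints rather than instructions, for exactly the "too many indexes" reason above:

        SELECT TOP 20
               mid.statement            AS table_name,
               mid.equality_columns,
               mid.inequality_columns,
               mid.included_columns,
               migs.user_seeks,
               migs.avg_user_impact
        FROM   sys.dm_db_missing_index_details AS mid
               JOIN sys.dm_db_missing_index_groups AS mig
                    ON mig.index_handle = mid.index_handle
               JOIN sys.dm_db_missing_index_group_stats AS migs
                    ON migs.group_handle = mig.index_group_handle
        ORDER BY migs.user_seeks * migs.avg_user_impact DESC;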

    You should also look at the fill factor and index fragmentation.
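
    A quick sketch for checking fragmentation in the current database (SQL 2005 or later):

        SELECT OBJECT_NAME(ips.object_id)     AS table_name,
               i.name                         AS index_name,
               ips.avg_fragmentation_in_percent,
               ips.page_count
        FROM   sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
               JOIN sys.indexes AS i
                    ON i.object_id = ips.object_id AND i.index_id = ips.index_id
        WHERE  ips.page_count > 100            -- ignore tiny indexes
        ORDER BY ips.avg_fragmentation_in_percent DESC;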

    What type of disk are you using and what's the disk I/O like? This can also impact performance.

    One immediate thing you can do is open a query window, select Include Actual Execution Plan under the Query menu, and run your SP.

    This will provide a good graphical explanation of what your SP is doing.

    Do you have extensive use of views?

    Is AutoStatistics turned on? (You may not need this.)

    Have you checked your SQL Server logs to see if there are any interesting messages in there?

    Do you have deadlock logging set up in SQL?
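
    If not, a minimal way to get deadlock detail written to the SQL Server error log is trace flag 1222 (1204 is the older, terser format):

        DBCC TRACEON (1222, -1);   -- -1 enables the flag server-wide until the next restart

    To make it survive restarts you would add -T1222 to the SQL Server startup parameters instead.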

    Do your SPs/SQL normally access the tables in the same order? (Different SPs accessing the same tables in different orders can cause issues.)

    What type of locking are you using?

    If you have more specific questions post them.

    Be careful what you change though as a small change can have a big impact both good and bad.


  • Registered Users, Registered Users 2 Posts: 2,781 ✭✭✭amen


    Select * as in wfo_BatchTable.* is generally not a good idea.


  • Registered Users, Registered Users 2 Posts: 94 ✭✭joe2687


    Hi Guys, thanks for the replies

    As stated above, my experience with SQL is minimal, so a lot of that was over my head to be honest. The database and application are supported by a third-party company, and I have arranged for an engineer on site to take a look at this, so I really just wanted a broad overview to know what I was talking about.

    I will run some of the performance reports from within SQL to see what they are saying re indexes. But to answer some of your questions:

    - I don't have extensive use of Views
    - Nothing in the Server Logs
    - Deadlock SQL logging is not set up - I should look into this now.


    Also, you said "Select * as in wfo_BatchTable.* is generally not a good idea" - why is that?

    Thanks.


  • Registered Users, Registered Users 2 Posts: 2,781 ✭✭✭amen


    In general, when a query uses SELECT * SQL Server has to return every column, so it will often scan the entire table rather than use an index to obtain the required data.

    When using SELECT Column1, Column2, etc. SQL Server can attempt to satisfy the query from the indexes and thus reduce the query time and the load (CPU, disk I/O, etc.) on the SQL Server.
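
    As an illustration using the query from your trace (the column list here is just a guess - pick whichever columns the page actually displays), the same statement with explicit columns and explicit JOIN syntax would look something like:

        SELECT wfo_BatchTable.JobId,
               wfo_BatchTable.DateDueToStart,
               prd_ProductTable.Product_Name      -- plus whatever other columns the UI really needs
        FROM   wfo_BatchTable
               JOIN prd_ProductTable ON wfo_BatchTable.ProductId = prd_ProductTable.ProductID
               JOIN job_Specifics    ON wfo_BatchTable.JobId     = job_Specifics.JobId
        WHERE  wfo_BatchTable.ParentId = 0
          AND  wfo_BatchTable.Programid IN (4)
          AND  DateDueToStart >= '2012-10-13' AND DateDueToStart < '2012-11-07'
        ORDER BY wfo_BatchTable.JobID DESC;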


  • Registered Users, Registered Users 2 Posts: 94 ✭✭joe2687


    amen wrote: »
    In general, when a query uses SELECT * SQL Server has to return every column, so it will often scan the entire table rather than use an index to obtain the required data.

    When using SELECT Column1, Column2, etc. SQL Server can attempt to satisfy the query from the indexes and thus reduce the query time and the load (CPU, disk I/O, etc.) on the SQL Server.


    OK, I understand. Maybe it was set up this way initially for a reason, but it could possibly be looked at.


    If trying to determine where bottlenecks may be happening, what reports would you suggest running in SQL?


  • Registered Users, Registered Users 2 Posts: 2,781 ✭✭✭amen


    http://msdn.microsoft.com/en-us/library/ms161561(v=sql.90).aspx

    http://www.databasejournal.com/features/mssql/article.php/3743596/SQL-Server-Management-Studio-Reports-and-Dashboard.htm

    Try these but there are loads of links on the web explaining what you can run.

    The reports need to be understood in the context of your DB, so don't take everything at face value.

    You can also write your own using the Dynamic Management Views (DMVs).

    http://msdn.microsoft.com/en-us/library/ms188754(v=sql.100).aspx

    The best thing to do is to constantly benchmark/performance-monitor the DB so you can spot trends over a period of time. At a minimum you should have 5-minute stats on disk I/O, CPU usage and wait stats.

    You should also be tracking long-running queries (>100ms), queries with high CPU usage, etc.
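
    A starting point - just a sketch, and note the times from this DMV are in microseconds - is to pull the heaviest statements out of the plan cache:

        SELECT TOP 10
               qs.total_worker_time / 1000  AS total_cpu_ms,
               qs.total_elapsed_time / 1000 AS total_elapsed_ms,
               qs.total_logical_reads,
               qs.execution_count,
               SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                         ((CASE qs.statement_end_offset
                               WHEN -1 THEN DATALENGTH(st.text)
                               ELSE qs.statement_end_offset
                           END - qs.statement_start_offset) / 2) + 1) AS statement_text
        FROM   sys.dm_exec_query_stats AS qs
               CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
        ORDER BY qs.total_worker_time DESC;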

    The script below can also be handy, but you should know what you are doing before making changes:

    http://www.brentozar.com/blitz/

    Do you have a backup? (Have you restored it to make sure it works?)

    Are your transaction logs backed up?
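
    If not, a minimal sketch (the database name and file paths are made up - substitute your own, and log backups only apply if the DB is in the FULL recovery model):

        -- Hypothetical names/paths for illustration only
        BACKUP DATABASE MyAppDb TO DISK = N'D:\Backups\MyAppDb_full.bak' WITH INIT;
        BACKUP LOG MyAppDb TO DISK = N'D:\Backups\MyAppDb_log.trn' WITH INIT;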

