How to Identify Slow Code

  • In a specific query, how can you easily identify how long each piece of code is taking? My query takes over 20 minutes, so I can't just comment out each line and see what improves it; I need a better method.

  • Look at the execution plan.

    Also run it with:

    SET STATISTICS IO, TIME ON
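
    For example (a minimal sketch; dbo.SalesOrder is a hypothetical table used purely for illustration), wrap the statements you want to measure and read the CPU time, elapsed time, and logical reads on the Messages tab in SSMS:

        SET STATISTICS IO, TIME ON;

        SELECT COUNT(*)
        FROM   dbo.SalesOrder            -- hypothetical table, for illustration only
        WHERE  OrderDate >= '20240101';  -- each statement now reports its own timings and I/O

        SET STATISTICS IO, TIME OFF;     -- turn it back off when you're done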

     

  • Another option is to use the great free tool SentryOne Plan Explorer:

    https://www.sentryone.com/plan-explorer

    It gives you a lot more information than the standard execution plan in Management Studio, including actual I/O and actual CPU time, and it lets you drill into each individual operation and the possible indexes that could satisfy it.

  • jakegordon1997 wrote:

    In a specific query, how can you easily identify how long each piece of code is taking? My query takes over 20 minutes, so I can't just comment out each line and see what improves it; I need a better method.

    The Actual Execution Plan can provide some hints, although it's not always "accurate" because of all the estimates it still contains even in "Actual" mode.

    For long-winded code, I always build it with a "Messaging" parameter: if I set it to something other than "0", it spits out the duration of each section (duration being the ultimate indication of problems so far as users are concerned) along with its row count. Of course, you have to modify the code to do that, but it'll help you find the real issue(s). A minimal sketch of that pattern is below.
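
    Here's a rough sketch of that "Messaging" idea (the procedure, table, and column names are made up purely for illustration; your real code would carry your own sections):

        CREATE OR ALTER PROCEDURE dbo.DoLongWindedThing
                @Messaging INT = 0                -- 0 = silent, anything else = report progress
        AS
        BEGIN
            SET NOCOUNT ON;

            DECLARE @SectionStart DATETIME2(3) = SYSDATETIME(),
                    @Rows         INT,
                    @ElapsedMs    INT;

            /* ---- Section 1: stage the rows we care about (placeholder work) ---- */
            SELECT  OrderID, OrderDate
            INTO    #Stage
            FROM    dbo.SalesOrder                -- hypothetical source table
            WHERE   OrderDate >= '20240101';

            SELECT  @Rows      = @@ROWCOUNT,      -- row count of the statement above
                    @ElapsedMs = DATEDIFF(ms, @SectionStart, SYSDATETIME());

            IF @Messaging <> 0
                RAISERROR('Section 1: %d rows, %d ms', 0, 1, @Rows, @ElapsedMs) WITH NOWAIT;

            SET @SectionStart = SYSDATETIME();    -- reset the timer for the next section

            /* ---- Section 2: the next chunk of work goes here, instrumented the same way ---- */
        END;

    Run it with @Messaging = 1 while tuning and the Messages tab shows where the time goes, section by section; leave it at 0 the rest of the time.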

     

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.
    "Change is inevitable... change for the better is not".

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
    Intro to Tally Tables and Functions

  • I assume we're talking about a batch here with multiple statements?

    In that case, put Extended Events to work. You can create a session using the sql_statement_completed event for batches, or sp_statement_completed for stored procedures. These will enable you to capture each individual statement of the batch and then you can tell how long each one took to run.

    Now, sp_statement_completed can easily be filtered by object_id; sql_statement_completed can't be filtered as easily. So be very cautious in how you use the session if you're looking at sql_statement_completed: turn it on, run the query, turn it off. You'll still have a lot of data. One way to simplify the data collection and consumption is to enable causality tracking in the session; then all the statements will be grouped. I have a bunch of examples doing exactly this kind of work on my blog, and a rough sketch of such a session is below.
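
    As a minimal sketch (the session name, file name, and session_id filter are placeholders you'd adjust for your own environment):

        CREATE EVENT SESSION [StatementTiming] ON SERVER
        ADD EVENT sqlserver.sql_statement_completed
        (
            ACTION (sqlserver.sql_text, sqlserver.session_id)
            WHERE  (sqlserver.session_id = 53)              -- hypothetical: limit capture to your own session
        )
        ADD TARGET package0.event_file
        (
            SET filename = N'StatementTiming.xel'           -- placeholder file name/path
        )
        WITH (TRACK_CAUSALITY = ON, STARTUP_STATE = OFF);   -- causality tracking groups the statements
        GO

        ALTER EVENT SESSION [StatementTiming] ON SERVER STATE = START;   -- turn it on
        -- ...run the slow batch here...
        ALTER EVENT SESSION [StatementTiming] ON SERVER STATE = STOP;    -- turn it off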

    Then, once you identify the offending statement, as everyone else has said, look at the execution plan. If you need help on that, look at the link in my signature.

    ----------------------------------------------------
    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood..." - Theodore Roosevelt
    The Scary DBA
    Author of: SQL Server 2017 Query Performance Tuning, 5th Edition and SQL Server Execution Plans, 3rd Edition
    Product Evangelist for Red Gate Software
