performance issue

  • A proc that usually takes less than a second to complete (has been in production for 2 years) all of a sudden takes 300+ seconds and attempts to read 300mm records where usually it is 300K. No changes have been made to the procedure recently. I don't see any fragmentation on the tables it queries, no scans, and the missing indexes query gives no suggestions for these tables. The issue is random and comes up 3-5 times during a 24 hour period. Stats on the tables get updated nightly.

    Tried recompiling the proc when the issue was occurring, but the CPU was maxed out and that still did not help at the time. Any suggestions? Can't paste the proc here, sorry.
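
    One way to catch the runaway execution in flight is to query the request DMVs while it's happening. This is only a sketch; `usp_MyProc` is a placeholder for the actual procedure name, which wasn't posted.

    ```sql
    -- Find currently running requests touching the proc, with their read counts.
    -- usp_MyProc is a placeholder; substitute the real procedure name.
    SELECT der.session_id,
           der.start_time,
           der.cpu_time,
           der.logical_reads,
           dest.text
    FROM sys.dm_exec_requests AS der
    CROSS APPLY sys.dm_exec_sql_text(der.sql_handle) AS dest
    WHERE dest.text LIKE '%usp_MyProc%';
    ```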

  • Is it consistently slow or slow at random times? Slow for specific parameter values and fast for others or slow for all?

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • GilaMonster (11/19/2011)


    Is it consistently slow or slow at random times? Slow for specific parameter values and fast for others or slow for all?

    Random times. I captured a trace when this was going on where it took 300 sec to execute while trying to perform 300mm reads. After dropping all connections and rerunning the same trace, it took half a second with 2K reads. So far I'm thinking it's a bad execution plan. But this proc hasn't had issues in the past running without recompile.

  • Try getting the execution plan when it's slow and when it's fast. Compare the two. It could be bad parameter sniffing.

    But to be honest, it sounds like contention. Have you looked at blocking processes?
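
    A quick way to check for blocking while the slow execution is in progress (a sketch using the standard request DMV):

    ```sql
    -- Any non-zero blocking_session_id means that session is waiting
    -- on locks held by another session.
    SELECT session_id, blocking_session_id, wait_type, wait_time
    FROM sys.dm_exec_requests
    WHERE blocking_session_id <> 0;
    ```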

    ----------------------------------------------------
    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood..." Theodore Roosevelt
    The Scary DBA
    Author of: SQL Server 2017 Query Performance Tuning, 5th Edition and SQL Server Execution Plans, 3rd Edition
    Product Evangelist for Red Gate Software

  • Lexa (11/19/2011)


    GilaMonster (11/19/2011)


    Is it consistently slow or slow at random times? Slow for specific parameter values and fast for others or slow for all?

    Random times. I captured a trace when this was going on where it took 300 sec to execute while trying to perform 300mm reads. After dropping all connections and rerunning the same trace, it took half a second with 2K reads. So far I'm thinking it's a bad execution plan. But this proc hasn't had issues in the past running without recompile.

    What about the parameter values?

    The massive difference in reads implies there's some form of sub-optimal plan, but without seeing the procedure or the plan it's going to be hard to be more specific than that.

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • Grant Fritchey (11/20/2011)


    Try getting the execution plan when it's slow and when it's fast. Compare the two. It could be bad parameter sniffing.

    But to be honest, it sounds like contention. Have you looked at blocking processes?

    Yes I have, no blocking at all. Unfortunately I can't get the execution plan when it is slow because this issue brings the entire server down. At times it becomes completely unresponsive.

  • GilaMonster (11/20/2011)


    Lexa (11/19/2011)


    GilaMonster (11/19/2011)


    Is it consistently slow or slow at random times? Slow for specific parameter values and fast for others or slow for all?

    Random times. I captured a trace when this was going on where it took 300 sec to execute while trying to perform 300mm reads. After dropping all connections and rerunning the same trace, it took half a second with 2K reads. So far I'm thinking it's a bad execution plan. But this proc hasn't had issues in the past running without recompile.

    What about the parameter values?

    The massive difference in reads implies there's some form of sub-optimal plan, but without seeing the procedure or the plan it's going to be hard to be more specific than that.

    I captured parameter values in a trace and ran the trace when the server was stable. It took less than a second to complete. Haven't seen this issue ever since I added "with recompile", so I guess it was a sub-optimal plan.
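
    For reference, proc-level recompile looks like the following. This is only an illustrative sketch; the procedure name, parameter, and body are placeholders, since the real proc wasn't posted.

    ```sql
    -- WITH RECOMPILE: a fresh plan is compiled on every execution
    -- and never cached, so no stale plan can be reused.
    CREATE PROCEDURE dbo.usp_MyProc
        @CustomerId int
    WITH RECOMPILE
    AS
    SELECT o.OrderId, o.OrderDate
    FROM dbo.Orders AS o
    WHERE o.CustomerId = @CustomerId;
    ```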

  • Lexa (11/20/2011)


    Grant Fritchey (11/20/2011)


    Try getting the execution plan when it's slow and when it's fast. Compare the two. It could be bad parameter sniffing.

    But to be honest, it sounds like contention. Have you looked at blocking processes?

    Yes I have, no blocking at all. Unfortunately I can't get the execution plan when it is slow because this issue brings the entire server down. At times it becomes completely unresponsive.

    If it does happen again, try pulling the plan straight out of cache. You don't need to re-execute it. Just use the dynamic management views.

    ----------------------------------------------------
    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood..." Theodore Roosevelt
    The Scary DBA
    Author of: SQL Server 2017 Query Performance Tuning, 5th Edition and SQL Server Execution Plans, 3rd Edition
    Product Evangelist for Red Gate Software

  • Grant Fritchey (11/21/2011)


    Lexa (11/20/2011)


    Grant Fritchey (11/20/2011)


    Try getting the execution plan when it's slow and when it's fast. Compare the two. It could be bad parameter sniffing.

    But to be honest, it sounds like contention. Have you looked at blocking processes?

    Yes I have, no blocking at all. Unfortunately I can't get the execution plan when it is slow because this issue brings the entire server down. At times it becomes completely unresponsive.

    If it does happen again, try pulling the plan straight out of cache. You don't need to re-execute it. Just use the dynamic management views.

    Thanks, do you have a query for that dmv by any chance?

    Also, does an update statistics force a recompile on all procedures that use the table for which update stats runs?

  • Lexa (11/21/2011)


    Grant Fritchey (11/21/2011)


    Lexa (11/20/2011)


    Grant Fritchey (11/20/2011)


    Try getting the execution plan when it's slow and when it's fast. Compare the two. It could be bad parameter sniffing.

    But to be honest, it sounds like contention. Have you looked at blocking processes?

    Yes I have, no blocking at all. Unfortunately I can't get the execution plan when it is slow because this issue brings the entire server down. At times it becomes completely unresponsive.

    If it does happen again, try pulling the plan straight out of cache. You don't need to re-execute it. Just use the dynamic management views.

    Thanks, do you have a query for that dmv by any chance?

    Also, does an update statistics force a recompile on all procedures that use the table for which update stats runs?

    If the statistics get updated, yeah, you're going to get a recompile. That happens when they update automatically too.

    Really simple query. Tweak as needed:

    SELECT *
    FROM sys.dm_exec_requests AS der
    CROSS APPLY sys.dm_exec_query_plan(der.plan_handle) AS deqp
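
    A variation on the query above, in case the plan is wanted after the fact rather than mid-execution: cached plans can also be pulled from the query stats DMV. This is a sketch; `usp_MyProc` is a placeholder for the actual procedure name.

    ```sql
    -- Pull the cached plan for a proc that is no longer running.
    -- usp_MyProc is a placeholder; substitute the real procedure name.
    SELECT deqp.query_plan
    FROM sys.dm_exec_query_stats AS deqs
    CROSS APPLY sys.dm_exec_query_plan(deqs.plan_handle) AS deqp
    CROSS APPLY sys.dm_exec_sql_text(deqs.sql_handle) AS dest
    WHERE dest.text LIKE '%usp_MyProc%';
    ```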

    ----------------------------------------------------
    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood..." Theodore Roosevelt
    The Scary DBA
    Author of: SQL Server 2017 Query Performance Tuning, 5th Edition and SQL Server Execution Plans, 3rd Edition
    Product Evangelist for Red Gate Software

  • Grant Fritchey (11/21/2011)


    Lexa (11/21/2011)


    Grant Fritchey (11/21/2011)


    Lexa (11/20/2011)


    Grant Fritchey (11/20/2011)


    Try getting the execution plan when it's slow and when it's fast. Compare the two. It could be bad parameter sniffing.

    But to be honest, it sounds like contention. Have you looked at blocking processes?

    Yes I have, no blocking at all. Unfortunately I can't get the execution plan when it is slow because this issue brings the entire server down. At times it becomes completely unresponsive.

    If it does happen again, try pulling the plan straight out of cache. You don't need to re-execute it. Just use the dynamic management views.

    Thanks, do you have a query for that dmv by any chance?

    Also, does an update statistics force a recompile on all procedures that use the table for which update stats runs?

    If the statistics get updated, yeah, you're going to get a recompile. That happens when they update automatically too.

    Really simple query. Tweak as needed:

    SELECT *
    FROM sys.dm_exec_requests AS der
    CROSS APPLY sys.dm_exec_query_plan(der.plan_handle) AS deqp

    Thanks again. What is the main reason for SQL Server to try to execute a bad query plan? I'm puzzled as to why this started to happen now.

  • Lexa (11/21/2011)


    Thanks again. What is the main reason for SQL Server to try to execute a bad query plan? I'm puzzled as to why this started to happen now.

    Because it thinks the plan is good. Typically the plan is either bad at compile time because of incorrect estimates (mostly from statistics), or it is good at compile time but gets reused in cases where it's not a good plan.
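
    The reuse scenario described above is classic parameter sniffing. One common statement-level mitigation is shown below; this is only an illustrative sketch (table and column names are placeholders, and OPTIMIZE FOR UNKNOWN requires SQL Server 2008 or later).

    ```sql
    -- Compile the plan for the statistical average case instead of
    -- sniffing the first parameter value the proc happens to be called with.
    SELECT o.OrderId, o.OrderDate
    FROM dbo.Orders AS o
    WHERE o.CustomerId = @CustomerId
    OPTION (OPTIMIZE FOR UNKNOWN);
    ```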

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • SQL Server can't really tell the difference between a good and a bad plan.

    It's all about estimates and guesses: "here's what I think is good enough to run."

    I'm sure I'm repeating someone else, but we can't help debug this without comparing the "good" and "bad" plans side by side (actual plans, not estimated).

  • It's as the Ninja and Gail say: a plan is neither good nor bad, per se. It's the statistics, the structure, and your query that drive what kind of plan will be created. The plan is created by the optimizer based on a number of seriously deep scientific and mathematical principles that I'm not even going to pretend to understand. Based on these, it comes up with a query plan. For good or for ill, GIGO still applies here as with so much else in life. If your code, structure, or stats are off, then so will the plan be.

    ----------------------------------------------------
    "The credit belongs to the man who is actually in the arena, whose face is marred by dust and sweat and blood..." Theodore Roosevelt
    The Scary DBA
    Author of: SQL Server 2017 Query Performance Tuning, 5th Edition and SQL Server Execution Plans, 3rd Edition
    Product Evangelist for Red Gate Software

  • So far "with recompile" seems to prevent massive issues with the proc where the server becomes unresponsive, but it causes LCK_M_SCH_S locks because of the recompiles. Looks like it's not the best solution. If I were to take "with recompile" off, would I need to increase the frequency of Update Stats on all tables in the joins to avoid bad execution plans? Thanks.
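
    One alternative worth trying before increasing the stats-update frequency: move the recompile hint from the procedure level down to just the problem statement inside the proc. This is a sketch (names are placeholders, since the real proc wasn't posted); recompiling a single statement is usually much cheaper than recompiling the whole procedure on every call.

    ```sql
    -- OPTION (RECOMPILE) on one statement: only this statement gets a
    -- fresh plan per execution; the rest of the proc's plan stays cached.
    SELECT o.OrderId, o.OrderDate
    FROM dbo.Orders AS o
    WHERE o.CustomerId = @CustomerId
    OPTION (RECOMPILE);
    ```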

Viewing 15 posts - 1 through 15 (of 28 total)
