performance issue

  • Lexa (11/22/2011)


    So far "with recompile" seems to prevent massive issues with the proc where the server becomes unresponsive but it causes LCK_M_SCH_S locks becuase of the recompiles. Looks like it's not the best solution. If I were to take "with recompile" off, do I need to increase the frequency of Update Stats on all tables in the joins, to avoid bad executions plans? Thanks.

    Assuming the cause is bad statistics, yeah.

    If the cause is something else, such as bad parameter sniffing, you might look at some of the standard fixes for bad parameter sniffing such as using the OPTIMIZE FOR query hint.
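
    As a rough illustration of the OPTIMIZE FOR hint mentioned here, a minimal sketch follows; the procedure, table, columns and the value 42 are all invented for the example, not taken from this thread.

        CREATE PROCEDURE dbo.GetOrdersByCustomer
            @CustomerId INT
        AS
        BEGIN
            SELECT o.OrderId, o.OrderDate, o.TotalDue
            FROM dbo.Orders AS o
            WHERE o.CustomerId = @CustomerId
            -- Compile the plan as if @CustomerId were 42, regardless of the
            -- value actually passed in at run time.
            OPTION (OPTIMIZE FOR (@CustomerId = 42));
        END;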

  • Grant Fritchey (11/22/2011)


    Lexa (11/22/2011)


    So far "with recompile" seems to prevent massive issues with the proc where the server becomes unresponsive but it causes LCK_M_SCH_S locks becuase of the recompiles. Looks like it's not the best solution. If I were to take "with recompile" off, do I need to increase the frequency of Update Stats on all tables in the joins, to avoid bad executions plans? Thanks.

    Assuming the cause is bad statistics, yeah.

    If the cause is something else, such as bad parameter sniffing, you might look at some of the standard fixes for bad parameter sniffing such as using the OPTIMIZE FOR query hint.

    Any good articles on parameter sniffing? Found this one so far http://www.simple-talk.com/sql/t-sql-programming/parameter-sniffing/

  • Have there been any hardware changes recently?

    I only ask since I had a similar issue come up once, and it was after I upgraded the memory on the server from 4GB to 32GB. From what I understood (and I'm still a noob), SQL was able to cache the entire database in memory, which changed the execution plan, causing SQL to start pulling the entire table straight from memory instead of just a few rows from disk. Rewriting the proc fixed that issue.

  • Kudikai (11/22/2011)


    Have there been any hardware changes recently?

    I only ask since I had a similar issue come up once, and it was after I upgraded the memory on the server from 4GB to 32GB. From what I understood (and I'm still a noob), SQL was able to cache the entire database in memory, which changed the execution plan, causing SQL to start pulling the entire table straight from memory instead of just a few rows from disk. Rewriting the proc fixed that issue.

    No hardware changes, no service packs.

  • Grant Fritchey (11/22/2011)


    Lexa (11/22/2011)


    So far "with recompile" seems to prevent massive issues with the proc where the server becomes unresponsive but it causes LCK_M_SCH_S locks becuase of the recompiles. Looks like it's not the best solution. If I were to take "with recompile" off, do I need to increase the frequency of Update Stats on all tables in the joins, to avoid bad executions plans? Thanks.

    Assuming the cause is bad statistics, yeah.

    If the cause is something else, such as bad parameter sniffing, you might look at some of the standard fixes for bad parameter sniffing such as using the OPTIMIZE FOR query hint.

    Would setting every parameter passed to the proc to a local variable, to be used in all the underlying queries, resolve bad parameter sniffing?
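
    For reference, a minimal sketch of that local-variable workaround; all object and parameter names below are invented for illustration.

        CREATE PROCEDURE dbo.GetOrdersByCustomer_LocalVar
            @CustomerId INT
        AS
        BEGIN
            -- Copy the parameter into a local variable. The optimizer cannot
            -- sniff a local variable's value at compile time, so the plan is
            -- built from the column's density (average) statistics rather than
            -- the histogram step for the value that was actually passed in.
            DECLARE @LocalCustomerId INT;
            SET @LocalCustomerId = @CustomerId;

            SELECT o.OrderId, o.OrderDate, o.TotalDue
            FROM dbo.Orders AS o
            WHERE o.CustomerId = @LocalCustomerId;
        END;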

  • Lexa (11/22/2011)


    Grant Fritchey (11/22/2011)


    Lexa (11/22/2011)


    So far "with recompile" seems to prevent massive issues with the proc where the server becomes unresponsive but it causes LCK_M_SCH_S locks becuase of the recompiles. Looks like it's not the best solution. If I were to take "with recompile" off, do I need to increase the frequency of Update Stats on all tables in the joins, to avoid bad executions plans? Thanks.

    Assuming the cause is bad statistics, yeah.

    If the cause is something else, such as bad parameter sniffing, you might look at some of the standard fixes for bad parameter sniffing such as using the OPTIMIZE FOR query hint.

    Would setting every parameter passed to the proc to a local variable, to be used in all the underlying queries, resolve bad parameter sniffing?

    You have to define "resolve." It would result in all plans being based on samples of the statistics instead of specific plans based on seeks against the statistics. That is a solution if, and only if, the majority of your queries work better using sampled statistics. Most of the time, that's not true. You get a very good plan based on specific values going against the stats.

  • Lexa (11/22/2011)


    Grant Fritchey (11/22/2011)


    Lexa (11/22/2011)


    So far "with recompile" seems to prevent massive issues with the proc where the server becomes unresponsive but it causes LCK_M_SCH_S locks becuase of the recompiles. Looks like it's not the best solution. If I were to take "with recompile" off, do I need to increase the frequency of Update Stats on all tables in the joins, to avoid bad executions plans? Thanks.

    Assuming the cause is bad statistics, yeah.

    If the cause is something else, such as bad parameter sniffing, you might look at some of the standard fixes for bad parameter sniffing such as using the OPTIMIZE FOR query hint.

    Any good articles on parameter sniffing? Found this one so far http://www.simple-talk.com/sql/t-sql-programming/parameter-sniffing/

    I wrote a chapter for the MVP Deep Dives vol 2, all on parameter sniffing, good and bad.

  • Grant Fritchey (11/22/2011)


    Lexa (11/22/2011)


    Grant Fritchey (11/22/2011)


    Lexa (11/22/2011)


    So far "with recompile" seems to prevent massive issues with the proc where the server becomes unresponsive but it causes LCK_M_SCH_S locks becuase of the recompiles. Looks like it's not the best solution. If I were to take "with recompile" off, do I need to increase the frequency of Update Stats on all tables in the joins, to avoid bad executions plans? Thanks.

    Assuming the cause is bad statistics, yeah.

    If the cause is something else, such as bad parameter sniffing, you might look at some of the standard fixes for bad parameter sniffing such as using the OPTIMIZE FOR query hint.

    Would setting every parameter passed to the proc to a local variable, to be used in all the underlying queries, resolve bad parameter sniffing?

    You have to define "resolve." It would result in all plans being based on samples of the statistics instead of specific plans based on seeks against the statistics. That is a solution if, and only if, the majority of your queries work better using sampled statistics. Most of the time, that's not true. You get a very good plan based on specific values going against the stats.

    OK, sounds like OPTIMIZE FOR is the best solution for removing "with recompile" and still getting good plans. With 3 user defined tables as parameters, I assume literal_constant cannot be used with OPTIMIZE FOR, correct? Would using OPTIMIZE FOR UNKNOWN produce the same behavior as setting all parameters to local variables?

    I have an instance within the proc where I'm already setting one passed table to a local variable. I don't know if that might be the cause of bad plans. Not sure what information SQL Server uses when a user defined table is passed in, though. I promise to read your book in the near future :-)
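
    For reference, OPTIMIZE FOR takes either a literal or UNKNOWN for scalar parameters only, which matches the assumption above that it can't be pointed at a table-valued parameter. A minimal sketch of the UNKNOWN form (SQL Server 2008 and later), with invented names:

        SELECT o.OrderId, o.OrderDate
        FROM dbo.Orders AS o
        WHERE o.CustomerId = @CustomerId
        -- Ignore the sniffed value and optimize from density statistics,
        -- much like assigning the parameter to a local variable first.
        OPTION (OPTIMIZE FOR (@CustomerId UNKNOWN));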

  • I've seen this before, and the resolution was to issue an sp_recompile on the stored procedure being used. Even though the table was indexed, for some reason the stored procedure was not updated. I would make sure the table is indexed and then run the recompile. This has occurred in SQL 2005 & 2008.
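
    That step is a one-liner; the procedure name below is hypothetical.

        -- Mark the procedure's cached plan as invalid so it is recompiled
        -- on its next execution.
        EXEC sp_recompile N'dbo.GetOrdersByCustomer';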

  • Lexa (11/22/2011)


    Grant Fritchey (11/22/2011)


    Lexa (11/22/2011)


    Grant Fritchey (11/22/2011)


    Lexa (11/22/2011)


    So far "with recompile" seems to prevent massive issues with the proc where the server becomes unresponsive but it causes LCK_M_SCH_S locks becuase of the recompiles. Looks like it's not the best solution. If I were to take "with recompile" off, do I need to increase the frequency of Update Stats on all tables in the joins, to avoid bad executions plans? Thanks.

    Assuming the cause is bad statistics, yeah.

    If the cause is something else, such as bad parameter sniffing, you might look at some of the standard fixes for bad parameter sniffing such as using the OPTIMIZE FOR query hint.

    Would setting every parameter passed to the proc to a local variable, to be used in all the underlying queries, resolve bad parameter sniffing?

    You have to define "resolve." It would result in all plans being based on samples of the statistics instead of specific plans based on seeks against the statistics. That is a solution if, and only if, the majority of your queries work better using sampled statistics. Most of the time, that's not true. You get a very good plan based on specific values going against the stats.

    OK, sounds like OPTIMIZE FOR is the best solution for removing "with recompile" and still getting good plans. With 3 user defined tables as parameters, I assume literal_constant cannot be used with OPTIMIZE FOR, correct? Would using OPTIMIZE FOR UNKNOWN produce the same behavior as setting all parameters to local variables?

    I have an instance within the proc where I'm already setting one passed table to a local variable. I don't know if that might be the cause of bad plans. Not sure what information SQL Server uses when a user defined table is passed in, though. I promise to read your book in the near future :-)

    When you say user defined table, you mean a table variable? That might explain things. Those have no statistics, so SQL Server assumes one (1) row. For the most part, as long as you have only a few rows (few means ~>75 maybe, depending) then the plans generated are OK. But as soon as you get more rows, the plan is junk.

    Again, all just speculation not seeing the code or exec plans.

    OPTIMIZE FOR UNKNOWN provides a similar solution to using local variables. Instead of targeting a specific value with the statistics, you're sampling them. It's all a fun dance determining which plan is better most of the time, the sampled plan or the specific plan.
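
    A small sketch of that difference, using invented object names (dbo.Orders does not come from this thread) and the numbers table in master purely as sample data; comparing the estimated row counts in the two actual execution plans shows the effect.

        -- Table variable: no statistics, typically estimated at 1 row.
        DECLARE @Ids TABLE (Id INT PRIMARY KEY);
        INSERT INTO @Ids (Id)
        SELECT number FROM master.dbo.spt_values WHERE type = 'P'; -- 0..2047

        -- Temp table: gets real row counts and column statistics.
        CREATE TABLE #Ids (Id INT PRIMARY KEY);
        INSERT INTO #Ids (Id)
        SELECT number FROM master.dbo.spt_values WHERE type = 'P';

        SELECT o.OrderId FROM dbo.Orders AS o JOIN @Ids AS i ON i.Id = o.CustomerId;
        SELECT o.OrderId FROM dbo.Orders AS o JOIN #Ids AS i ON i.Id = o.CustomerId;

        DROP TABLE #Ids;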

  • Grant Fritchey (11/23/2011)


    Lexa (11/22/2011)


    Grant Fritchey (11/22/2011)


    Lexa (11/22/2011)


    Grant Fritchey (11/22/2011)


    Lexa (11/22/2011)


    So far "with recompile" seems to prevent massive issues with the proc where the server becomes unresponsive but it causes LCK_M_SCH_S locks becuase of the recompiles. Looks like it's not the best solution. If I were to take "with recompile" off, do I need to increase the frequency of Update Stats on all tables in the joins, to avoid bad executions plans? Thanks.

    Assuming the cause is bad statistics, yeah.

    If the cause is something else, such as bad parameter sniffing, you might look at some of the standard fixes for bad parameter sniffing such as using the OPTIMIZE FOR query hint.

    Would setting every parameter passed to the proc to a local variable, to be used in all the underlying queries, resolve bad parameter sniffing?

    You have to define "resolve." It would result in all plans being based on samples of the statistics instead of specific plans based on seeks against the statistics. That is a solution if, and only if, the majority of your queries work better using sampled statistics. Most of the time, that's not true. You get a very good plan based on specific values going against the stats.

    OK, sounds like OPTIMIZE FOR is the best solution for removing "with recompile" and still getting good plans. With 3 user defined tables as parameters, I assume literal_constant cannot be used with OPTIMIZE FOR, correct? Would using OPTIMIZE FOR UNKNOWN produce the same behavior as setting all parameters to local variables?

    I have an instance within the proc where I'm already setting one passed table to a local variable. I don't know if that might be the cause of bad plans. Not sure what information SQL Server uses when a user defined table is passed in, though. I promise to read your book in the near future :-)

    When you say user defined table, you mean a table variable? That might explain things. Those have no statistics, so SQL Server assumes one (1) row. For the most part, as long as you have only a few rows (few means ~>75 maybe, depending) then the plans generated are OK. But as soon as you get more rows, the plan is junk.

    Again, all just speculation not seeing the code or exec plans.

    OPTIMIZE FOR UNKNOWN provides a similar solution to using local variables. Instead of targeting a specific value with the statistics, you're sampling them. It's all a fun dance determining which plan is better most of the time, the sampled plan or the specific plan.

    Yes, table variables. I can try to use OPTIMIZE FOR UNKNOWN with those, or I can try setting all the table variables passed to the proc to some local variables and then use those in the joins. Any other options?

    Basic structure of the procedure is as follows:

    1. Pass in three tables, @table1, @table2 and @table3

    2. select distinct rows from @table1 into @loc_table

    3. select a number of fields from a big table (a few billion rows) joined to 2 smaller physical tables, joining those to @loc_table, @table2 and @table3

    4. UNION a number of fields from medium table (a few hundred million rows) joined to 2 smaller physical tables joined to @loc_table and @table3

    Thanks.
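
    Reading that structure literally, the skeleton would be roughly as follows; every type, table and column name here is invented, since the real code was not posted.

        CREATE TYPE dbo.IdList AS TABLE (Id INT NOT NULL PRIMARY KEY);
        GO
        CREATE PROCEDURE dbo.SearchFacts
            @table1 dbo.IdList READONLY,
            @table2 dbo.IdList READONLY,
            @table3 dbo.IdList READONLY
        AS
        BEGIN
            -- Step 2: de-duplicate the first input into a local table variable.
            DECLARE @loc_table TABLE (Id INT NOT NULL PRIMARY KEY);
            INSERT INTO @loc_table (Id)
            SELECT DISTINCT Id FROM @table1;

            -- Step 3: the multi-billion-row table joined to two small physical
            -- tables and to the table variables.
            SELECT f.Col1, f.Col2
            FROM dbo.BigFact AS f
            JOIN dbo.Small1 AS s1 ON s1.Id = f.Small1Id
            JOIN dbo.Small2 AS s2 ON s2.Id = f.Small2Id
            JOIN @loc_table AS t1 ON t1.Id = f.Key1
            JOIN @table2 AS t2 ON t2.Id = f.Key2
            JOIN @table3 AS t3 ON t3.Id = f.Key3

            UNION

            -- Step 4: the few-hundred-million-row table joined to two small
            -- physical tables, @loc_table and @table3.
            SELECT m.Col1, m.Col2
            FROM dbo.MediumFact AS m
            JOIN dbo.Small3 AS s3 ON s3.Id = m.Small3Id
            JOIN dbo.Small4 AS s4 ON s4.Id = m.Small4Id
            JOIN @loc_table AS t1 ON t1.Id = m.Key1
            JOIN @table3 AS t3 ON t3.Id = m.Key3;
        END;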

  • You need to convert those to #tmp tables, ideally with a PK, or at least with stats if a PK is not an option.
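
    One way to act on that suggestion inside the proc described above; the Id column is invented, and the real column list would come from the table type.

        -- Copy each table-valued parameter into a temp table with a primary key
        -- so the optimizer has real row counts and statistics to work with.
        CREATE TABLE #table1 (Id INT NOT NULL PRIMARY KEY);
        INSERT INTO #table1 (Id)
        SELECT DISTINCT Id FROM @table1;

        -- Repeat for @table2 and @table3, then join steps 3 and 4 to #table1,
        -- #table2 and #table3 instead of the table variables.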

  • Lexa (11/23/2011)


    Grant Fritchey (11/23/2011)


    Lexa (11/22/2011)


    Grant Fritchey (11/22/2011)


    Lexa (11/22/2011)


    Grant Fritchey (11/22/2011)


    Lexa (11/22/2011)


    So far "with recompile" seems to prevent massive issues with the proc where the server becomes unresponsive but it causes LCK_M_SCH_S locks becuase of the recompiles. Looks like it's not the best solution. If I were to take "with recompile" off, do I need to increase the frequency of Update Stats on all tables in the joins, to avoid bad executions plans? Thanks.

    Assuming the cause is bad statistics, yeah.

    If the cause is something else, such as bad parameter sniffing, you might look at some of the standard fixes for bad parameter sniffing such as using the OPTIMIZE FOR query hint.

    Would setting every parameter passed to the proc to a local variable, to be used in all the underlying queries, resolve bad parameter sniffing?

    You have to define "resolve." It would result in all plans being based on samples of the statistics instead of specific plans based on seeks against the statistics. That is a solution if, and only if, the majority of your queries work better using sampled statistics. Most of the time, that's not true. You get a very good plan based on specific values going against the stats.

    OK, sounds like OPTIMIZE FOR is the best solution for removing "with recompile" and still getting good plans. With 3 user defined tables as parameters, I assume literal_constant cannot be used with OPTIMIZE FOR, correct? Would using OPTIMIZE FOR UNKNOWN produce the same behavior as setting all parameters to local variables?

    I have an instance within the proc where I'm already setting one passed table to a local variable. I don't know if that might be the cause of bad plans. Not sure what information SQL Server uses when a user defined table is passed in, though. I promise to read your book in the near future :-)

    When you say user defined table, you mean a table variable? That might explain things. Those have no statistics, so SQL Server assumes one (1) row. For the most part, as long as you have only a few rows (few means ~>75 maybe, depending) then the plans generated are OK. But as soon as you get more rows, the plan is junk.

    Again, all just speculation not seeing the code or exec plans.

    OPTIMIZE FOR UNKNOWN provides a similar solution to using local variables. Instead of targeting a specific value with the statistics, you're sampling them. It's all a fun dance determining which plan is better most of the time, the sampled plan or the specific plan.

    Yes, table variables. I can try to use OPTIMIZE FOR UNKNOWN with those, or I can try setting all the table variables passed to the proc to some local variables and then use those in the joins. Any other options?

    Basic structure of the procedure is as follows:

    1. Pass in three tables, @table1, @table2 and @table3

    2. select distinct rows from @table1 into @loc_table

    3. select a number of fields from a big table (a few billion rows) joined to 2 smaller physical tables, joining those to @loc_table, @table2 and @table3

    4. UNION a number of fields from medium table (a few hundred million rows) joined to 2 smaller physical tables joined to @loc_table and @table3

    Thanks.

    And it's Steps 3 & 4 that will bite you hard. Because the table variables have no statistics, JOIN and UNION and WHERE operations against them are problematic. Query hints such as OPTIMIZE FOR aren't going to help you at all for these values.

    How many rows typically are coming into the three table variables? If it's only a few hundred or a couple of thousand, you might consider loading them from the table variables into temp tables to get the statistics, but if it's a very small amount or a very large amount, that added step might be too costly. I'd drop any concerns about parameter sniffing and focus on testing what can be done about the table variables (if anything can be, or needs to be).

  • Grant Fritchey (11/23/2011)


    Lexa (11/23/2011)


    Grant Fritchey (11/23/2011)


    Lexa (11/22/2011)


    Grant Fritchey (11/22/2011)


    Lexa (11/22/2011)


    Grant Fritchey (11/22/2011)


    Lexa (11/22/2011)


    So far "with recompile" seems to prevent massive issues with the proc where the server becomes unresponsive but it causes LCK_M_SCH_S locks becuase of the recompiles. Looks like it's not the best solution. If I were to take "with recompile" off, do I need to increase the frequency of Update Stats on all tables in the joins, to avoid bad executions plans? Thanks.

    Assuming the cause is bad statistics, yeah.

    If the cause is something else, such as bad parameter sniffing, you might look at some of the standard fixes for bad parameter sniffing such as using the OPTIMIZE FOR query hint.

    Would setting every parameter passed to the proc to a local variable, to be used in all the underlying queries, resolve bad parameter sniffing?

    You have to define "resolve." It would result in all plans being based on samples of the statistics instead of specific plans based on seeks against the statistics. That is a solution if, and only if, the majority of your queries work better using sampled statistics. Most of the time, that's not true. You get a very good plan based on specific values going against the stats.

    OK, sounds like OPTIMIZE FOR is the best solution for removing "with recompile" and still getting good plans. With 3 user defined tables as parameters, I assume literal_constant cannot be used with OPTIMIZE FOR, correct? Would using OPTIMIZE FOR UNKNOWN produce the same behavior as setting all parameters to local variables?

    I have an instance within the proc where I'm already setting one passed table to a local variable. I don't know if that might be the cause of bad plans. Not sure what information SQL Server uses when a user defined table is passed in, though. I promise to read your book in the near future :-)

    When you say user defined table, you mean a table variable? That might explain things. Those have no statistics, so SQL Server assumes one (1) row. For the most part, as long as you have only a few rows (few means ~>75 maybe, depending) then the plans generated are OK. But as soon as you get more rows, the plan is junk.

    Again, all just speculation not seeing the code or exec plans.

    OPTIMIZE FOR UNKNOWN provides a similar solution to using local variables. Instead of targeting a specific value with the statistics, you're sampling them. It's all a fun dance determining which plan is better most of the time, the sampled plan or the specific plan.

    Yes, table variables. I can try to use OPTIMIZE FOR UNKNOWN with those, or I can try setting all the table variables passed to the proc to some local variables and then use those in the joins. Any other options?

    Basic structure of the procedure is as follows:

    1. Pass in three tables, @table1, @table2 and @table3

    2. select distinct rows from @table1 into @loc_table

    3. select a number of fields from a big table (a few billion rows) joined to 2 smaller physical tables, joining those to @loc_table, @table2 and @table3

    4. UNION a number of fields from medium table (a few hundred million rows) joined to 2 smaller physical tables joined to @loc_table and @table3

    Thanks.

    And it's Steps 3 & 4 that will bite you hard. Because the table variables have no statistics, JOIN and UNION and WHERE operations against them are problematic. Query hints such as OPTIMIZE FOR aren't going to help you at all for these values.

    How many rows typically are coming into the three table variables? If it's only a few hundred or a couple of thousand, you might consider loading them from the table variables into temp tables to get the statistics, but if it's a very small amount or a very large amount, that added step might be too costly. I'd drop any concerns about parameter sniffing and focus on testing what can be done about the table variables (if anything can be, or needs to be).

    Got it, thanks all.
