"with sample xxx percent" is your friend ...

  • When faced with challenging SQL optimization issues, you get the usual refrain of advice that says: "make sure that the index statistics are updated".

    Now, I would like to add an extra clause to that advice: make sure that the updated stats are representative of your table as a whole.

    SQL Server will be happy with sampling just 1% of your table and figuring that it knows the density of a particular index. But that could be far from the truth, and that can result in a bad execution plan. This could also explain scenarios where queries that were fast yesterday suddenly become slow as the tables grow.

    So when updating statistics, make sure that you add the "with sample xxx percent" clause for the large tables.

    An aha that I came across that I thought I should share ...

    Anil
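
    A minimal T-SQL sketch of the advice above; the table and index names are placeholders, and the 50% rate is just an illustration:

    ```sql
    -- Force a larger sample when updating statistics, instead of
    -- letting SQL Server pick its own (often very small) sample rate.
    UPDATE STATISTICS dbo.BigTable
    WITH SAMPLE 50 PERCENT;

    -- Or scan every row for fully representative statistics --
    -- slower, but worth it when an index's density is skewed:
    UPDATE STATISTICS dbo.BigTable IX_BigTable_SomeColumn
    WITH FULLSCAN;
    ```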

  • Haven't seen that happen, but I guess it could happen if your data was skewed just so. I would think (but don't know for sure) that rebuilding the index would eliminate the problem, so you'd only have to worry about cases where a lot of data was added or updated that fell outside the stats you had AND that the stats missed when they updated.

    Andy

    http://qa.sqlservercentral.com/columnists/awarren/

  • OK, so it is gauche to reply to your own post. But here goes:

    All your good work in setting the right sampling for index statistics can be undone by a well-meaning DBA who sets up a regular script to call sp_dbupdatestats without the critical @resample='YES' parameter!
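
    For reference, a sketch of the safer maintenance call: in the documented form of the system procedure (sp_updatestats), passing @resample = 'resample' tells it to reuse each statistic's most recent sample rate rather than reverting to the default sampling:

    ```sql
    -- Preserve each statistic's existing sample rate (e.g. a FULLSCAN
    -- or an explicit SAMPLE percent set earlier) instead of letting
    -- the default sampling silently overwrite it.
    EXEC sp_updatestats @resample = 'resample';
    ```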
