Tuning the SQL query

  • I have created the statistics with this statement...

    Please let me know whether this is sufficient or not.

    CREATE STATISTICS transction_journa_stat
    ON tp.transaction_journal (transaction_id, OTHER_CASH_DRAWER_ID, STATUS, REVERSED_BY, total_value)
    WITH SAMPLE 100 PERCENT;
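
    As a quick check (a sketch, assuming the statistic name above), the histogram and density vector it produced can be inspected like this:

    -- Show the header, density vector, and histogram of the new statistic.
    -- Note that a multi-column statistic builds its histogram only on the
    -- leading column (transaction_id here); the remaining columns only
    -- contribute to the density vector.
    DBCC SHOW_STATISTICS ('tp.transaction_journal', transction_journa_stat);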

  • First of all, thank you for the information.

    The sys.dm_db_index_physical_stats query returns 2026 rows, and the reason we use the nvarchar data type is that we need to support characters other than English.

    I understand why you are using nvarchar, but the issue is that you are using nvarchar even for float, int, and datetime fields. My question was about that only.
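
    For illustration (a hypothetical table, not your actual schema), keeping native types and reserving nvarchar for real text avoids implicit conversions and row bloat:

    -- Hypothetical illustration only: numeric and temporal values in native
    -- types; nvarchar reserved for genuinely textual, Unicode data.
    CREATE TABLE tp.transaction_journal_example
    (
        transaction_id   int           NOT NULL,
        total_value      float         NOT NULL,
        created_datetime datetime      NOT NULL,
        description      nvarchar(200) NULL
    );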

    Regarding the output of that view, here is what you should execute. Post the results, especially page_count, record_count, forwarded_record_count, avg_page_space_used_in_percent, and avg_record_size_in_bytes:

    SELECT *
    FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('TRANSACTION_JOURNAL'), NULL, NULL, 'DETAILED')
    WHERE index_id = 0;

    CREATE STATISTICS transction_journa_stat
    ON tp.transaction_journal (transaction_id, OTHER_CASH_DRAWER_ID, STATUS, REVERSED_BY, total_value)
    WITH SAMPLE 100 PERCENT;

    Is that the only statistic you have on the table? Statistics are also created on the leading column of each index. Did you run UPDATE STATISTICS on your tables? If not, try running it on all the tables involved in the query; use a small sample if the tables are huge, otherwise try FULLSCAN (a full scan might take a long time, depending on table size).
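
    For example (a sketch using the table names mentioned in this thread; the schema of JOURNAL_CASH is an assumption):

    -- Full scan: most accurate statistics, but can take a long time on big tables.
    UPDATE STATISTICS tp.transaction_journal WITH FULLSCAN;

    -- Sampled: much faster on huge tables, at the cost of some accuracy.
    UPDATE STATISTICS tp.JOURNAL_CASH WITH SAMPLE 10 PERCENT;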

    If that doesn't work, try MAXDOP 1 as well, since parallelism is not giving you much benefit here: most of the time the actual work is being done by one thread while the other three are almost idle.
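
    A minimal sketch of forcing a serial plan (the column list and predicates below are placeholders drawn from this thread, not your full query):

    -- OPTION (MAXDOP 1) limits this statement to a single scheduler.
    SELECT TPTRJO1.transaction_id, TPTRJO1.total_value
    FROM tp.transaction_journal AS TPTRJO1
    WHERE TPTRJO1.STATUS NOT IN ('V', 'I', 'R')
      AND TPTRJO1.TOTAL_VALUE <> 0
    OPTION (MAXDOP 1);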

    GulliMeel

    Finding top n Worst Performing queries
    Improve the performance of Merge Join (special case)
    How to Post Performance Problem - Gail Shaw

  • You have two filter conditions whose columns don't have any statistics defined, so the optimizer falls back on default selectivity estimates, and that is what is causing the issue:

    AND ( TPTRJO1.STATUS NOT IN ( 'V', 'I', 'R' ) ) -- this usually estimates around 3%
    AND ( TPTRJO1.TOTAL_VALUE <> 0 )                -- I tried it once and it estimates around 92%

    Thus your estimates are way off. Say the optimizer estimated around 200K rows after applying the other conditions. It then sees these two predicates and applies the default selectivities: the STATUS filter at about 3% leaves roughly 200K x 0.03 = 6K estimated rows, and the TOTAL_VALUE filter at about 92% brings the estimate down to around 5.5K. But the actual row count is around 200K, about 40 times the estimate. Because the estimated rows are so low, the optimizer picks nested loop joins, which may not be the right choice when the actual rows are this high.
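
    One way to see the skew directly (a sketch that includes only the two predicates above) is to compare the EstimateRows and Rows columns reported by SET STATISTICS PROFILE:

    -- Report actual vs. estimated row counts for each plan operator.
    SET STATISTICS PROFILE ON;

    SELECT COUNT(*)
    FROM tp.transaction_journal AS TPTRJO1
    WHERE TPTRJO1.STATUS NOT IN ('V', 'I', 'R')
      AND TPTRJO1.TOTAL_VALUE <> 0;

    SET STATISTICS PROFILE OFF;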

    The other thing is that after joining the JOURNAL_CASH table you do not get a single row back. That table should therefore come earlier in the join order, so that most of the rows are eliminated at the start of the plan. Check the statistics on that table as well.
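
    A hypothetical way to test that ordering (the join key below is an assumption, since the full query is not shown here):

    -- Join JOURNAL_CASH first and pin the written order with FORCE ORDER
    -- to see whether eliminating rows early improves the plan.
    SELECT TPTRJO1.transaction_id
    FROM JOURNAL_CASH AS JC
    JOIN tp.transaction_journal AS TPTRJO1
        ON TPTRJO1.transaction_id = JC.transaction_id -- hypothetical join key
    WHERE TPTRJO1.STATUS NOT IN ('V', 'I', 'R')
      AND TPTRJO1.TOTAL_VALUE <> 0
    OPTION (FORCE ORDER);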

    Try updating the statistics on the columns in the WHERE clause and then see what the optimizer does. If it estimates correctly, you will most probably get a better plan.
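
    For example (a sketch; the statistic names here are made up):

    -- Single-column statistics on the filtered columns give the optimizer
    -- real distributions instead of the default guesses.
    CREATE STATISTICS stat_tj_status ON tp.transaction_journal (STATUS) WITH FULLSCAN;
    CREATE STATISTICS stat_tj_total_value ON tp.transaction_journal (TOTAL_VALUE) WITH FULLSCAN;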

    GulliMeel

    Finding top n Worst Performing queries
    Improve the performance of Merge Join (special case)
    How to Post Performance Problem - Gail Shaw
