SQL 2000 - SET TRANSACTION ISOLATION LEVEL

  • SQL 2000 - SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED

    We have a large query that takes considerable time to run. To speed up response time, we are thinking of adding SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED to our stored procedures.

    Does anyone have any positive or negative experience using SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED?

    Thank you,

    Rajani

  • This means that shared locks will not be placed on the rows read by the query. In addition, you will be allowing dirty reads, which means your query may read uncommitted data!
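
    For illustration, here is a minimal sketch of what that looks like at the top of a stored procedure (the procedure name and the OrderHeader table are invented for this example):

    CREATE PROCEDURE usp_GetOpenOrders
    AS
        -- No shared locks are taken by the SELECT below, so it does not block
        -- behind writers, but it may return uncommitted (dirty) rows.
        SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED

        SELECT OrderID, CustomerID, OrderTotal
        FROM OrderHeader
        WHERE OrderStatus = 'OPEN'
    GO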


    Kindest Regards,

  • I'd never recommend it for a situation where it is critical that the data returned be accurate. However, I have recommended it previously for a web site that displayed live football scores. Scores were continually being updated, and the previously locked-out "tell me the score" queries were all changed to READ UNCOMMITTED. The result was a significant boost in performance.
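
    As a rough sketch of that pattern (the MatchScore table and its columns are invented for illustration):

    -- Scoreboard query: stale or dirty data is acceptable here; blocking is not.
    SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED

    SELECT HomeTeam, AwayTeam, HomeScore, AwayScore
    FROM MatchScore
    WHERE InProgress = 1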


    Cheers,
    - Mark

  • Thanks Mark.

    I think I'm getting the idea.

  • Just to throw another US$0.02 in on the subject...  I require that any job that performs mass updates use the READ UNCOMMITTED option in the job, and that a cursor be used to drive the job.  BEGIN/COMMIT can be placed around any updates/inserts that must be done (we do not permit deletes in production apps).

    Example:

    SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED
    SET NOCOUNT ON

    DECLARE {@variables dtype},...

    DECLARE {cursorname} INSENSITIVE CURSOR FOR {select stmt}

    OPEN {cursorname}
    FETCH NEXT FROM {cursorname} INTO {@variables}

    WHILE @@FETCH_STATUS = 0
    BEGIN
       ... processing here
       BEGIN TRANSACTION
       UPDATE...
       COMMIT WORK

       FETCH NEXT FROM {cursorname} INTO {@variables}
    END

    CLOSE {cursorname}
    DEALLOCATE {cursorname}

    This permits the use of transactions and locks needed for insert and update, but boosts performance when walking the cursor and handling other selects outside of the cursor.

    Note, as stated above: do this on data that you know isn't going to change, such as setting a default on a column that was not previously set. On tables where the data you are selecting into the cursor is constantly changing, use READ COMMITTED instead to ensure dirty data doesn't make it in (although this could slow things down a bit). A filled-in version of the template appears below.
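
    Filled in with hypothetical names (the Customer table and its Region column are invented for illustration), the template above might look like this:

    SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED
    SET NOCOUNT ON

    DECLARE @CustomerID int

    DECLARE cust_cur INSENSITIVE CURSOR FOR
        SELECT CustomerID FROM Customer WHERE Region IS NULL

    OPEN cust_cur
    FETCH NEXT FROM cust_cur INTO @CustomerID

    WHILE @@FETCH_STATUS = 0
    BEGIN
        -- Each change runs in its own short transaction, so write locks
        -- are held only briefly while the cursor walk stays lock-free.
        BEGIN TRANSACTION
        UPDATE Customer SET Region = 'UNKNOWN' WHERE CustomerID = @CustomerID
        COMMIT WORK

        FETCH NEXT FROM cust_cur INTO @CustomerID
    END

    CLOSE cust_cur
    DEALLOCATE cust_cur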

    Just a little tool for the kit

    Cheers,

    -- Joe

  • I use SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED in all my reporting where the data does not have to be up to the minute (which is most reports most of the time, if you think about it). You could also use WITH (NOLOCK) query hints on the individual tables, which amounts to the same thing but can be more granular; in other words, you could apply it to some tables but not all in your query.
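
    For example (the tables here are invented for illustration), the hint can be applied to just the heavily updated table:

    SELECT o.OrderID, o.OrderTotal, c.CustomerName
    FROM OrderHeader o WITH (NOLOCK)   -- dirty reads accepted on the busy table
    JOIN Customer c                    -- default isolation for this table
      ON c.CustomerID = o.CustomerID
    WHERE o.OrderDate >= '20040101'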

    The most important thing for rapid query performance, however, is good indexing and using joins that employ valid search arguments ("SARGs"). There is absolutely no substitute for these. I have been able to get up to 400% performance gains on queries involving millions of records just from index and WHERE clause tweaking.
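
    A quick sketch of the SARG idea (OrderHeader is again hypothetical): wrapping a column in a function hides it from the index, while a plain comparison on the bare column lets the optimizer seek:

    -- Not SARGable: the function on the column prevents an index seek
    SELECT OrderID FROM OrderHeader WHERE YEAR(OrderDate) = 2004

    -- SARGable rewrite: the bare column lets an index on OrderDate be used
    SELECT OrderID FROM OrderHeader
    WHERE OrderDate >= '20040101' AND OrderDate < '20050101'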

    See esp. Chapters 15-17 of Inside SQL Server 2000 by Kalen Delaney.

    Avoid cursors like the plague; if at all possible, use a WHILE loop instead, as in the sketch below.
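
    A minimal sketch of the cursor-free alternative, using the SQL 2000 SET ROWCOUNT idiom to process rows in batches (table and column names are invented for illustration):

    -- Update in batches of 1000 rows until none qualify; no cursor required
    SET ROWCOUNT 1000

    WHILE 1 = 1
    BEGIN
        UPDATE Customer SET Region = 'UNKNOWN' WHERE Region IS NULL
        IF @@ROWCOUNT = 0 BREAK
    END

    SET ROWCOUNT 0   -- restore the default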

    G. Milner

  • Thank you very much gmilner and Joe.
    You made very valuable comments.
