Cursor Causing Performance Issues

  • We have a multi-page T-SQL batch script that runs for a long time, and we have identified that the part of that script that uses a cursor is taking nearly 90% of that time.

    Following is the scrubbed code for the Cursor:


    DECLARE UserCursor CURSOR FAST_FORWARD FOR SELECT Col1,Col2 FROM #Temp2 group by Col1,Col2

    OPEN UserCursor

    FETCH NEXT FROM UserCursor INTO @Var1,@Var2

    while @@FETCH_STATUS = 0

    BEGIN

    EXEC @return_value = [dbo].[User_sp]

    @InputParameter1 = @Var2,

    @InputParameter2 = @Var1,

    @OutputParameter1 = @Var3 OUTPUT,

    @Return_Code = @Return_Code OUTPUT

    UPDATE #Temp2 SET Col3 = @Var3 WHERE Col1 = @Var1 AND Col2 = @Var2

    FETCH NEXT FROM UserCursor INTO @Var1,@Var2

    END

    CLOSE UserCursor

    DEALLOCATE UserCursor

    SELECT 'Return Value' = @Return_Code

    As seen above, the rows in the #Temp2 table are updated one at a time, based on the output of the stored procedure, via the cursor. #Temp2 has millions of rows, which is what is causing the performance issues.

    One alternative I can think of is to use the cursor itself in the update statement, like this:

    UPDATE #Temp2 SET Col3 = @Var3 WHERE CURRENT OF UserCursor;
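
    (For WHERE CURRENT OF to be accepted, I guess the cursor itself would also have to be updatable, so the FAST_FORWARD option and the GROUP BY would have to go. A rough, untested sketch of what I mean, with the same names as above:)

    -- Sketch only: a plain (non-FAST_FORWARD) cursor with no GROUP BY is updatable,
    -- so the positioned update below is allowed. Note this loops once per row of #Temp2,
    -- not once per distinct (Col1, Col2) pair.
    DECLARE UserCursor CURSOR LOCAL
        FOR SELECT Col1, Col2 FROM #Temp2;

    -- ...same OPEN / FETCH / EXEC [dbo].[User_sp] loop as above, but the update becomes:
    UPDATE #Temp2 SET Col3 = @Var3 WHERE CURRENT OF UserCursor;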

    I have been tasked with tuning this script, which we received from our development team, and was wondering whether there are better ways of tuning this piece of code.

    Any suggestions will be helpful.

  • That might get you a percent or two improvement at most.

    I'd look at the code for that stored proc. See if it can be converted into a set-based update of the temp table. Depends what the proc does.
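
    If the proc's logic can be expressed as a query, the general shape would be something like this (purely illustrative - the table and column names inside the APPLY are made up, since we haven't seen the proc):

    -- Illustrative sketch only: replace the APPLY with whatever the proc actually
    -- computes for each (Col1, Col2) pair.
    UPDATE t
    SET    t.Col3 = x.LookupValue
    FROM   #Temp2 AS t
    CROSS APPLY
    (
        SELECT TOP (1) r.SomeValue AS LookupValue   -- hypothetical lookup result
        FROM   dbo.SomeLookupTable AS r             -- hypothetical table
        WHERE  r.KeyA = t.Col1
          AND  r.KeyB = t.Col2
    ) AS x;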

    Gail Shaw
    Microsoft Certified Master: SQL Server, MVP, M.Sc (Comp Sci)
    SQL In The Wild: Discussions on DB performance with occasional diversions into recoverability

    We walk in the dark places no others will enter
    We stand on the bridge and no one may pass
  • I disagree. Using Temp Tables to "Divide'n'Conquer" is frequently much more efficient than using CTE's or derived tables to do a complicated task.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.
    "Change is inevitable... change for the better is not".

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
    Intro to Tally Tables and Functions

  • @sqlsatyam,

    I agree with Gail (Gila Monster)... we need to see the stored procedure that you're calling in the cursor. One of the very few things that Celko is likely correct about in his rant is the manner in which that proc was created. We can probably help fix this problem but we need to see that stored proc.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.
    "Change is inevitable... change for the better is not".

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
    Intro to Tally Tables and Functions

  • As Gail stated, it may depend on what your sproc is doing.

    If you can get a grip on that, you may be able to tackle your cursor problem.

    If you can, try to avoid the cursor in the first place.

    Does your temp table have the indexes needed to support your GROUP BY and UPDATE statements?
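
    For example, something along these lines (column names taken from your snippet, so just a sketch):

    -- Hypothetical: supports the GROUP BY Col1, Col2 and the keyed UPDATE of #Temp2
    CREATE CLUSTERED INDEX IX_Temp2_Col1_Col2 ON #Temp2 (Col1, Col2);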

    By the way, you didn't do everything wrong, despite what some may suggest. What went wrong is a lack of education. Yes, education, so people learn that performance isn't only a runtime problem in their production environment; it can be avoided by keeping a few simple rules in mind at design time and development time.

    Now this whole problem may end up forcing a full rewrite of a T-SQL routine that someone could have done correctly the first time.

    I wouldn't trust anyone who brags about having needed to write only five cursors in an entire career, because that only means they aren't telling you how many times they started out with a cursor and then rewrote the code to avoid it. And that's the most important message of the whole story!

    There is no shame in writing a cursor. Not trying to avoid the cursor is the worst part.

    After all, SQL Server (just like any RDBMS) works best when handling sets!

    Try to avoid row-by-row operations, because they always come with a price.

    Johan

    Learn to play, play to learn !

    Dont drive faster than your guardian angel can fly ...
    but keeping both feet on the ground wont get you anywhere :w00t:

    - How to post Performance Problems
    - How to post data/code to get the best help

    - How to prevent a sore throat after hours of presenting ppt

    press F1 for solution, press shift+F1 for urgent solution 😀

    Need a bit of Powershell? How about this

    Who am I ? Sometimes this is me but most of the time this is me

  • GilaMonster (9/10/2011)


    That might get you a percent or two improvement at most.

    If "that" means the OP's suggestion of using UPDATE ... WHERE CURRENT OF, it won't give even a couple of per cent of improvement, because it won't work. To make it work he would need to change the cursor type (FAST_FORWARD doesn't permit updates), maybe to SCROLL_LOCKS, or maybe leave it unspecified and add a FOR UPDATE clause; but then he would be trying to update through a derived table generated by a GROUP BY operation which doesn't contain the column being updated, so it's never going to work: the GROUP BY clause forces the cursor to be read-only even when either or both of SCROLL_LOCKS and FOR UPDATE (with or without OF) is specified. Using SELECT DISTINCT instead of GROUP BY doesn't help either, since that too forces read-only. Or were you thinking of ditching the GROUP BY and going almost totally RBAR, which might under some extremely rare conditions be faster than the current version (e.g. if Col1, Col2 combinations are almost always unique in the table)?

    I'd look at the code for that stored proc. See if it can be converted into a set-based update of the temp table. Depends what the proc does.

    That's the only sensible option, I think.

    Tom

  • CELKO (9/10/2011)


    Another good rule of thumb is that no procedure should be longer than one page.

    It's a good rule of thumb for any language, not just SQL; but of course it's a rule of thumb, not an absolute and universal necessity.

    T-SQL is a one-pass compiler. It was never meant for production work.

    I don't believe that T-SQL was never meant for production work. In fact I think that claim is blatant nonsense - and I don't imagine that you believe it either. I know you like to use a forceful and attacking, indeed offensive style when responding to questions, but blatant untruth is not your usual style. I'm surprised and shocked at what I can only see as an attempt to mislead.

    T-SQL code is used successfully in vast numbers of production systems round the world.

    Yes, one can effectively write Cobol programs in T-SQL (or in standard SQL, just as easily), but that is true of just about every language I've ever known ("just about" because there are several languages where it's hard to write anything as good as a bad Cobol program), and no competent developer writes Cobol in any dialect of SQL.

    Finally, this is probably the worst ways to do this. I think the developers are being cruel to you because you are the new kid on the team

    Yes, it could be that he's been sent to fetch a bucket of holes, or asked to write a driver for a GSD. But we need to see that SP before reaching any conclusion.

    Tom

  • sqlsatyam

    I'd have to agree with Gail, Jeff and Johan - the problem is in your stored procedure. I'd go even further and say that the procedure was written to accept input for dealing with one row of data at a time - it probably has a place somewhere else in your application. Because it was already written, and produced the correct results, someone decided to take the easy way out and to just call that procedure in a loop for every record that you are now needing to process as a set. This Row-By-Agonizing-Row (RBAR) approach is what is killing the performance of your script.

    There are many people here that enjoy taking cursor-laden code and converting it into high-performing set-based code. Personally, I have taken cursor-laden code that ran in > 2 hours down to a set-based method that runs in < 10 seconds. There are people who will help you out with this, but we will need to see the code for everything that the procedure is doing. Posting the DDL scripts (CREATE TABLE, CREATE INDEX, etc.) for the tables affected, along with INSERT statements that load some test data demonstrating your problem, will go a long way in getting people to look at your issue and help you out. Don't forget to include the expected results based on the sample data provided. As a bonus, you will get tested code back. For more details on how to get all of this into your post, please look at the first link in my signature.
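
    For example, just to show the shape (use your real column definitions, of course - these are guesses based on your snippet):

    -- Hypothetical DDL and sample data - adjust datatypes and columns to match reality
    CREATE TABLE #Temp2
    (
        Col1 varchar(128)  NOT NULL,
        Col2 varchar(20)   NOT NULL,
        Col3 varchar(4000) NULL
    );

    INSERT INTO #Temp2 (Col1, Col2)
    VALUES ('UserA', 'Code1'),
           ('UserA', 'Code2'),
           ('UserB', 'Code1');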

    You posted in a SQL 2008 forum - please confirm that this is the version of SQL Server that you are using. (A lot of people post in the first forum they come across - this one - and only later tell us they are on a different version. If that happens to be the case, don't worry about it; just let us know here, along with the post of your proc code and sample data.)

    There are several people on this site ready, willing and able to help you convert this, so please HELP US HELP YOU. Don't let those that just want to scream at you keep you away from learning better ways of doing things. And again, before you post, please read the first link in my signature block below - this shows you the kind of information that we need to help you out, and how to go about getting that information.

    Looking forward to helping you out further...

    Wayne
    Microsoft Certified Master: SQL Server 2008
    Author - SQL Server T-SQL Recipes


    If you can't explain to another person how the code that you're copying from the internet works, then DON'T USE IT on a production system! After all, you will be the one supporting it!
    Links:
    For better assistance in answering your questions
    Performance Problems
    Common date/time routines
    Understanding and Using APPLY Part 1 & Part 2

  • Jeff Moden (9/11/2011)


    Joe, I have to ask... where do you come up with such absolute rubbish? Using Temp Tables to "Divide'n'Conquer" is frequently much more efficient than using CTE's or derived tables to do a complicated task and you bloody well know it. And speaking of "1950's technology", YOU have absolutely no room to talk considering that YOU use a While Loop and a "Push Stack" to convert Adjacency Lists to Nested Sets instead of CTE's and derived tables. You can't get more '50's than that.

    Another good rule of thumb is that no procedure should be longer than one page. T-SQL is a one-pass compiler. It was never meant for production work.

    You're kidding, right? Please tell me you're kidding so that my last vestige of faith in you isn't totally shot to hell.

    Darn, I need to rewrite 150+ reports of mine. Too bad they actually do what they need, and fast too :-D.

  • CELKO (9/10/2011)


    Another good rule of thumb is that no procedure should be longer than one page.

    One page? Define "page". Are you still printing everything out on greenbar paper? Where the hell are you still finding greenbar paper? Hilarious how you complain about doing things the old way and then use archaic terms.

    --------------------------------------
    When you encounter a problem, if the solution isn't readily evident go back to the start and check your assumptions.
    --------------------------------------
    It’s unpleasantly like being drunk.
    What’s so unpleasant about being drunk?
    You ask a glass of water. -- Douglas Adams

  • Thanks to everyone who replied to my post. It feels good to be in a community where experts are ready to help out.

    Regarding the issue I posted yesterday: I modified my cursor to use the WHERE CURRENT OF option in the update statement (I changed it from a FAST_FORWARD cursor to a normal cursor and took out the GROUP BY clause). This change reduced the running time of the cursor from about 4 hours 30 minutes to around 4 hours 10 minutes.

    Following is the scrubbed code for the SP. It selects from a table (about 100,000 rows) into a temp table, concatenates the contents of that temp table (again in a cursor 🙁), and returns the result as the output. And yes, we are using SQL Server 2008.

    We can't modify the SP in any way, so I am hoping we can modify the main cursor to use some kind of CTE (or WHILE loop?) to avoid this SP call.

    CREATE PROCEDURE [dbo].[user_sp] (

    @InputParameter1 varchar(20) = NULL,

    @InputParameter2 varchar(128) = NULL,

    @OutputParameter1 varchar(4000) OUTPUT,

    @Return_Code int OUTPUT)

    AS

    BEGIN

    SET NOCOUNT ON;

    DECLARE

    @sql nvarchar(MAX),

    @paramlist nvarchar(4000),

    @spos int,

    @epos int,

    @NumberOfInputFields smallint,

    @Field varchar(1),

    @Lookup_Effective_Date datetime ,

    @Var6 varchar(4000),

    @Var7 smallint,

    @Var8 varchar(20);

    set @Lookup_Effective_Date = '12/31/2009';

    set @Var6 = ' ';

    CREATE TABLE #Temp_Ref(

    Col1 varchar(20) NOT NULL,

    Col2 smallint NOT NULL);

    SELECT @sql =

    'INSERT INTO #Temp_Ref (

    Col1,

    Col2)

    SELECT

    Col1,

    Col2

    FROM

    dbo.Reference_Table

    WHERE

    Col1 = @xInputParameter1 AND

    Col2 = @xInputParameter2 AND

    Start_Date <= @xLookup_Effective_Date AND

    End_Date >= @xLookup_Effective_Date'

    -- skip leading 'Ç' delimiters if any

    SET @spos = 1;

    SET @epos = 1;

    SET @NumberOfInputFields = 1;

    SET @Field = '';

    WHILE @epos > 0

    BEGIN

    IF @NumberOfInputFields < 9

    SET @Field = '0';

    BEGIN

    SET @sql = @sql + ' AND ' +

    '(Field' + @Field + CONVERT(VARCHAR(2), @NumberOfInputFields) +

    '_Text' + '=''%'' OR HASHBYTES(''MD5'', Field' + @Field + CONVERT(VARCHAR(2), @NumberOfInputFields) + '_Text)' + '=';

    SET @epos = CHARINDEX('Ç', @Var6, @spos);

    IF @epos = @spos

    SET @sql = @sql + '''HASHBYTES(''MD5'', '')';

    IF @epos > @spos

    SET @sql = @sql + 'HASHBYTES(''MD5'', ''' + SUBSTRING(@Var6, @spos, @epos - @spos) + '''))';

    END

    IF @epos != 0

    SET @spos = @epos + 1;

    SET @NumberOfInputFields = @NumberOfInputFields + 1;

    END;

    BEGIN

    SET @sql = @sql + 'HASHBYTES(''MD5'', ''' + SUBSTRING(@Var6, @spos, LEN(@Var6) - @spos + 1) + '''))';

    END

    SELECT @paramlist = '@xInputParameter1 varchar(20),

    @xInputParameter2 varchar(128),

    @xLookup_Effective_Date datetime'

    EXEC sp_executesql @sql, @paramlist, @InputParameter1, @InputParameter2, @Lookup_Effective_Date

    SELECT @Var7 = MIN(Col2) FROM #Temp_Ref;

    -- @Var7 can be NULL

    IF @Var7 IS NOT NULL

    BEGIN

    DECLARE Conformed_Cursor CURSOR FAST_FORWARD

    FOR SELECT Col1 FROM #Temp_Ref

    WHERE Col2 = @Var7

    ORDER BY Col1;

    OPEN Conformed_Cursor;

    FETCH NEXT FROM Conformed_Cursor INTO @OutputParameter1;

    IF @@FETCH_STATUS = 0

    BEGIN

    FETCH NEXT FROM Conformed_Cursor INTO @Var8;

    WHILE @@FETCH_STATUS = 0

    BEGIN

    SET @OutputParameter1 = @OutputParameter1 + 'Ç' + @Var8;

    FETCH NEXT FROM Conformed_Cursor INTO @Var8;

    END;

    END;

    CLOSE Conformed_Cursor;

    DEALLOCATE Conformed_Cursor;

    END;

    IF @OutputParameter1 IS NULL

    BEGIN

    SET @Return_Code = -1;

    END

    ELSE

    BEGIN

    SET @Return_Code = 0;

    END;

    END;

    GO
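
    (If we were ever allowed to inline the proc's logic, the Conformed_Cursor concatenation it does at the end looks like the classic delimited-string pattern that SQL Server 2008 can do set-based with FOR XML PATH. A rough, untested sketch of just that step:)

    -- Hypothetical replacement for the Conformed_Cursor block above:
    -- builds the 'Ç'-delimited string of Col1 values for the minimum Col2, ordered by Col1.
    SELECT @OutputParameter1 =
           STUFF((SELECT 'Ç' + Col1
                  FROM   #Temp_Ref
                  WHERE  Col2 = @Var7
                  ORDER  BY Col1
                  FOR XML PATH(''), TYPE).value('.', 'varchar(4000)'),
                 1, 1, '');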

  • CELKO (9/11/2011)


    Darn, I need to rewrite 150+ reports of mine. Too bad they actually do what they need, and fast too :-D.

    You should not be writing reports in SQL at all. It is a data retriever tool, not a report writer. You grab the query results (which should be in a normal form, without any formatting and minimal calculations) and throw it over the wall (tier) to an application, report writer or a report server. The sure sign of someone who "just do not get it" is code loaded with CONVERT(), CAST(), STUFF(), PAD() and string functions.

    Lately, I keep seeing more and more shops using a Data Warehouse for reporting. Then you are doing ETL and not traditional queries at all.

    I'm glad to see I'm ok in your book. :exclamation:

  • CELKO (9/11/2011)


    Darn, I need to rewrite 150+ reports of mine. Too bad they actually do what they need, and fast too :-D.

    You should not be writing reports in SQL at all. It is a data retriever tool, not a report writer. You grab the query results (which should be in a normal form, without any formatting and minimal calculations) and throw it over the wall (tier) to an application, report writer or a report server. The sure sign of someone who "just do not get it" is code loaded with CONVERT(), CAST(), STUFF(), PAD() and string functions.

    Lately, I keep seeing more and more shops using a Data Warehouse for reporting. Then you are doing ETL and not traditional queries at all.

    Heh... then you'd just love the HTML-formatted email messages I'm creating for current system conditions, in which I format "cells" with different colors to easily spot "caution" and "red" conditions. 😉

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.
    "Change is inevitable... change for the better is not".

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
    Intro to Tally Tables and Functions

  • CELKO (9/11/2011)


    While temp tables in the Sybase/SQL Server model can be indexed, constrained, etc. nobody does. They really are like a deck of punch cards. Compare this to the ANSI/ISO model where a temp table is a "regular" table with local or global and "clean up" options added to it.

    So you are claiming that it's not possible for people to misuse/abuse tables in the ANSI/ISO model? Or is it just that nobody misuses/abuses them because nobody uses them at all? The former is of course untrue even for "regular" tables - there are plenty of developers out there who never write a domain constraint (or whatever the language they use calls it) or a UNIQUE constraint or a NOT NULL constraint in every SQL dialect, including the standard ones.

    Tom

  • CELKO (9/11/2011)


    While temp tables in the Sybase/SQL Server model can be indexed, constrained, etc. nobody does.

    Speak for yourself. I do and I know others who do.
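
    For instance, something as simple as this (names made up, obviously) - a primary key, NOT NULL and CHECK constraints, plus a supporting index, all on a temp table:

    -- Hypothetical example: temp tables accept constraints and indexes just like permanent tables
    CREATE TABLE #Orders
    (
        OrderID  int           NOT NULL PRIMARY KEY CLUSTERED,
        Customer varchar(50)   NOT NULL,
        Amount   decimal(10,2) NOT NULL CHECK (Amount >= 0)
    );

    CREATE NONCLUSTERED INDEX IX_Orders_Customer ON #Orders (Customer);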

    --------------------------------------
    When you encounter a problem, if the solution isn't readily evident go back to the start and check your assumptions.
    --------------------------------------
    It’s unpleasantly like being drunk.
    What’s so unpleasant about being drunk?
    You ask a glass of water. -- Douglas Adams
