Replacement for Cursors

  • Here's a pretty contrived reply! 🙂

    You still gotta know in advance how many names you've got; otherwise dynamic SQL is the only approach.

    DECLARE @Table TABLE (Name VARCHAR(30));

    DECLARE @name1 VARCHAR(30), @name2 VARCHAR(30), @name3 VARCHAR(30),
            @name4 VARCHAR(30), @name5 VARCHAR(30);

    --Sample Data
    INSERT INTO @Table
    SELECT 'Jack' UNION ALL SELECT 'Vinu' UNION ALL SELECT 'Jim'
    UNION ALL SELECT 'Stan' UNION ALL SELECT 'Ash';

    WITH Names AS (
        SELECT Name, ROW_NUMBER() OVER (ORDER BY Name) AS rk
        FROM @Table)
    SELECT @name1 = CASE rk WHEN 1 THEN Name ELSE @name1 END
          ,@name2 = CASE rk WHEN 2 THEN Name ELSE @name2 END
          ,@name3 = CASE rk WHEN 3 THEN Name ELSE @name3 END
          ,@name4 = CASE rk WHEN 4 THEN Name ELSE @name4 END
          ,@name5 = CASE rk WHEN 5 THEN Name ELSE @name5 END
    FROM Names;

    SELECT @name1, @name2, @name3, @name4, @name5;

    Or, if you don't care for the trouble I went to so that the names come out ordered, you can do this:

    DECLARE @Table TABLE (Name VARCHAR(30));

    DECLARE @name1 VARCHAR(30), @name2 VARCHAR(30), @name3 VARCHAR(30),
            @name4 VARCHAR(30), @name5 VARCHAR(30);
    DECLARE @i INT;

    --Sample Data
    INSERT INTO @Table
    SELECT 'Jack' UNION ALL SELECT 'Vinu' UNION ALL SELECT 'Jim'
    UNION ALL SELECT 'Stan' UNION ALL SELECT 'Ash';

    SELECT @i = 1;

    --Caveat: with no ORDER BY, the order rows are visited (and so which
    --name lands in which variable) is not guaranteed
    SELECT @name1 = CASE @i WHEN 1 THEN Name ELSE @name1 END
          ,@name2 = CASE @i WHEN 2 THEN Name ELSE @name2 END
          ,@name3 = CASE @i WHEN 3 THEN Name ELSE @name3 END
          ,@name4 = CASE @i WHEN 4 THEN Name ELSE @name4 END
          ,@name5 = CASE @i WHEN 5 THEN Name ELSE @name5 END
          ,@i = @i + 1
    FROM @Table;

    SELECT @name1, @name2, @name3, @name4, @name5;


    My mantra: No loops! No CURSORs! No RBAR! Hoo-uh!

    My thought question: Have you ever been told that your query runs too fast?

    My advice:
    INDEXing a poor-performing query is like putting sugar on cat food. Yeah, it probably tastes better but are you sure you want to eat it?
    The path of least resistance can be a slippery slope. Take care that fixing your fixes of fixes doesn't snowball and end up costing you more than fixing the root cause would have in the first place.

    Need to UNPIVOT? Why not CROSS APPLY VALUES instead?
    Since random numbers are too important to be left to chance, let's generate some!
    Learn to understand recursive CTEs by example.

  • Thanx Dwain.

    So, if I don't know the number of items then how would I do it using Dynamic SQL??

    Vinu Vijayan

    For better and faster solutions please check..."How to post data/code on a forum to get the best help" - Jeff Moden 😉

  • You called my bluff and I realize I have to fold.

    The issue I didn't think about when I said that is that you need to DECLARE each of the local variables you want to assign to. You could DECLARE 10 or 50 or 1,000 in advance as placeholders, and then use my query above limited by TOP (n) to ensure no error.

    But even in dynamic SQL, using sp_executesql, you can only return values to a list of local variables that you've pre-declared.
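
    To make that concrete, here's a minimal sketch (the variable names are just for illustration) of getting values back from dynamic SQL through sp_executesql OUTPUT parameters; it only works because the receiving variables are declared in advance:

    DECLARE @name1 VARCHAR(30), @name2 VARCHAR(30);

    --OUTPUT parameters are the only route back to the caller,
    --and the receiving variables must already exist out here
    EXECUTE sp_executesql
        N'SELECT @n1 = ''Ash'', @n2 = ''Vinu'';',
        N'@n1 VARCHAR(30) OUTPUT, @n2 VARCHAR(30) OUTPUT',
        @n1 = @name1 OUTPUT,
        @n2 = @name2 OUTPUT;

    SELECT @name1, @name2;   --Ash, Vinu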

    So I'm stumped. Maybe someone with more experience can make a suggestion.



  • Thanx again Dwain.

    Your examples were very helpful and have definitely taught me some new stuff. As far as the Dynamic SQL part goes....you can't have everything always...right??

    There are uses for cursors. Determining when has to happen on a case-by-case basis. There is no general rule as to when you use them and when you don't. Many times you can find set-based alternatives. Sometimes, it just takes longer to find them.

    This exercise in general terms just isn't worth playing.

    @Lynn: Suppose you see a requirement that at first sight doesn't look like it can be solved without a cursor. How much time does someone as experienced as you spend with your head down looking for an alternative?....and how long does it take before you decide you'd rather go with a cursor?

    Vinu Vijayan


  • Vinu,

    I'm going to jump in on your question to Lynn because I think you've got it all wrong.

    I never approach a problem looking for an alternative to a CURSOR. I always assume the only way to solve the problem is without a CURSOR. I think that reasoning is 99.99% correct. Aside from a couple of cases early in my SQL learning days when I wasn't making this assumption, I have not yet hit that 0.01% case.

    I am a firm believer in the idea that attitude is everything. If you go in with wrong-headed thinking, it will take you longer to get on the right track than if you start on the correct path.

    For the record, the majority of my posts solve the problem in under 30 minutes (usually in under 10). The exception is usually the case of a back-and-forth exchange where I'm in a competition on performance. I'm not trying to brag here, just pointing out that if you start in the right direction, it takes less time to find the correct answer.



  • vinu512 (4/26/2012)


    As far as the Dynamic SQL part goes....you can't have everything always...right??

    No. The point I was trying to illustrate with the dynamic SQL I showed you is that you're only limited by what you can design.

    vinu512 (4/26/2012)


    So, if I don't know the number of items then how would I do it using Dynamic SQL??

    What about this:

    SET NOCOUNT ON;

    --DDL
    CREATE TABLE Ex (NAME VARCHAR(30));

    --Sample Data
    INSERT INTO Ex
    SELECT 'Jack'
    UNION ALL SELECT 'Vinu'
    UNION ALL SELECT 'Jim'
    UNION ALL SELECT 'Stan'
    UNION ALL SELECT 'Ash';

    --No Cursors
    DECLARE @SQL NVARCHAR(MAX);

    WITH CTE (variables, NAME) AS (
        SELECT '@name' + CAST(ROW_NUMBER() OVER (ORDER BY NAME) AS VARCHAR(3)), NAME
        FROM Ex),
    CTE2 (declareStmt) AS (
        SELECT STUFF((SELECT CHAR(10) + 'DECLARE @name'
                             + CAST(ROW_NUMBER() OVER (ORDER BY NAME) AS VARCHAR(3))
                             + ' VARCHAR(30) = ' + CHAR(39) + NAME + CHAR(39) + ';'
                      FROM CTE
                      FOR XML PATH('')), 1, 1, '')),
    CTE3 (selectStmt) AS (
        SELECT 'SELECT ' + STUFF((SELECT ',' + variables
                                  FROM CTE
                                  FOR XML PATH('')), 1, 1, '') + ';')
    SELECT @SQL = STUFF((SELECT CHAR(10) + stmts
                         FROM (SELECT declareStmt
                               FROM CTE2
                               UNION ALL
                               SELECT selectStmt
                               FROM CTE3) a (stmts)
                         FOR XML PATH('')), 1, 1, '');

    --Show what the dynamic SQL created
    PRINT @SQL;

    --Execute the code
    EXECUTE sp_executesql @SQL;

    The @SQL variable ends up holding the following:

    DECLARE @name1 VARCHAR(30) = 'Ash';
    DECLARE @name2 VARCHAR(30) = 'Jack';
    DECLARE @name3 VARCHAR(30) = 'Jim';
    DECLARE @name4 VARCHAR(30) = 'Stan';
    DECLARE @name5 VARCHAR(30) = 'Vinu';
    SELECT @name1,@name2,@name3,@name4,@name5;

    So we dynamically figure out the variable names in the first CTE, then create the "declare" statements in CTE2 and finally the "select" statement in CTE3. These are then UNION ALL'd together into the outer @SQL variable, allowing us to execute the whole thing.

    vinu512 (4/26/2012)


    Suppose you see a requirement that at first sight doesn't look like it can be solved without a cursor. How much time does someone as experienced as you spend with your head down looking for an alternative?....and how long does it take before you decide you'd rather go with a cursor?

    The way I work lets me get this decided long before I start coding.

    Step 1. I start with pen and paper and jot down a few algorithms. Normally, I end up with between 3 and 10 algorithms written down.

    Step 2. I'll then code up the 3 that I think are most likely to perform the best, as well as 1 RBAR algorithm.

    Step 3. I set up a test environment of 100 rows based on the table structures and data set, and run all of the algorithms against it, comparing the result sets by inserting each one into a temp table (one way to do that comparison is sketched after these steps). I repeat this test 10 times, checking that each run produces the same results.

    Step 4. Once all of my algorithms have passed the first stage of testing, I recreate my test environment based on the table structures and data set, but this time with 1,000,000 rows, and test again. I repeat this test 5 times, checking that each run produces the same results.

    Step 5. Once I have the two fastest, I take a subset of the actual data and test again; this is not repeated.

    Step 6. Assuming the results are the same, I do a final test on the actual tables that will be used. I repeat this test 5 times.

    Step 7. Finally, I do a logic check of the fastest algorithm, then submit the code into the correct branch of our software.
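
    For what it's worth, a minimal sketch of the result-set comparison mentioned in Step 3, assuming the outputs of two candidate algorithms have been inserted into temp tables #ResultA and #ResultB (hypothetical names):

    --Two result sets match when each side's EXCEPT of the other is empty
    --and the row counts agree (EXCEPT de-duplicates, so counts matter too)
    IF NOT EXISTS (SELECT * FROM #ResultA EXCEPT SELECT * FROM #ResultB)
       AND NOT EXISTS (SELECT * FROM #ResultB EXCEPT SELECT * FROM #ResultA)
       AND (SELECT COUNT(*) FROM #ResultA) = (SELECT COUNT(*) FROM #ResultB)
        PRINT 'Result sets match';
    ELSE
        PRINT 'Result sets differ';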


    Forever trying to learn
    My blog - http://www.cadavre.co.uk/
    For better, quicker answers on T-SQL questions, click on the following...http://www.sqlservercentral.com/articles/Best+Practices/61537/
    For better, quicker answers on SQL Server performance related questions, click on the following...http://www.sqlservercentral.com/articles/SQLServerCentral/66909/

    The way I work lets me get this decided long before I start coding.

    Step 1. I start with pen and paper and jot down a few algorithms. Normally, I end up with between 3 and 10 algorithms written down.

    Step 2. I'll then code up the 3 that I think are most likely to perform the best, as well as 1 RBAR algorithm.

    Step 3. I set up a test environment of 100 rows based on the table structures and data set, and run all of the algorithms against it, comparing the result sets by inserting each one into a temp table. I repeat this test 10 times, checking that each run produces the same results.

    Step 4. Once all of my algorithms have passed the first stage of testing, I recreate my test environment based on the table structures and data set, but this time with 1,000,000 rows, and test again. I repeat this test 5 times, checking that each run produces the same results.

    Step 5. Once I have the two fastest, I take a subset of the actual data and test again; this is not repeated.

    Step 6. Assuming the results are the same, I do a final test on the actual tables that will be used. I repeat this test 5 times.

    Step 7. Finally, I do a logic check of the fastest algorithm, then submit the code into the correct branch of our software.

    This is a very good approach.

    Thanks for sharing, Cadavre.

    Vinu Vijayan


  • Cadavre,

    I didn't realize that if you DECLAREd the local variables within the scope of the dynamic SQL, they would still be in scope upon return to the calling session.

    That is what you are saying, right?



  • dwain.c (4/26/2012)


    Cadavre,

    I didn't realize that if you DECLAREd the local variables within the scope of the dynamic SQL, they would still be in scope upon return to the calling session.

    That is what you are saying, right?

    Nope, I need to learn to explain myself better I think 😀

    They're only in scope within the dynamic SQL, so once the execution of the @SQL batch is finished, they're gone. But that doesn't stop us from doing everything we need to do inside the dynamic SQL.
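
    A minimal sketch of that scoping rule (the names here are just for illustration):

    DECLARE @SQL NVARCHAR(MAX) =
        N'DECLARE @inner VARCHAR(30) = ''Ash''; SELECT @inner AS InnerValue;';

    EXECUTE sp_executesql @SQL;   --fine: @inner is declared and used inside the batch

    --SELECT @inner;              --would fail: @inner doesn't exist in this scope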



    There are always ways to avoid a CURSOR.....but you might end up with a costlier plan.

    If you have decided to use a tally table.....try it out....

    Unless someone has used a feature or command, he cannot comment on it.....and commenting on the prospects of a tally table without knowing it would be ignorance....:-D
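
    For anyone who hasn't met one, here's a minimal sketch of an ad-hoc tally (numbers) table built with cross-joined CTEs; the names are just for illustration:

    --E1 yields 10 rows; three cross joins give 10 x 10 x 10 = 1,000
    WITH E1(n) AS (SELECT 1 FROM (VALUES (1),(1),(1),(1),(1),
                                         (1),(1),(1),(1),(1)) v(x)),
         Tally(n) AS (SELECT TOP (1000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
                      FROM E1 a CROSS JOIN E1 b CROSS JOIN E1 c)
    --Example use: the first 1,000 integers, no WHILE loop required
    SELECT n
    FROM Tally;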

    Chandrachurh Ghosh
    DBA – MS SQL Server
    Ericsson India Global Services Limited
    Quality is not an act, it is a habit.

    There are always ways to avoid a CURSOR.....

    Cannot agree with the above.

    There are cases when you cannot avoid using a CURSOR.

    _____________________________________________
    "The only true wisdom is in knowing you know nothing"
    "O skol'ko nam otkrytiy chudnyh prevnosit microsofta duh!":-D
    (So many miracle inventions provided by MS to us...)

    How to post your question to get the best and quick help

  • Thanks Eugene.

    As simple as it may be....that was what I was looking for. 😎

    Vinu Vijayan


  • Thanks Eugene.

    As simple as it may be....that was what I was looking for. 😎

    Thanx for all the replies. I've just climbed up a few steps on the learning curve, and it's all because of all your help.

    CHEERS!!!:hehe:

    Vinu Vijayan


  • vinu512 (4/26/2012)


    Thanks Eugene.

    As simple as it may be....that was what I was looking for. 😎

    The key to using CURSORs (or loops, which are not much different from cursors), as with anything else, is:

    You should know what you are doing and understand how it works!


  • I'll go along with everyone else here and just reiterate that there are places where a loop of one sort or another is necessary, and in those cases a cursor is often the best bet. Outside of those cases, don't use a loop at all, no matter how disguised it may be.

    For example, I have a dataset of 5.8 million rows where I need to check each row to see whether its name, email, phone, and address data can be matched to any prior row. I could do the whole thing as a triangular join, but even a 1000-row subset of the data overloads the server if I do it that way. Partially because the join math is insanely complex. (It has to identify that "Joe" and "Joseph" are the same name, for example. And that "123 N 1st Ave" = "123 N. 1st Ave." = "123 North First Avenue". And that "bob.smith@gmail.com" = "bobsmith@gmail.com", but "bob.smith@hotmail.com" != "bobsmith@hotmail.com", because of special rules for GMail names.) Partially because of complex rules on how to handle sequences of data.

    So, I have a cursor spending a month going through the data one row at a time, and I even have to sub-manage that so that the cursor dataset is only a few thousand rows at a time and resets itself in a "meta-loop" (sketched below). It processes about 2 to 3 records per second, and getting it up to that speed took a lot of tuning.
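
    A rough sketch of that kind of "meta-loop", assuming a table dbo.Contacts with an integer key and a dbo.MatchOneRow procedure that does the per-row matching (both names hypothetical):

    DECLARE @BatchStart INT = 0,
            @BatchSize  INT = 5000,
            @MaxID      INT,
            @ID         INT;

    SELECT @MaxID = MAX(ContactID) FROM dbo.Contacts;

    --Outer "meta-loop": work one key range at a time so the cursor's
    --result set stays small and is released between batches
    WHILE @BatchStart <= @MaxID
    BEGIN
        DECLARE BatchCursor CURSOR LOCAL FAST_FORWARD FOR
            SELECT ContactID
            FROM dbo.Contacts
            WHERE ContactID >  @BatchStart
              AND ContactID <= @BatchStart + @BatchSize
            ORDER BY ContactID;

        OPEN BatchCursor;
        FETCH NEXT FROM BatchCursor INTO @ID;

        WHILE @@FETCH_STATUS = 0
        BEGIN
            EXECUTE dbo.MatchOneRow @ContactID = @ID;  --the row-by-row matching work
            FETCH NEXT FROM BatchCursor INTO @ID;
        END;

        CLOSE BatchCursor;
        DEALLOCATE BatchCursor;

        SET @BatchStart += @BatchSize;
    END;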

    That's a case where looping and cursors are necessary. Not because of what's being done with the data (it could be done through a single query), but because modern server hardware can't finish the job in finite time except through breaking the job up this way.

    So, there are places where you should use them, and places where you must use them because of current limits on the hardware or the SQL Server engine, and then there's the vast majority of work, where you should avoid them and their disguised versions like the plague.

    - Gus "GSquared", RSVP, OODA, MAP, NMVP, FAQ, SAT, SQL, DNA, RNA, UOI, IOU, AM, PM, AD, BC, BCE, USA, UN, CF, ROFL, LOL, ETC
    Property of The Thread

    "Nobody knows the age of the human race, but everyone agrees it's old enough to know better." - Anon
