Help Needed in Avoiding a Loop

  • Got it. Yes, it's happening; we have a backup every hour.

    Is it possible to create dummy test data to check this scenario?

    SELECT d.*
    INTO #Users
    FROM (
        SELECT 1,  2,  10 UNION ALL
        SELECT 2,  4,  12 UNION ALL
        SELECT 3,  3,   8 UNION ALL
        SELECT 4, 26,  30 UNION ALL
        SELECT 5, 24,  20 UNION ALL
        SELECT 6, 70, 100
    ) d (UserID, PendingAmount, AvailableAmount)
    ;

    SELECT d.*
    INTO #Users_Transaction
    FROM (
        SELECT 1,  3, 0 UNION ALL
        SELECT 1,  2, 0 UNION ALL
        SELECT 2,  5, 0 UNION ALL
        SELECT 2, 10, 0 UNION ALL
        SELECT 3, 13, 0 UNION ALL
        SELECT 1, 13, 1
    ) d (UserID, Refund, Status)
    ;

    How can I create 60,000+ dummy records in the above table format for testing? Any help, please.

  • born2achieve (4/7/2014)


    How can I create 60,000+ dummy records in the above table format for testing? Any help, please.

    With the idea of teaching a man to fish, please see the following.

    http://qa.sqlservercentral.com/articles/Data+Generation/87901/

    http://qa.sqlservercentral.com/articles/Test+Data/88964/

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.
    "Change is inevitable... change for the better is not".

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
    Intro to Tally Tables and Functions

  • Hi Jeff,

    Finally, I was able to create 500,000 mock records as test data and ran my test. It took 20 seconds; I hope that's a quick result. Any comments or suggestions, please?

  • born2achieve (4/7/2014)


    Hi Jeff,

    Finally, I was able to create 500,000 mock records as test data and ran my test. It took 20 seconds; I hope that's a quick result. Any comments or suggestions, please?

    It's not really possible for me to judge from here nor should I judge in this case. I don't know your data, the datatypes of the data, your server, your disks, your indexes or how many, FKs, Unique Keys, how you tested it, what the test data is, etc, etc, etc. Look at the execution plan, IO stats, etc. "It Depends". 20 seconds on a half million rows could be anything from "incredible" to "yawn, let's watch the moss grow".

    --Jeff Moden



  • Hi Jeff,

    Here's how I generated the records:

    ;WITH x AS
    (
        SELECT TOP (2300) [object_id] FROM sys.all_objects
    )
    SELECT d.*
    INTO Test_Balance
    FROM (
        SELECT dd.n, dd.n * 10, dd.n * 2
        FROM (
            SELECT TOP (500000) n = ROW_NUMBER() OVER (ORDER BY x.[object_id])
            FROM x CROSS JOIN x AS y
        ) dd
    ) d (IDUser, PendingAmount, AvailableAmount)
    ;
    ;WITH x AS
    (
        SELECT TOP (2300) [object_id] FROM sys.all_objects
    )
    SELECT d.*
    INTO Test_Transaction
    FROM (
        SELECT dd.n, dd.n, 0
        FROM (
            SELECT TOP (500000) n = ROW_NUMBER() OVER (ORDER BY x.[object_id])
            FROM x CROSS JOIN x AS y
        ) dd
    ) d (IDUser, Refund, Status)
    ;

    Test proc:

    BEGIN
        BEGIN TRY
            -- Set it so if the first UPDATE fails, we won't even start the second UPDATE.
            -- This really says "If we're in a transaction and something fails, stop
            -- processing the transaction and do a rollback if we can".
            SET XACT_ABORT ON;

            -- Create a table to remember the rows we updated.
            IF OBJECT_ID('tempdb..#ValuesUpdated') IS NOT NULL
            BEGIN
                DROP TABLE #ValuesUpdated;
            END

            CREATE TABLE #ValuesUpdated (IDUser BIGINT);

            DECLARE @RecordCount INT;

            BEGIN TRANSACTION;

            SELECT @RecordCount = COUNT(1)
            FROM Test_Transaction;

            IF (@RecordCount > 0)
            BEGIN
                WITH AggregateData AS
                (
                    --=== Preaggregate the Refund by IDUser so that we
                    --    don't have to mess with it in the UPDATE.
                    SELECT IDUser
                          ,TotalRefund = SUM(COALESCE(Refund, 0))
                    FROM Test_Transaction
                    WHERE [Status] = 0
                    GROUP BY IDUser
                ) -- Do the update and capture the IDUser values that we updated.
                UPDATE MB
                SET MB.PendingAmount = COALESCE(MB.PendingAmount, 0) + AD.TotalRefund
                OUTPUT INSERTED.IDUser
                INTO #ValuesUpdated (IDUser)
                FROM Test_Balance MB
                INNER JOIN AggregateData AD ON MB.IDUser = AD.IDUser;

                --===== Using the captured IDUser values, update the status to
                --      identify that the rows have been used.
                UPDATE MRB
                SET MRB.Status = 1
                FROM Test_Transaction MRB
                WHERE MRB.IdMember IN (
                        SELECT IDUser
                        FROM #ValuesUpdated
                        )
                  AND [Status] = 0;
            END

            COMMIT TRANSACTION;
        END TRY
        BEGIN CATCH
            DECLARE @ErrorMessage NVARCHAR(4000);
            DECLARE @ErrorSeverity INT;
            DECLARE @ErrorState INT;

            DROP TABLE #ValuesUpdated;

            SELECT @ErrorMessage = ERROR_MESSAGE()
                  ,@ErrorSeverity = ERROR_SEVERITY()
                  ,@ErrorState = ERROR_STATE();

            ROLLBACK TRANSACTION;

            RAISERROR (@ErrorMessage, @ErrorSeverity, @ErrorState);

            --IF XACT_STATE() = 1
            --    COMMIT TRANSACTION
            --IF XACT_STATE() = -1
            --    ROLLBACK TRANSACTION
        END CATCH;
    END

    IDUser is the primary key of my actual "User" table, and it is a foreign key in Test_Balance and Test_Transaction.

    To test the logic, I just created mock data without any relationships. Will that affect performance against my actual tables? Any suggestions, please?

  • Just a couple of tips here...

    First, although it's fast, you made the generation of the test data a wee bit complicated and a bit more difficult to read than it needs to be (the formatting thing again). Here's how I would have generated the test data (and I corrected the code that gave me an "Invalid column name 'IdMember'" error).

    --===== Create and populate the balance table on-the-fly
    WITH cteRowSource AS
    (
        SELECT TOP 500000
               N = ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
        FROM master.sys.all_columns ac1
        CROSS JOIN master.sys.all_columns ac2
    )
    SELECT IDUser          = ISNULL(N, 0)
          ,PendingAmount   = N * 10
          ,AvailableAmount = N * 2
    INTO dbo.Test_Balance
    FROM cteRowSource
    ;
    --===== Apply the expected index to get rid of the
    --      expensive HASH JOINs in favor of MERGE JOINs.
    ALTER TABLE dbo.Test_Balance
        ADD PRIMARY KEY CLUSTERED (IDUser)
    ;
    --===== Create and populate the transaction table on-the-fly
    WITH cteRowSource AS
    (
        SELECT TOP 500000
               N = ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
        FROM master.sys.all_columns ac1
        CROSS JOIN master.sys.all_columns ac2
    )
    SELECT IDUser   = ISNULL(N, 0)
          ,Refund   = N
          ,[Status] = 0
    INTO dbo.Test_Transaction
    FROM cteRowSource
    ;
    --===== Apply the expected index to get rid of the
    --      expensive HASH JOINs in favor of MERGE JOINs.
    ALTER TABLE dbo.Test_Transaction
        ADD PRIMARY KEY CLUSTERED (IDUser)
    ;

    If you go through that, you'll also see that I've added the expected PKs, which have an underlying clustered index, in this case. Since the column(s) of a PK require NOT NULL attributes, how did I do that by using SELECT/INTO? Go look at my SELECT/INTO code and see what you notice.

    The other thing is that the indexes are critical to fast operation here. Without them, the code does some nasty ol' HASH JOINs and sorts galore. With the indexes, the code uses some very high-performance MERGE JOINs that also avoid sorts. Since the code also ends up putting 500,000 rows in the Temp Table, it's much faster to add a clustered index (again, in the form of a PK, which also provides a double check that the code ran correctly) to the Temp Table than to use the Temp Table without one. I also changed your COUNT(1) check to something a little quicker; an index would help there, but it's not worth it. Remember that all indexes except the clustered index are a duplication of data.
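    As a side note, the cross-join row-generation trick above can be sketched outside of SQL Server as well. The snippet below is a rough Python/SQLite stand-in (the seed table and row counts are made up for illustration): a small seed table cross-joined with itself yields seed² candidate rows, and ROW_NUMBER() turns them into sequential IDs, just like the sys.all_columns trick.

```python
import sqlite3

# Hypothetical stand-in for the T-SQL cross-join trick: a 100-row seed
# table cross-joined with itself yields 10,000 candidate rows, from which
# we number and keep the first 5,000 as test data.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE seed(i INTEGER)")
cur.executemany("INSERT INTO seed VALUES (?)", [(i,) for i in range(100)])

cur.execute("""
    CREATE TABLE Test_Balance AS
    SELECT n        AS IDUser,
           n * 10   AS PendingAmount,
           n * 2    AS AvailableAmount
    FROM (SELECT ROW_NUMBER() OVER () AS n
          FROM seed a CROSS JOIN seed b)
    WHERE n <= 5000
""")
# SQLite can't declare a PK in CREATE TABLE ... AS SELECT, so the
# "expected index" is added separately, mirroring the ALTER TABLE above.
cur.execute("CREATE UNIQUE INDEX ix_balance ON Test_Balance(IDUser)")

row_count, = cur.execute("SELECT COUNT(*) FROM Test_Balance").fetchone()
lo, hi = cur.execute("SELECT MIN(IDUser), MAX(IDUser) FROM Test_Balance").fetchone()
print(row_count, lo, hi)  # 5000 1 5000
```

    The same pattern scales to any row count by growing the seed table or adding another cross join.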

    With all that in mind, here's the modified code. Heh... and, no... I didn't take the time to reformat your code.

    SET XACT_ABORT ON;

    -- Create a table to remember the rows we updated.
    IF OBJECT_ID('tempdb..#ValuesUpdated') IS NOT NULL
    BEGIN
        DROP TABLE #ValuesUpdated;
    END

    CREATE TABLE #ValuesUpdated (IDUser BIGINT NOT NULL PRIMARY KEY CLUSTERED);

    BEGIN TRANSACTION;

    IF EXISTS (SELECT * FROM Test_Transaction WHERE [Status] = 0)
    BEGIN
        WITH AggregateData AS
        (
            --=== Preaggregate the Refund by IDUser so that we
            --    don't have to mess with it in the UPDATE.
            SELECT IDUser
                  ,TotalRefund = SUM(COALESCE(Refund, 0))
            FROM Test_Transaction
            WHERE [Status] = 0
            GROUP BY IDUser
        ) -- Do the update and capture the IDUser values that we updated.
        UPDATE MB
        SET MB.PendingAmount = COALESCE(MB.PendingAmount, 0) + AD.TotalRefund
        OUTPUT INSERTED.IDUser
        INTO #ValuesUpdated (IDUser)
        FROM Test_Balance MB
        INNER JOIN AggregateData AD ON MB.IDUser = AD.IDUser;

        --===== Using the captured IDUser values, update the status to
        --      identify that the rows have been used.
        UPDATE MRB
        SET MRB.Status = 1
        FROM Test_Transaction MRB
        WHERE MRB.IDUser IN (
                SELECT IDUser
                FROM #ValuesUpdated
                )
          AND [Status] = 0;
    END

    COMMIT TRANSACTION;

    Now, if you really want to learn something (no better way to learn than by "doing");-), try the things that you asked about before you ask about them so that you don't have to ask about them. 🙂
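    For readers without a SQL Server instance handy, the two-step pattern above (pre-aggregate the pending refunds, fold them into the balances in one set-based UPDATE, then flag the source rows as used) can be sketched in Python with SQLite. This is a simplified stand-in with made-up sample rows, not the production code: SQLite has no OUTPUT clause, so the captured-keys temp table is approximated with an IN subquery.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE Test_Balance(IDUser INTEGER PRIMARY KEY, PendingAmount INTEGER)")
cur.execute("CREATE TABLE Test_Transaction(IDUser INTEGER, Refund INTEGER, Status INTEGER)")
cur.executemany("INSERT INTO Test_Balance VALUES (?, ?)",
                [(1, 2), (2, 4), (3, 3)])
cur.executemany("INSERT INTO Test_Transaction VALUES (?, ?, ?)",
                [(1, 3, 0), (1, 2, 0), (2, 5, 0), (3, 13, 1)])

with conn:  # one transaction: commits on success, rolls back on error
    # Step 1: pre-aggregate pending refunds per user and fold them
    # into the balances in a single set-based UPDATE (no loop).
    cur.execute("""
        UPDATE Test_Balance
        SET PendingAmount = PendingAmount +
            (SELECT SUM(Refund) FROM Test_Transaction t
             WHERE t.IDUser = Test_Balance.IDUser AND t.Status = 0)
        WHERE IDUser IN (SELECT IDUser FROM Test_Transaction WHERE Status = 0)
    """)
    # Step 2: mark the rows that fed the update as used.
    cur.execute("""
        UPDATE Test_Transaction
        SET Status = 1
        WHERE Status = 0
          AND IDUser IN (SELECT IDUser FROM Test_Balance)
    """)

balances = dict(cur.execute("SELECT IDUser, PendingAmount FROM Test_Balance"))
print(balances)  # {1: 7, 2: 9, 3: 3}; user 3's only transaction was already used
```

    User 1 picks up 3 + 2 = 5, user 2 picks up 5, and user 3 is untouched because its only transaction already has Status = 1, which is exactly the behavior the T-SQL version aims for.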

    --Jeff Moden



  • Hi Jeff,

    Thank you so much for mentoring me. After applying the changes you made to my sample, I executed the procedure against the test data and it ran in 4 seconds. It's really an excellent approach, and it shaved 16 seconds off my first try.

    Great and wonderful. Thanks a lot. I learned a lot about improving speed here.

  • Hi Jeff,

    One final question about exception handling. In my sample logic, I am using TRY/CATCH to catch exceptions and a transaction so I can roll back if any exception occurs. I am also capturing the exception in BEGIN CATCH by assigning it to variables. Will that be enough? If some issue occurs in the logic and the transaction rolls back, I need to know exactly where the exception occurred. Any suggestions, please?

  • born2achieve (4/8/2014)


    Hi Jeff,

    One final question about exception handling. In my sample logic, I am using TRY/CATCH to catch exceptions and a transaction so I can roll back if any exception occurs. I am also capturing the exception in BEGIN CATCH by assigning it to variables. Will that be enough? If some issue occurs in the logic and the transaction rolls back, I need to know exactly where the exception occurred. Any suggestions, please?

    "Must look eye." 😉

    Study this code to see that it's a simple 1/0 error and it uses your Try/Catch block. Run it to see what happens.

    BEGIN TRY
        BEGIN TRANSACTION
        SELECT 1/0 -- This is the code under test
        COMMIT TRANSACTION
    END TRY
    BEGIN CATCH
        DECLARE @ErrorMessage NVARCHAR(4000);
        DECLARE @ErrorSeverity INT;
        DECLARE @ErrorState INT;

        --DROP TABLE #ValuesUpdated; -- Commented this out for this test (JBM)

        SELECT @ErrorMessage = ERROR_MESSAGE()
              ,@ErrorSeverity = ERROR_SEVERITY()
              ,@ErrorState = ERROR_STATE();

        ROLLBACK TRANSACTION; -- This SHOULD have been conditional but they didn't do it right. It works for now.

        RAISERROR (@ErrorMessage, @ErrorSeverity, @ErrorState);

        --IF XACT_STATE() = 1
        --    COMMIT TRANSACTION
        --IF XACT_STATE() = -1
        --    ROLLBACK TRANSACTION
    END CATCH
    ;

    Now, study and then run this code.

    SET XACT_ABORT ON;

    BEGIN TRANSACTION

    SELECT 1/0 --This is the code under test

    COMMIT TRANSACTION

    ;

    Now... what were you counting on TRY/CATCH to do for you in this case?
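    To see why the explicit rollback (or XACT_ABORT) matters, here is a small Python/SQLite sketch of the same idea: work done before a mid-transaction failure lingers until something rolls it back. The table and error here are made up for illustration, and a NOT NULL violation plays the role of the 1/0 error, since SQLite's integer division by zero quietly returns NULL instead of raising.

```python
import sqlite3

# Autocommit mode so BEGIN/COMMIT are under our explicit control,
# mimicking the T-SQL BEGIN TRANSACTION / COMMIT pair.
conn = sqlite3.connect(":memory:", isolation_level=None)
cur = conn.cursor()
cur.execute("CREATE TABLE t(v INTEGER NOT NULL)")

try:
    cur.execute("BEGIN")
    cur.execute("INSERT INTO t VALUES (1)")     # work done before the failure
    cur.execute("INSERT INTO t VALUES (NULL)")  # plays the role of SELECT 1/0
    cur.execute("COMMIT")
except sqlite3.IntegrityError:
    # Without this rollback, the first INSERT would stay pending
    # (and become visible if anything later issued a COMMIT).
    cur.execute("ROLLBACK")

count, = cur.execute("SELECT COUNT(*) FROM t").fetchone()
print(count)  # 0: the partial work was rolled back along with the failure
```

    Comment out the ROLLBACK and the pending INSERT survives the error, which is exactly the trap the CATCH block (or SET XACT_ABORT ON) is there to close.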

    Heh... I'll say it one more time and then I'm going to quit because you're not taking the hint. 😉 Try before you ask, and I'm not saying that to be mean. It's truly the only way you will learn and remember. It's just not that difficult to test your own questions with a smidgen of code.

    --Jeff Moden



  • Hi Jeff,

    I apologize for extending the thread and taking up your time. I did try to browse through articles before posting in this forum, and I tried some samples as well. I am developing critical functionality and wanted to check with an expert like you; that's why I posted. I am very sorry if I wasted your time.

    The sample you posted for exception handling I had already tried before posting my doubt. Since I am dealing with transactions, I was quite unsure about the exception handling; that's why I posted.

    Once again, thank you so much for your precious time.

  • born2achieve (4/8/2014)


    Hi Jeff,

    I apologize for extending the thread and taking up your time. I did try to browse through articles before posting in this forum, and I tried some samples as well. I am developing critical functionality and wanted to check with an expert like you; that's why I posted. I am very sorry if I wasted your time.

    The sample you posted for exception handling I had already tried before posting my doubt. Since I am dealing with transactions, I was quite unsure about the exception handling; that's why I posted.

    Once again, thank you so much for your precious time.

    No, no... it's not my time that I'm worried about. In kind of a "tough love" manner, I'm trying to teach you how to think (you must try instead of relying on others) so that you can live up to that fine handle you have of "Born2Achieve" (not being sarcastic there, either). I think you have the right stuff and you just need a push to develop that "critical functionality". Think of how rightfully proud you would have been if you had made that query run in 4 seconds instead of 20 by yourself, if you had just taken 20 minutes out to build the indexes you were talking about instead of asking me. 🙂 Experiment! You now know how to make wads of test data in just a minute or two to write the code, and you can create a million rows of data across two tables in seconds once the code is done. Start solving problems that you find on these forums. You don't have to post answers, but dive in. You won't find a better source of real-world and theoretical problems anywhere.

    Shifting gears, I have to ask this because I'm finding out that a whole lot of people who work with T-SQL every day don't have a clue what it is. If you do know, then I apologize for asking such a simple question. If you don't know what it is, then it's going to help a lot, and I'd rather be a bit embarrassed that I asked than have you not know. So, here's the question...

    Do you know what the software called "Books Online" is?

    Also, do you have a computer at home? If you do (and sorry for yet another simple question), see if you can get a copy of SQL Server Developer's Edition. My understanding is that it's a bit difficult to find outside the United States (most folks here can get it at Amazon.com for something less than $65 USD including shipping). It's actually the Enterprise Edition with special licensing, and it runs on a standard desktop or laptop without having to install Windows Server. If you can't get it online, find out where the closest Microsoft office is and drop in to ask them how you can get it. If you're really intent on becoming very good at SQL Server (not just T-SQL, either), there's no better way to learn than having your own copy to work with whenever and however you want.

    --Jeff Moden



  • Hi Jeff,

    I totally agree with your logic and will follow it. Yes, of course, knowledge can only be developed when we try.

    You are a great mentor. Thank you.

  • born2achieve (4/9/2014)


    Hi Jeff,

    I totally agree with your logic and will follow it. Yes, of course, knowledge can only be developed when we try.

    You are a great mentor. Thank you.

    Thanks for the awfully nice compliment, but it's not that. I just know that you have it in you. Look how you took the hint about the test data. It's not mentors that are great... it takes great students.

    --Jeff Moden



  • Jeff Moden (4/9/2014)


    born2achieve (4/9/2014)


    Hi Jeff,

    I totally agree with your logic and will follow it. Yes, of course, knowledge can only be developed when we try.

    You are a great mentor. Thank you.

    Thanks for the awfully nice compliment, but it's not that. I just know that you have it in you. Look how you took the hint about the test data. It's not mentors that are great... it takes great students.

    Actually, it takes both. You have been a great mentor to me as well! You can see the spark in those of us that want to learn and help make it burn brighter. Not every mentor can do that as well as you have been able to in many of us. I still learn much from your articles and insights on numerous forum threads.

  • Gosh. Thanks, Lynn. :blush:

    --Jeff Moden


