Execute a Stored Procedure against every row insert in Table

  • azhar.iqbal499 - Monday, November 5, 2018 4:10 AM

    I have around 200 tables, and rows are inserted into each table every second, so there are around 90 million records across all the tables. We want to execute a stored procedure against each row in a table. A trigger is not an option for us because it would slow things down further. I am building a custom solution to loop over every row in a table and execute the SP in a loop.
    How can I make this run in parallel using SQL or an SSIS package? I am using SQL Server 2016.
    Please advise me on a better solution.

    Assuming you do not update all 90 million rows every day, just use two datetime fields in each table being checked (one for the last update datetime and one for the last checked datetime).
    Store your process-check datetime in your audit table; after that it's a simple logic check:
    if the process-check datetime < the table's last update datetime
        run the process
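
    For illustration, a minimal sketch of the pattern being suggested here, using hypothetical names (dbo.SomeTable with a LastUpdateDT column, a dbo.ProcessAudit tracking table, and dbo.usp_DoProcess standing in for the real process):

        -- Run the process only when the table has changed since the last check.
        DECLARE @LastCheck  datetime2,
                @LastUpdate datetime2;

        SELECT @LastCheck = LastCheckDT
        FROM dbo.ProcessAudit
        WHERE TableName = N'dbo.SomeTable';

        SELECT @LastUpdate = MAX(LastUpdateDT)      -- last update datetime kept on the table
        FROM dbo.SomeTable;

        IF @LastCheck < @LastUpdate
        BEGIN
            EXEC dbo.usp_DoProcess;                 -- whatever the per-table process is

            UPDATE dbo.ProcessAudit                 -- record that we've caught up
            SET LastCheckDT = SYSDATETIME()
            WHERE TableName = N'dbo.SomeTable';
        END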

  • azhar.iqbal499 - Wednesday, November 7, 2018 6:21 AM

    This option is not viable for us, as our data is already inside SQL Server.

    That misses the point entirely. However, what exactly you are doing in any given stored procedure matters a great deal. What will you do if, due to capacity, a given row's execution of the stored procedure ever takes longer than 5 seconds? Let me give you a preview: crash, bang, boom, the system is in trouble and can't easily be fixed. Anything you do against every one of 90 million rows is going to have to be set-based; 90 million separate executions is just not realistic any other way. Once you have all 90 million existing rows taken care of, you can then execute the procedure at insert time, but again, you'd better have a secondary system or a massive capacity upgrade ready at a moment's notice if you expect to maintain that kind of process. Isn't there some way to run the process for multiple rows at once? Trying to do 90 million rows one by one is generally not practical...

  • Here's some data to back me up:
        If your stored procedure took 15 minutes to run, you'd be doing this for roughly 2,567 YEARS.  (Clearly impossible to accomplish)
        If your stored procedure took just 15 seconds to run, you'd still be busy for about 42.8 YEARS.  (Same problem here)
        And if this thing runs for just 1 lousy second per execution, you'd still be at it for 2.85 YEARS. (Still not even close to practical)
        Finally, just to get this done in 24 hours, you'd need to get execution time down to less than 1 millisecond per row (roughly 0.96 milliseconds).

        And then, to actually be practical, that procedure would probably have to run as a single set-based pass over the weekend, because unless the procedure is totally trivial, I'd guess it runs longer than that sub-millisecond time frame; and if it runs much longer than that, you never finish...

    EDIT: Even 200 concurrent executions of the procedure (one for each table) would only raise your 24-hour finish-time budget to roughly 192 milliseconds per execution (the arithmetic is reproduced in the sketch below). Now add the existing workload on your server into the equation...
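
    The figures above can be reproduced with a quick back-of-the-envelope query, assuming 90 million rows and one execution per row:

        -- Rough time budget for 90 million single-row executions.
        DECLARE @Rows bigint = 90000000;

        SELECT
            @Rows * 900.0 / (3600 * 24 * 365.25) AS YearsAt15Minutes,           -- ~2,567 years
            @Rows * 15.0  / (3600 * 24 * 365.25) AS YearsAt15Seconds,           -- ~42.8 years
            @Rows * 1.0   / (3600 * 24 * 365.25) AS YearsAt1Second,             -- ~2.85 years
            86400000.0 / @Rows                   AS MsPerRowToFinishIn24Hours,  -- ~0.96 ms
            86400000.0 * 200 / @Rows             AS MsPerRowWith200Workers;     -- ~192 ms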

  • azhar.iqbal499 - Wednesday, November 7, 2018 1:46 AM

    Actually, I am working on drilling software that inserts one row into a table every five seconds. We pass some columns as parameters to an SP for each row. These SPs are called after every insert, and they should not impact the OLTP workload. The SPs are used to build an R model, so they need parameters for each row.
    The client suggested I implement Service Broker for this, but to me that is extra overhead just to manage a service queue for calling a procedure.
    So what I did: on every insert of a row, I add the same row to an audit table through an AFTER trigger (the trigger fires only for INSERT). I then loop over this table to call the SP and mark the flag IsExecuted = TRUE, so that I don't pick the record up next time.
    I can create multiple SPs to run in parallel, split on the ID identity column in the audit table. I will move this into a job, and the job will run continuously to pick up new records in the base table.
    Does that make sense?
    Any better idea would be highly appreciated.

    Thanks in Advance.
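
    For reference, a minimal sketch of the trigger-plus-audit-table approach described above. Every object name here (dbo.DrillData, dbo.DrillAudit, dbo.usp_BuildModel, the parameter columns) is a hypothetical placeholder for the real schema:

        -- Audit table populated by the trigger and drained by the job.
        CREATE TABLE dbo.DrillAudit
        (
            AuditID    bigint IDENTITY(1,1) PRIMARY KEY,
            SourceID   bigint    NOT NULL,
            Param1     float     NULL,
            Param2     float     NULL,
            InsertedAt datetime2 NOT NULL DEFAULT SYSDATETIME(),
            IsExecuted bit       NOT NULL DEFAULT 0
        );
        GO

        -- AFTER INSERT trigger copies the new rows' parameters into the audit table.
        CREATE TRIGGER dbo.trg_DrillData_Insert
        ON dbo.DrillData
        AFTER INSERT
        AS
        BEGIN
            SET NOCOUNT ON;
            INSERT dbo.DrillAudit (SourceID, Param1, Param2)
            SELECT i.ID, i.Param1, i.Param2
            FROM inserted AS i;              -- set-based, so multi-row inserts work too
        END;
        GO

        -- Job step: loop over unprocessed audit rows, call the SP, flag them as done.
        DECLARE @AuditID bigint, @Param1 float, @Param2 float;

        WHILE 1 = 1
        BEGIN
            SELECT TOP (1) @AuditID = AuditID, @Param1 = Param1, @Param2 = Param2
            FROM dbo.DrillAudit
            WHERE IsExecuted = 0
            ORDER BY AuditID;

            IF @@ROWCOUNT = 0 BREAK;         -- nothing left to process

            EXEC dbo.usp_BuildModel @Param1 = @Param1, @Param2 = @Param2;

            UPDATE dbo.DrillAudit
            SET IsExecuted = 1
            WHERE AuditID = @AuditID;
        END;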

    It would be very helpful if you stopped thinking about how to do this and instead identified for us what you are trying to accomplish. All you have stated is that you must execute a stored procedure for every row in every table... but I seriously doubt that is accurate. As someone else asked earlier: what is this stored procedure doing that requires it to run for each row?

    Jeffrey Williams
    Problems are opportunities brilliantly disguised as insurmountable obstacles.

    How to post questions to get better answers faster
    Managing Transaction Logs

  • azhar.iqbal499 - Wednesday, November 7, 2018 9:44 AM

    There will be only one audit table, and data older than one day will be deleted from it.

    So you have a row inserted into the table every 5 seconds, or 17,280 rows per day, and you delete the audit rows that are more than 24 hours old?
    And you have a procedure that loops through the audit table and does some processing on the rows that haven't yet been processed?
    Can you provide details of the processing that happens to these rows? Speeding that up might be the key to the solution.
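
    If several copies of that processing job are meant to run side by side, as suggested earlier in the thread, the audit rows can be divided up in a couple of ways. A sketch, again with hypothetical names:

        -- Option A: partition the backlog on the identity column, one worker per remainder.
        DECLARE @WorkerCount int = 4,
                @WorkerID    int = 0;        -- this copy of the job handles remainder 0

        SELECT AuditID, Param1, Param2
        FROM dbo.DrillAudit
        WHERE IsExecuted = 0
          AND AuditID % @WorkerCount = @WorkerID;

        -- Option B: each worker atomically claims the next unprocessed row;
        -- READPAST skips rows another worker has already locked.
        DECLARE @Claimed TABLE (AuditID bigint, Param1 float, Param2 float);

        UPDATE TOP (1) a
        SET    IsExecuted = 1
        OUTPUT inserted.AuditID, inserted.Param1, inserted.Param2 INTO @Claimed
        FROM   dbo.DrillAudit AS a WITH (ROWLOCK, UPDLOCK, READPAST)
        WHERE  a.IsExecuted = 0;
        -- the worker would then EXEC the model procedure for the row it just claimed

    Option B costs a little locking overhead but keeps the workers from colliding even when the identity values aren't evenly spread.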

  • Jonathan AC Roberts - Sunday, November 18, 2018 9:02 AM

    Yes, you are correct. The SPs actually build an R model for every row entry. If we can speed up this process and run the SPs in parallel, we can achieve this goal.
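
    For context only: the thread doesn't say how R is invoked, but if the model is built in-database through SQL Server 2016 R Services, the per-row procedure might wrap sp_execute_external_script along these lines (the procedure name, parameters, and the trivial R script are all hypothetical):

        -- Illustrative only: a per-row "build model" procedure using R Services.
        -- Requires: sp_configure 'external scripts enabled', 1; RECONFIGURE;
        CREATE PROCEDURE dbo.usp_BuildModel
            @Param1 float,
            @Param2 float
        AS
        BEGIN
            SET NOCOUNT ON;

            EXEC sp_execute_external_script
                @language = N'R',
                @script   = N'
                    # stand-in for the real model-building code
                    OutputDataSet <- data.frame(score = p1 * 2 + p2)
                ',
                @params = N'@p1 float, @p2 float',
                @p1 = @Param1,
                @p2 = @Param2
            WITH RESULT SETS ((score float));
        END;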

  • azhar.iqbal499 - Monday, November 19, 2018 12:11 AM

    Can you supply the DDL of the table you are querying?

  • azhar.iqbal499 - Monday, November 19, 2018 12:11 AM

    And how long does it take to run the process that creates the R model (not including running the model itself)? Can that process be detached from the processing of the individual row? Given your volume, this might well be totally impractical.
