SSIS DefaultBufferMaxRows

  • Hi,

    I have several packages which are running fine, but now I have an issue with one newly designed package which updates two columns in a table with the values from another table.

    This package got stuck after 9,678 rows with both source and destination in yellow. I then increased DefaultBufferMaxRows to 100000 and it processed 99,000 records; I increased it again and got some extra rows again. Do I need to manually change this value every time my row count increases? I'm updating around 6,000,000 rows and new rows get added daily. How do I handle this situation?

  • tripri (9/9/2011)


    You are only masking the real issue by increasing the buffer max rows. When you say it "processed" 9k rows, I assume you are referring to the record count shown in SSIS. That is not a true reflection of the number of rows completely processed; it only indicates how many records have moved through that portion of the SSIS pipeline.

    My guess is that something is getting stuck somewhere else. The DefaultBufferMaxRows property you are changing only specifies how many rows are kept in the buffer at a time. You should also look at the commit size of your destination in the data flow, which controls how many rows are committed at a time.

    Lastly, I would suggest that you run a trace while the package is running. Ensure that records are actually being updated. Look at the execution plan of the update queries to see whether index updates are taking a long time (and effectively bringing it all to a standstill).

    Also check for any locks or resource waits (use sp_who2 or Adam Machanic's sp_WhoIsActive) while the package is executing.
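    As a lighter-weight alternative to sp_who2, you can query the DMVs directly while the package runs. A minimal sketch (the session_id filter is a common heuristic to skip system sessions, not a hard rule):

    ```sql
    -- Sessions that are currently blocked or waiting (run while the package executes)
    SELECT r.session_id,
           r.status,
           r.command,
           r.wait_type,
           r.wait_time,            -- milliseconds spent in the current wait
           r.blocking_session_id   -- 0 means not blocked by another session
    FROM   sys.dm_exec_requests AS r
    WHERE  r.session_id > 50
      AND (r.blocking_session_id <> 0 OR r.wait_type IS NOT NULL);
    ```

    If blocking_session_id is non-zero for your update session, follow that chain to find the head blocker.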

    As you can see, there are a lot of things to keep in mind here. If the tables are both on the same server, I would suggest that you use an Execute SQL Task instead. Set-based operations will perform much better than row-by-row updates.
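    For illustration, the whole data flow can collapse into a single set-based statement inside an Execute SQL Task. The table, column, and key names below are made up, since the original post doesn't give the schema:

    ```sql
    -- One set-based update joining the two tables on a shared key,
    -- instead of pushing 6M rows through the SSIS pipeline row by row
    UPDATE t
    SET    t.Col1 = s.Col1,
           t.Col2 = s.Col2
    FROM   dbo.TargetTable AS t
           INNER JOIN dbo.SourceTable AS s
                   ON s.BusinessKey = t.BusinessKey;
    ```

    For very large row counts you may still want to batch this (e.g. UPDATE TOP (n) in a loop) to keep the transaction log manageable.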

    Hope this helps, and good luck 🙂

  • Actually it's working fine with the Execute SQL Task but getting stuck if I use a Data Flow Task. I've been unable to find out why it gets stuck with the Data Flow Task when it succeeds with the SQL task.
