Do Parallel Tasks Using the Same Set of Tables Slow Down DTS Execution Time?

  • Hi,

We use DTS 2000.

    We have set 4 tasks to run at a time in our DTS package.

    Of these 4 tasks, say two tasks use the

    same set of tables but with different filter conditions.

    I would like to know: will this situation slow down the other two tasks?

    Any pointers on fine-tuning DTS execution time would be helpful.

    Thanks & Warm Regards

    chethan

  • Depending on the task, table locks could slow this down slightly. Otherwise, running parallel tasks is usually a good way to take advantage of multiprocessors. It is important to capture a baseline with SQL Profiler and Performance Monitor first, and then run Profiler and Perf Mon again while the package executes to measure the DTS package's effect. There is a very good article on DTS at:

    http://vyaskn.tripod.com/sql_server_dts_best_practices.htm

    Another good source is:

    http://www.sqldts.com/

  • A couple of things from my experience:


    If the parallel tasks share Source or Destination connection objects, they will slow down. Use a separate set of source and destination connections for each parallel task.

    Reading from the same set of tables with different filter conditions should not affect performance. However, if both tasks use the same table as the destination, they will slow each other down. In that case, you may want to load into two staging tables in parallel and then merge them with SQL after the parallel load is done.
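The staging-then-merge approach above can be sketched roughly as follows. This is a minimal T-SQL illustration, not from the thread: the table and column names (`dbo.Orders`, `Orders_Stage1`, `Orders_Stage2`) are hypothetical placeholders for your own schema.

```sql
-- Hypothetical setup: each parallel DTS task loads its own staging table,
-- so the tasks never contend for locks on the final destination table.
--   Task A -> dbo.Orders_Stage1
--   Task B -> dbo.Orders_Stage2
-- After both tasks complete, a single Execute SQL task merges the results:
INSERT INTO dbo.Orders (OrderID, CustomerID, OrderDate)
SELECT OrderID, CustomerID, OrderDate FROM dbo.Orders_Stage1
UNION ALL
SELECT OrderID, CustomerID, OrderDate FROM dbo.Orders_Stage2;

-- Clear the staging tables for the next run.
TRUNCATE TABLE dbo.Orders_Stage1;
TRUNCATE TABLE dbo.Orders_Stage2;
```

The key design point is that the contended write happens once, serially, after the parallel loads finish, rather than having two tasks fighting over the same destination table's locks during the load.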

  • thanks guys,
