Looping through each row of a text file

  • Hi,

    I was wondering what the easiest way would be to loop through each row of a text file and pass a portion of each row to an SQL stored proc. All of this needs to run as a DTS job.

    The way I was thinking was to first load every row into a temp table and then have the stored proc loop through that table. That approach is somewhat slow, so if anyone can give me ideas on how to eliminate the temp table step, that would be great.

    Thanks

  • Rather than approaching the problem in a row-by-row mindset, describe the full problem, including sample input and output data, and we may be able to show you a far better way to solve this and all similar future problems (if you search for "set-based processing" you may get a hint in this regard).

  • If you must use a record-based approach, I'd recommend using a staging table with a trigger that runs a cursor over (INSERTED i INNER JOIN StagingTable).

    That being said, 99% of the time you should use a batch-based approach: the same staging table and trigger, minus the stored proc/cursor.

  • The only way to loop through single rows is to use a SQL cursor, but cursors are relatively expensive (slow) to use. I agree with the previous poster that a set-based solution should be used if one can be found, but occasionally there are problems that can only be solved with a cursor.

     

    Sample below.

    -- For each (OperatorNo, Effective) row, find the next Effective date
    -- for the same operator and set FirstEffective to the day before it.
    DECLARE @OpNum int, @TransDate SmallDateTime, @NextDate SmallDateTime

    DECLARE X_Cursor CURSOR FOR
        SELECT OperatorNo, Effective
        FROM HistoryWhse
        ORDER BY 1, 2

    OPEN X_Cursor
    FETCH NEXT FROM X_Cursor INTO @OpNum, @TransDate

    WHILE @@FETCH_STATUS = 0
    BEGIN
        SELECT @NextDate = (SELECT MIN(Effective)
                            FROM HistoryWhse
                            WHERE OperatorNo = @OpNum
                              AND Effective > @TransDate)

        IF @NextDate IS NOT NULL
        BEGIN
            UPDATE HistoryWhse
            SET FirstEffective = DATEADD(dd, -1, @NextDate)
            WHERE OperatorNo = @OpNum
              AND Effective = @TransDate
        END

        FETCH NEXT FROM X_Cursor INTO @OpNum, @TransDate
    END

    CLOSE X_Cursor
    DEALLOCATE X_Cursor
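    For what it's worth, the "next effective date minus one day" rule in the cursor sample above can be computed for the whole set at once rather than row by row. A minimal Python sketch of that logic, using a hypothetical in-memory list of dictionaries as a stand-in for the HistoryWhse table (the sample rows are assumptions for illustration):

    ```python
    from datetime import date, timedelta

    # Hypothetical stand-in for HistoryWhse rows:
    # each dict holds OperatorNo, Effective, FirstEffective.
    rows = [
        {"OperatorNo": 1, "Effective": date(2006, 4, 3), "FirstEffective": None},
        {"OperatorNo": 1, "Effective": date(2006, 4, 6), "FirstEffective": None},
        {"OperatorNo": 2, "Effective": date(2006, 4, 5), "FirstEffective": None},
    ]

    def set_first_effective(rows):
        """For each row, set FirstEffective to the day before the next
        (later) Effective date of the same operator, if one exists --
        the same rule the cursor applies one row at a time."""
        for row in rows:
            later = [r["Effective"] for r in rows
                     if r["OperatorNo"] == row["OperatorNo"]
                     and r["Effective"] > row["Effective"]]
            if later:
                row["FirstEffective"] = min(later) - timedelta(days=1)
        return rows

    set_first_effective(rows)
    # Operator 1's 2006-04-03 row gets FirstEffective = 2006-04-05;
    # rows with no later Effective date keep FirstEffective = None.
    ```

    In T-SQL the equivalent is a single correlated UPDATE over the table, which avoids the per-row fetch cost entirely.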

  • Hmm...

    Ok, here is a sample recordset:

    20060403144140010000120504265320500006006001

    20060403144141010000120504265320500006006001

    20060406145613010000120504265320500006006001

    Each actual record is 690 characters long, so I will spare you the rest.

    Yeah, I have thought of the cursor approach, but as you say it is expensive, and we are looking at roughly a million records per file to process in very little time.

    So, what I exactly want to do in the end is this: given this record set, the first 8 characters are the date (20060406) and the next 6 are the time (145613). I want to split those off and pass the rest of the string to a stored procedure, which does some processing on the data and stores it in the proper table.

    I tried using a Text File Source connection and building a data flow with an Execute SQL Task, but the DTS Package designer doesn't allow that.

    So, yeah, I'm still thinking about what you all have said and will see which approach to follow.
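    The slicing step described above is straightforward fixed-width parsing; a minimal sketch in Python, using the sample record from the post (only the date and time widths are given in the thread, so treating everything after position 14 as a single payload is an assumption):

    ```python
    def split_record(record):
        """Split one fixed-width record into (date, time, payload).

        Per the post: the first 8 characters are the date (YYYYMMDD),
        the next 6 are the time (HHMMSS), and the remainder is the data
        that would be passed on to the stored procedure.
        """
        return record[:8], record[8:14], record[14:]

    date_part, time_part, payload = split_record(
        "20060406145613010000120504265320500006006001")
    # date_part == "20060406", time_part == "145613"
    ```

    In a DTS package the same slicing would be done with Mid() inside an ActiveX Script task before handing the payload to the stored procedure.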

  • Look at http://www.sqldts.com/default.aspx?246; they have DTS package source code called "Looping, Importing and Archiving". Your process is similar to that one.

    The only modifications you'll have to make are to introduce an ActiveX Script task and an Execute SQL task. These are needed to:

    1. Parse your record into separate global variables within the ActiveX Script task.

    2. Build the SQL command from the ActiveX script.

    3. Execute the SQL task.

    Happy programming
