DTS problem with Excel file

  • We have just created a simple DTS package to import data from an Excel file that we receive every morning from an external entity. Our problem is that the first line of the Excel file reads "FLAT FILE FROM SYSTEM" followed by the number of records in the file. We have been manually opening the Excel file every morning and deleting this line; once we do, the DTS package runs smoothly.

    Is there a way either to have the first line deleted automatically from the Excel file, or to skip the first record when running the DTS package?

    I have tried using DTSTransformStat_SkipRow, but for some reason it skips most of the rows in the file and copies over only a few records.

    Any solutions or suggestions?

  • Many ways to skin that cat:

    1. Put the data into a staging table and then import from that table into your live table, filtering out the bad row with <> 'the string that is wrong' (see the sketch after this list).

    2. If you know how many rows you are getting, enter a first and last row on the Options tab of the DTS transformation task. Alternatively, set the last row to 65536 (the Excel worksheet maximum) and delete the empty rows afterwards.

    3. Create an ActiveX transformation for the entire import. This is the cleaner way, but more painful in terms of up-front work:

    Function Main()
        ' Skip the unwanted header row; otherwise copy each column across
        If DTSSource("Col001") = "offending row text" Then
            Main = DTSTransformStat_SkipRow
        Else
            DTSDestination("colname1") = DTSSource("Col001")
            DTSDestination("colname2") = DTSSource("Col002")
            DTSDestination("colname3") = DTSSource("Col003")
            ' carry on listing all the source and destination columns
            Main = DTSTransformStat_OK
        End If
    End Function
    -----------------------------------------------------
    The choice is entirely yours.
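
    For option 1, the follow-on step could be a simple ActiveX Script Task (or an Execute SQL task containing just the INSERT) that moves everything except the offending row from staging to live. This is only a rough sketch: the connection string, the table names ImportStaging and LiveTable, and the column names are placeholders to swap for your own.

    Function Main()
        Dim cn, sql

        ' Placeholder connection to the target SQL Server database
        Set cn = CreateObject("ADODB.Connection")
        cn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
                "Initial Catalog=MyDatabase;Integrated Security=SSPI;"

        ' Copy everything except the offending header row into the live table
        sql = "INSERT INTO LiveTable (colname1, colname2, colname3) " & _
              "SELECT colname1, colname2, colname3 " & _
              "FROM ImportStaging " & _
              "WHERE colname1 <> 'the string that is wrong'"
        cn.Execute sql

        cn.Close
        Set cn = Nothing

        Main = DTSTaskExecResult_Success
    End Function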

    ------------------------------
    The Users are always right - when I'm not wrong!

  • I actually used the function above. I created it as follows:

    '**********************************************************************
    '  Visual Basic Transformation Script
    '  Copy each source column to the destination column
    '**********************************************************************

    Function Main()

        ' The header row starts with "FLAT FILE", so skip it
        If Left(DTSSource("FLAT FILE FOR SYSTEM#"), 9) = "FLAT FILE" Then
            Main = DTSTransformStat_SkipRow
        Else
            DTSDestination("VOUCHER_ID") = DTSSource("FLAT FILE FOR SYSTEM#")
            DTSDestination("INVOICE_ID") = DTSSource(" 2141")

            If IsDate(DTSSource("F3")) Then
                DTSDestination("INVOICE_DATE") = DTSSource("F3")
            End If

            DTSDestination("VENDOR_PS_ID") = DTSSource("F4")
            DTSDestination("VENDOR_NAME") = DTSSource("F5")
            DTSDestination("GROSS_AMOUNT") = DTSSource("F6")

            If IsDate(DTSSource("F7")) Then
                DTSDestination("ACCT_DATE") = DTSSource("F7")
            End If

            DTSDestination("REL_VOUCHER_ID") = DTSSource("F8")
            DTSDestination("PO_NUM") = DTSSource("F9")
            DTSDestination("PAYMENT_ID") = DTSSource("F10")
            DTSDestination("PAID_AMOUNT") = DTSSource("F11")

            Main = DTSTransformStat_OK
        End If

    End Function

    It worked well in that it skipped the first record, but it then copied only the next 254 records and closed the file. The Excel document has over 2,400 records. I checked whether the record after that had any problems, and it looked just fine.

    Also, when I deleted the first line manually and then ran the DTS package, all 2,400 records were copied over.

    Is there some limitation that would stop it at 254 records when used in conjunction with SkipRow?

     

  • I can tell you for a fact that there is no limit of 254. I ran the same kind of job against a flat text file that recorded a month of web activity; it had around 9 million rows, and I skipped practically half of them using about a dozen combinations of criteria.

    What type of column is the one where you get 'FLAT FILE FOR SYSTEM'? If it is a numeric column, it may be getting confused and ending prematurely for some reason. See this link and try it to see if it fixes things for you:

    http://www.sqldts.com/default.aspx?254

     

    Other than that I am stumped.
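
    If it does turn out to be the driver guessing the column types, one quick way to see what the Jet driver is actually handing back is to read the sheet through ADO with IMEX=1, which makes it treat mixed-type columns as text. This is only an untested diagnostic sketch run as a standalone .vbs file; the path and sheet name are placeholders.

    ' Quick check: does the Jet/Excel driver return every row of the sheet?
    Dim cn, rs, total

    Set cn = CreateObject("ADODB.Connection")
    ' Placeholder path and sheet name - point these at the real file
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
            "Data Source=C:\Import\flatfile.xls;" & _
            "Extended Properties=""Excel 8.0;HDR=NO;IMEX=1"""

    Set rs = CreateObject("ADODB.Recordset")
    rs.Open "SELECT * FROM [Sheet1$]", cn

    ' Count the rows the driver actually returns
    total = 0
    Do While Not rs.EOF
        total = total + 1
        rs.MoveNext
    Loop
    WScript.Echo "Rows returned by the driver: " & total

    rs.Close
    cn.Close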


    ------------------------------
    The Users are always right - when I'm not wrong!

  • I checked the data in the file. While the first column of the first and second rows contains letters, the rest of the rows in that column are numbers, BUT all of this data is formatted as "Text". So that may not be the problem.

    I ran the same transformation script this morning on newly received data and it did the same thing: it skipped the first line, copied 254 lines, and then closed the file. Today's file contains over 3,000 rows. When I manually deleted the first line in Excel, it worked just fine and copied all 3,000 rows.

    So I am still stumped. Someone here suggested that 255 is the maximum length of data in a row and that it may be a factor, but that was only a guess and may or may not be right.

  • Are you able to DTS the whole lot in after deleting the first row, using the same function (i.e. including the SkipRow statement)?

     


    ------------------------------
    The Users are always right - when I'm not wrong!

  • I just tried that and it worked perfectly. It skipped the first header line and copied all 3,000 rows of data below it.

  • In that case I cannot see why it is failing. I hope someone on the forum can help you out. I can only suggest looking for a hidden parameter in the package somewhere???


    ------------------------------
    The Users are always right - when I'm not wrong!

  • Thank you for your help. I should be able to use most of the ideas here for other data transfers. In this case, we are now asking the company that sends us the Excel file to delete the first row at source.

  • You can write a VB macro to do that for you (a sketch follows below). Or, as I said before, import the entire set of data into a staging table, delete the first row from that table, and then insert into the live table using your existing DTS package.
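
    Here is roughly what that could look like as a standalone .vbs script run before the package; the file path is a placeholder, and it assumes the header always sits in row 1 of the first worksheet.

    ' Delete the "FLAT FILE FROM SYSTEM" header row before DTS runs
    Const FILE_PATH = "C:\Import\flatfile.xls"   ' placeholder path

    Dim xl, wb
    Set xl = CreateObject("Excel.Application")
    xl.DisplayAlerts = False            ' don't prompt when saving over the file

    Set wb = xl.Workbooks.Open(FILE_PATH)
    wb.Worksheets(1).Rows(1).Delete     ' remove the first (header) row
    wb.Save
    wb.Close

    xl.Quit
    Set xl = Nothing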


    ------------------------------
    The Users are always right - when I'm not wrong!

