Can't DTS up flat file due to date column

  • I'm using the old DTS 2000. I have an Excel spreadsheet converted into a tab-delimited text file.

    I have to upload it into a table.

    Each column allows nulls, so if a cell has nothing in it, it shouldn't make a difference, as it's only going into a staging table anyway.

    My DTS package fails each time on the date column.

    The staging column has a smalldatetime type. The flat file column is just a string as far as I know. I get this error each time:

    Conversion invalid for datatypes on column pair 13 (source column 'PaymentDate' (DBTYPE_STR), destination column 'PaymentDate' (DBTYPE_DBTIMESTAMP))

    Am I formatting the Excel sheet wrong before I convert it into a tab-delimited text file? What should I do to fix this? I've done it a thousand times before and never got this error.

  • Look at how the null values are represented in the flat file. 

    You'll probably have to change the data pump transformation to a VB script that checks the value of that particular column.

  • Since this is an import to a staging table, make the staging column a varchar type. Once the import is done, run a subsequent SQL task to convert the varchar to a 'real' smalldatetime, and set it to null for cases where it overflows or won't convert to a smalldatetime.
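    Something along these lines (a sketch only; the table and column names are made up, and it assumes PaymentDate was imported into the staging table as varchar):

    [code]
    -- Execute SQL task run after the flat-file import.
    -- ISDATE weeds out values that won't convert at all; the BETWEEN check
    -- guards the smalldatetime range (1900-01-01 through 2079-06-06).
    INSERT INTO dbo.Payments (PaymentDate /*, other columns */)
    SELECT CASE
               WHEN ISDATE(s.PaymentDate) = 1
               THEN CASE
                        WHEN CAST(s.PaymentDate AS datetime)
                             BETWEEN '19000101' AND '20790606'
                        THEN CAST(s.PaymentDate AS smalldatetime)
                        ELSE NULL   -- overflows smalldatetime
                    END
               ELSE NULL            -- blank or unconvertible string
           END
    FROM dbo.StagingPayments s
    [/code]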

     

    This may not be a DTS problem. If you've done this 1,000 times before with no problem, then you have a data problem in your spreadsheet. Have you cut the column from Excel into a text file and looked through it to find anomalies? Try going beyond the end of the data to see if there is something in a cell further down. Try cutting the import file down to find the exact row the error occurs on, then look carefully at that row. When you find exactly what is causing the problem, you can either notify the author of the spreadsheet and have them correct it before sending it, or handle the problem in DTS with surgical precision.


  • I agree with PW above. I always import data to varchar fields and then run a series of data validation checks. You'd be surprised what passes for dates in some people's data entry...

    Don't forget to run your data and referential integrity checks, and generate detailed reject messages to send to yourself and your data supplier.  Then "import" the good data into the production tables.
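    For the date column in this thread, the reject check could be as simple as this (names are illustrative; the staging columns are all varchar):

    [code]
    -- Build reject messages for anything that isn't blank and won't convert.
    SELECT 'Rejected - bad PaymentDate: ' + s.PaymentDate AS RejectMessage
    FROM dbo.StagingPayments s
    WHERE LTRIM(RTRIM(ISNULL(s.PaymentDate, ''))) <> ''
      AND ISDATE(s.PaymentDate) = 0

    -- Then move only the rows that passed into the production table
    -- (add a smalldatetime range check as well if overflow is a concern).
    INSERT INTO dbo.Payments (PaymentDate)
    SELECT CAST(s.PaymentDate AS smalldatetime)
    FROM dbo.StagingPayments s
    WHERE ISDATE(s.PaymentDate) = 1
    [/code]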

  • I agree. Remember that SQL 2000 DTS is more like ELT than ETL. In the past I've loaded the data into a temp table and used a series of string/substring operations to get the date into the format SQL wants during the transform to the real table.
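    For example (purely to illustrate; this assumes the file happened to carry dates as an 8-character DDMMYYYY string and uses made-up table names, which may not match your layout):

    [code]
    -- Reassemble DDMMYYYY into YYYYMMDD, which SQL Server parses the same way
    -- regardless of DATEFORMAT/language settings.
    SELECT CAST(SUBSTRING(t.PaymentDate, 5, 4)    -- year
              + SUBSTRING(t.PaymentDate, 3, 2)    -- month
              + SUBSTRING(t.PaymentDate, 1, 2)    -- day
              AS smalldatetime) AS PaymentDate
    FROM #StagingPayments t
    WHERE LEN(t.PaymentDate) = 8
    [/code]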

  • I'm having a similar problem, and in my case, it's due to date/time fields having 24 hour time: the incoming will be something like 072520081341 and it will blow. If I change it to 1241, it's OK.

    I'm curious if you recall what your solution was, since it's been two years. I think I can work around it, I just have to think about it.

    And thanks, Tim, for your comment about ELT vs ETL. I should keep that in mind.

    -----
    [font="Arial"]Knowledge is of two kinds. We know a subject ourselves or we know where we can find information upon it. --Samuel Johnson[/font]

  • Wayne West (7/25/2008)


    I'm having a similar problem, and in my case, it's due to date/time fields having 24 hour time: the incoming will be something like 072520081341 and it will blow. If I change it to 1241, it's OK.

    I'm curious if you recall what your solution was, since it's been two years. I think I can work around it, I just have to think about it.

    And thanks, Tim, for your comment about ELT vs ETL. I should keep that in mind.

    I ended up importing the date/time field as char and converting it to a true D/T data type in the script that I inspect the data with. Works fine. I find it interesting that the date/time conversion in DTS can't cope with 24 hour times.
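    In case it helps anyone else, a T-SQL sketch of that conversion (assuming the incoming value really is a 12-character MMDDYYYYhhmm string, as in the example above) could look like this:

    [code]
    DECLARE @v char(12)
    SET @v = '072520081341'    -- MMDDYYYYhhmm, 24-hour clock

    -- Rebuild as 'YYYYMMDD hh:mm', which converts cleanly even when hh > 12.
    SELECT CAST(SUBSTRING(@v, 5, 4)         -- year
              + SUBSTRING(@v, 1, 2)         -- month
              + SUBSTRING(@v, 3, 2)         -- day
              + ' '
              + SUBSTRING(@v, 9, 2) + ':'   -- hour (24-hour)
              + SUBSTRING(@v, 11, 2)        -- minute
              AS datetime) AS PaymentDateTime
    [/code]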

    -----
    [font="Arial"]Knowledge is of two kinds. We know a subject ourselves or we know where we can find information upon it. --Samuel Johnson[/font]
