Unusual rounding when importing data from CSV with OpenRowset

  • Please help. I'm importing data from a CSV file into a table where I can validate results before inserting into the final table. However, I'm getting some inconsistent rounding errors. For example, a column that is 23.75 in the CSV gets inserted into the staging table as 23.7499999, and then, when using convert( money, AMOUNT ), its final value is 23.8000, not 23.7500. I am only getting this on 6 out of 200 values. And if I run select 23.74999999, convert( money, 23.74999999 ) then I get the expected result of 23.7500.
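
    For reference, here is the sort of check that shows what OPENROWSET actually stored in the column (a diagnostic sketch, assuming AMOUNT was imported as FLOAT; float-to-varchar style 2 prints 16 digits):

    SELECT AMOUNT,
           CONVERT(varchar(30), CONVERT(float, AMOUNT), 2) AS full_float,
           CONVERT(money, AMOUNT) AS as_money
    FROM   Staging

    Could it matter that the literal 23.74999999 in my test is typed as DECIMAL rather than FLOAT?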

     

    Thanks in advance for any insight.

    dab

  • Have you tried decimal(18,2) instead of money?
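
    If the staging column came in as FLOAT, converting it should round 23.7499999 up to 23.75 (a minimal sketch, assuming the column is named AMOUNT):

    SELECT CONVERT(decimal(18,2), AMOUNT) AS amount_dec
    FROM   Staging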

  • It will be easy if we have all the fields of the staging table as VARCHAR or CHAR fields. We will get the values exactly as they are in the text or CSV files. Then convert the data within SQL to insert into the normal table. This approach normally helps for these kinds of problems.
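
    For example (the table and column names here are just placeholders):

    -- Stage everything as text so nothing is reinterpreted on import
    CREATE TABLE StagingRaw ( AMOUNT varchar(50) )

    -- Load the CSV into StagingRaw, then convert explicitly:
    INSERT INTO FinalTable ( AMOUNT )
    SELECT CONVERT(money, AMOUNT)
    FROM   StagingRaw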

    Regards,
    gova

  • govinn, that was our original plan. We built the staging table with varchar fields and performed an INSERT INTO Staging SELECT * FROM OpenRowset. However, that caused a conversion error on ALL numeric fields. Now we are using SELECT * INTO Staging FROM OpenRowset.
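
    One pattern that might get the INSERT INTO version working is to name the columns and convert them explicitly in the SELECT list, so the varchar staging columns receive text rather than floats. This is only a sketch; the provider string, file path, and column names are all hypothetical:

    INSERT INTO Staging ( AMOUNT )
    SELECT CONVERT(varchar(30), AMOUNT)
    FROM   OPENROWSET('Microsoft.Jet.OLEDB.4.0',
                      'Text;Database=C:\Import\;HDR=Yes',
                      'SELECT * FROM data.csv') AS src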

  • OPENROWSET is used to get data from a data source. If I understood correctly, you have a CSV file. So just use

    BULK INSERT [ [ 'database_name'.] [ 'owner' ].] { 'table_name' FROM 'data_file' }
        [ WITH
            (
                [ BATCHSIZE [ = batch_size ] ]
                [ [ , ] CHECK_CONSTRAINTS ]
                [ [ , ] CODEPAGE [ = 'ACP' | 'OEM' | 'RAW' | 'code_page' ] ]
                [ [ , ] DATAFILETYPE [ =
                    { 'char' | 'native' | 'widechar' | 'widenative' } ] ]
                [ [ , ] FIELDTERMINATOR [ = 'field_terminator' ] ]
                [ [ , ] FIRSTROW [ = first_row ] ]
                [ [ , ] FIRE_TRIGGERS ]
                [ [ , ] FORMATFILE = 'format_file_path' ]
                [ [ , ] KEEPIDENTITY ]
                [ [ , ] KEEPNULLS ]
                [ [ , ] KILOBYTES_PER_BATCH [ = kilobytes_per_batch ] ]
                [ [ , ] LASTROW [ = last_row ] ]
                [ [ , ] MAXERRORS [ = max_errors ] ]
                [ [ , ] ORDER ( { column [ ASC | DESC ] } [ ,...n ] ) ]
                [ [ , ] ROWS_PER_BATCH [ = rows_per_batch ] ]
                [ [ , ] ROWTERMINATOR [ = 'row_terminator' ] ]
                [ [ , ] TABLOCK ]
            )
        ]

    to load the data into a table which has all VARCHAR or CHAR fields.
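
    For example, something like this (the file path and table name are just placeholders):

    BULK INSERT StagingRaw
    FROM 'C:\Import\data.csv'
    WITH ( FIELDTERMINATOR = ',',
           ROWTERMINATOR = '\n',
           FIRSTROW = 2 )   -- skip the header row if the file has one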

    Regards,
    gova

  • DAB, have you tried expanding the cells in the CSV file to 6 decimal places?
