Bulk Insert from CSV problem

  • Hi, I am trying to do a bulk insert from a comma-delimited CSV file.

    The fields in the CSV file are enclosed in double quotes, and when I execute the BULK INSERT the quotes are inserted into the table as well.

    How do I get rid of the quotes? I'm sure this is really simple, but I have searched all over, so any help is greatly appreciated.

    Thanks,

    Ryan.

  • As far as I know, BULK INSERT and bcp will not remove quotes. DTS gives you the choice. Alternatively, run an UPDATE after the bulk insert to strip the quotes from each column using REPLACE.

    Far away is close at hand in the images of elsewhere.
    Anon.
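
    The REPLACE clean-up mentioned above can be sketched as follows; the table and column names here are hypothetical, so adjust them to your own schema:

    ```sql
    -- Strip embedded double quotes after the bulk insert.
    -- dbo.ImportStage, FirstName and LastName are placeholder names.
    UPDATE dbo.ImportStage
    SET    FirstName = REPLACE(FirstName, '"', ''),
           LastName  = REPLACE(LastName,  '"', '');
    ```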

  • Hi, an easy way out is usually to remove the double quotes before running BULK INSERT. If the file is not too big, open it in an editor and do a search for the quote character (") and replace it with nothing. This works unless some of your columns contain the field-terminator character inside the quoted text (for example, a ; appearing within a double-quoted value).

  • We had the same problem - I used a Perl script to remove them. As pereke said, you must be careful of quotes within the quoted text, which was the problem I ran into.

    I basically changed the delimiter to pipe, tracking whether I was inside a quoted section or not, and either stripping the quotes or leaving them as appropriate.

    Drop me a line if you are after a copy of the perl script.

    Regards,

    Dave

  • I appreciate all your help and suggestions.

    Using field terminators ("\",\"") in the format file, I am able to get rid of the double quotes.
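
    For anyone else hitting this, here is a sketch of the kind of bcp format file that strips the quotes; the three-column layout and all the names are assumptions, not the actual file. A dummy first field terminated by \" consumes the leading quote and maps to server column 0 (i.e. it is skipped), and each real field is terminated by \",\" (or \"\r\n for the last one):

    ```
    8.0
    3
    1   SQLCHAR   0   0    "\""       0   DUMMY   ""
    2   SQLCHAR   0   50   "\",\""    1   Name    SQL_Latin1_General_CP1_CI_AS
    3   SQLCHAR   0   50   "\"\r\n"   2   City    SQL_Latin1_General_CP1_CI_AS
    ```

    ```sql
    -- Paths and table name are placeholders.
    BULK INSERT dbo.ImportStage
    FROM 'C:\data\input.csv'
    WITH (FORMATFILE = 'C:\data\input.fmt');
    ```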

    I now have a new problem though and would really appreciate some suggestions.

    The date format in the input file is "dd/mm/yyyy", but when it inserts into SQL Server 2000

    I get a "Bulk insert data conversion error" when it tries to insert dates such as "14/12/2003",

    i.e. it expects dates in the format "mm/dd/yyyy".

    Any suggestions?

    Thank you,

    Ryan.

  • OK, sorted out my date problem - running "SET DATEFORMAT dmy" before the bulk insert fixes it.
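
    DATEFORMAT is a session-level setting, so it has to run on the same connection as the load. A minimal sketch (table and file paths are placeholders):

    ```sql
    SET DATEFORMAT dmy;   -- make SQL Server read '14/12/2003' as 14 December 2003

    BULK INSERT dbo.ImportStage
    FROM 'C:\data\input.csv'
    WITH (FORMATFILE = 'C:\data\input.fmt');
    ```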

  • Re: data type conversion issues

    Another approach to consider when importing data into SQL Server is to use a staging table with all varchar fields to hold the raw data. This prevents the import process from choking on bad data. Transformation and validation can then be performed on the staged raw data prior to subsequent processing.
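
    A minimal sketch of the staging approach; all table and column names are invented for illustration:

    ```sql
    -- Staging table: every column varchar so the bulk load never chokes on bad data.
    CREATE TABLE dbo.RawImport (
        CustomerName varchar(100),
        OrderDate    varchar(20),   -- raw text, e.g. '14/12/2003'
        Amount       varchar(20)
    );

    -- After BULK INSERT into dbo.RawImport, validate and convert the rows
    -- into the properly typed table. ISDATE honours SET DATEFORMAT, so set
    -- dmy first if the raw dates are day-first.
    SET DATEFORMAT dmy;

    INSERT INTO dbo.Orders (CustomerName, OrderDate, Amount)
    SELECT CustomerName,
           CONVERT(datetime, OrderDate, 103),   -- style 103 = dd/mm/yyyy
           CONVERT(money, Amount)
    FROM   dbo.RawImport
    WHERE  ISDATE(OrderDate) = 1;
    ```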
