Import CSV Data Into Existing Tables

  • Hi Guys,

I'm having issues importing data from CSV files into existing tables in my database.  The first row contains the field names, the values are separated by commas, and the values are enclosed in quotes.  The files were generated by an MS Access query export.  I'm using the Import Data wizard with a Flat File Source as my means of importing the data.  Some of the files import without issue while others fail, even though everything looks fine as I step through the import process.  I'm not sure what the issue is.  Importing data this way is common practice and should not be a hair-pulling exercise.  I'd appreciate any help you can provide.  I've included two files as an example.

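    For reference, one T-SQL route that handles quoted, comma-delimited files with a header row is BULK INSERT with FORMAT = 'CSV' (available in SQL Server 2017 and later).  A minimal sketch, with a hypothetical table name and file path:

        -- Load a quoted, comma-separated CSV with a header row into an existing table.
        -- Requires SQL Server 2017+ for the FORMAT and FIELDQUOTE options.
        -- dbo.MyTable and the file path are hypothetical placeholders.
        BULK INSERT dbo.MyTable
        FROM 'C:\Exports\MyTable.csv'
        WITH (
            FORMAT = 'CSV',         -- RFC 4180-style parsing of quoted fields
            FIELDQUOTE = '"',       -- values are enclosed in double quotes
            FIRSTROW = 2,           -- skip the header row
            FIELDTERMINATOR = ',',
            ROWTERMINATOR = '\n'
        );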
  • It would help if you provided the errors you are getting - we have no idea what the problem is if we cannot see that information.

    Jeffrey Williams
    Problems are opportunities brilliantly disguised as insurmountable obstacles.

    How to post questions to get better answers faster
    Managing Transaction Logs

  • Removed inaccurate response.

  • Hi Guys,

    Problem solved.  Rather than pull my hair out trying to figure out why MS had to take something so simple and turn it into a complicated Charlie Foxtrot, I found a website that converted the records in my CSV files into INSERT statements.  I saved the files, did a few edits, and the problem was solved.  It really is a shame that SQL Server can't handle something as simple as importing a properly formatted CSV file that already contains ALL the information it needs to do the job without making a train wreck out of it.
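
    For illustration, the converter's output is just one INSERT statement per CSV row, something like this (table and column names below are hypothetical, loosely based on the sample data posted later in the thread):

        -- Hypothetical example of the generated statements: one INSERT per CSV row.
        INSERT INTO dbo.SampleKeys (BitRateKey, SampleRateKey, Description)
        VALUES ('B014', 'S006', 'Bit Rate and Sample Rate Keys');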

  • No Data Is Missing

    B014","S006" Bit Rate and Sample Rate Keys

  • VSCurtis wrote:

    Hi Guys,

    Problem solved.  Rather than pull my hair out trying to figure out why MS had to take something so simple and turn it into a complicated Charlie Foxtrot, I found a website that converted the records in my CSV files into INSERT statements.  I saved the files, did a few edits, and the problem was solved.  It really is a shame that SQL Server can't handle something as simple as importing a properly formatted CSV file that already contains ALL the information it needs to do the job without making a train wreck out of it.

    Actually, SQL Server can handle this quite easily.  You just need to know the tools.  It sounds like you don't actually need a T-SQL solution anymore though.

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.
    "Change is inevitable... change for the better is not".

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
    Intro to Tally Tables and Functions

  • VSCurtis wrote:

    No Data Is Missing

    B014","S006" Bit Rate and Sample Rate Keys

    If this is representative of the data in the files, then no wonder you had issues.  That is not valid CSV data: the first value is missing its leading quote, and there is data following a closing quote without a delimiter.

    The tool is not the problem - in almost every case, when importing data into SQL Server, the problem is the data in the files or how the tool is being used.  That is especially true with the import wizard, which uses a 'default' data type of varchar(50) for every column, and many people do not change those defaults to appropriate data types.  This often leads to truncation errors - and if you have the wizard detect the data types for you, it often gets them wrong because it only samples the first 200 rows.
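
    To see why the varchar(50) default bites, here is a minimal sketch (hypothetical column name, any recent SQL Server):

        -- A 51-character value cannot fit into the wizard's default varchar(50).
        DECLARE @staging TABLE (SomeColumn varchar(50));

        INSERT INTO @staging (SomeColumn)
        VALUES (REPLICATE('x', 51));
        -- Fails with: 'String or binary data would be truncated...'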

    Jeffrey Williams
    Problems are opportunities brilliantly disguised as insurmountable obstacles.

    How to post questions to get better answers faster
    Managing Transaction Logs

  • VSCurtis wrote:

    No Data Is Missing

    B014","S006" Bit Rate and Sample Rate Keys

     

    You didn't have anything like that in the sample data in your original post.

    Like Jeff Williams stated, that's not a tool problem... that's a bad data problem.

     

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.
    "Change is inevitable... change for the better is not".

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
    Intro to Tally Tables and Functions

  • That was a quick cut and paste; I accidentally chopped off a quote.  Did you even bother to look at the files I posted or READ my post?  No, obviously not, because I stated clearly that the files were generated by an MS Access export.  The files were correctly formatted CSV files, and as such SQL Server should have been able to process them without issue, one and done, NO HAIR PULLING INVOLVED.  Fields were separated by a comma and values enclosed in quotes, therefore column lengths are irrelevant.  Field headers were provided to facilitate mapping to the correct fields in the table.  CSVs are an industry-standard file format used for import and export of data.  I could take those same files and import and export them all day long in any other application that understands the file format with NO HAIR PULLING, ONE AND DONE, as it should be.  MS has taken something that's supposed to be simple and made it more complicated than it should be, plain and simple.  You people are too focused on pointing a finger at human error and are missing the point.  But that's OK, I won't bother coming back here.  I have a workaround to avoid dealing with SQL Server's inability to process a simple CSV file.

  • If you are using an SSIS package and have correctly defined the input file, then it will work well with the sample you posted - but if it is not defined correctly, then it may fail.

    As you didn't tell us what error you were getting, we can't help you with it either - so complaining, when you yourself posted a bad example of the data, is not good for anyone willing to help.

    The fact that you gave us some sample data that looked OK does not mean that ALL your files are OK - so next time, do give us a BAD file and the full error instead of an explanation that "it fails sometimes".

  • All the files came from an MS Access export.  They are properly formatted CSVs.  They ALL came from the same source.  CSV is a standard file format and there should be NO issues, one and done.  SQL Server's import process complicates something that should be simple.  The problem is solved: I found a website that read those files with NO issues and generated INSERT statements from them.

  • Okay - you say the data was properly formatted, but it generated an error.  Since you never provided the error, the only conclusion is that there was a problem with the file, or a problem with how you configured the import process.

    To repeat: the Import/Export Wizard defaults every column from a CSV file to a varchar(50) data type.  If one column contains 51 characters, the import process will fail.  If you attempt to have the wizard detect the data types, it often gets them wrong and will also generate errors.

    I guess it is just easier to blame the tool.

    Jeffrey Williams
    Problems are opportunities brilliantly disguised as insurmountable obstacles.

    How to post questions to get better answers faster
    Managing Transaction Logs

  • To repeat.... CSV files are industry-standard files.  There are NO default values for field size.  The fields are delimited by commas and may or may not be further delimited by quotes.  There is no defining of field sizes; by doing so you've defeated the purpose of a CSV file.  The import process has taken what should be a simple "one and done" and made it more complicated than it should be.  I can take those same files and import them into a variety of other applications until the cows come home with NO HAIR PULLING, because that's their purpose: simple import/export with NO HAIR PULLING.  So YES, I blame the tool for taking a simple process and elevating it to the level of stupidity and defeating the purpose of a CSV file.

  • To repeat.... CSV files are industry-standard files.  There are NO default values for field size.

    But in SQL Server all columns must have a size, no matter whether CSV files are a standard or not.  Yes, the import wizard could default to nvarchar(MAX), but that is bad for a couple of other reasons.  So the table has nothing to do with the industry standard, and it is something you have to create.  The tool may be nice and *try* to create it for you, but it needs to create the table before it reads the file.
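
    For example, a hedged sketch of creating the destination table yourself before any import runs (column names and sizes are hypothetical, loosely based on the sample row posted earlier):

        -- Create the target table up front, with sizes you choose, so no tool
        -- has to guess them.  All names and sizes here are hypothetical.
        CREATE TABLE dbo.SampleKeys (
            BitRateKey    varchar(10)  NOT NULL,
            SampleRateKey varchar(10)  NOT NULL,
            Description   varchar(100) NOT NULL
        );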

    Then again, we have no idea if column size was the problem, since you have not cared to share an error message or given us any details of what you wanted to do.

    This is a technical forum to get help with SQL Server.  It's not a place to come if all you want to do is express your frustration.

    Erland Sommarskog, SQL Server MVP, www.sommarskog.se

  • Don't worry about it.  I won't be coming back here for help, since all you guys want to do is make excuses for a tool that turns a simple data import into a Charlie Foxtrot.  The tool has defeated the purpose of a CSV file, plain and simple.  The files are properly formatted and contain ALL the necessary information to import the data.  Fields are separated by a comma, values are enclosed in quotes, and the file contains a header so that the fields in the file can be mapped to the fields in the table.  It should be as simple as: select the file, answer a few simple questions (quotes or no quotes, is the first row a field header), pick the target table, done.  But NO, SQL Server has to turn it into a science fair project.  Yes, I blame the tool.  You clowns just keep making excuses for it.

