Import CSV Data Into Existing Tables

  • Good - but do remember that in order to use a tool you need to learn how to use it. If you treat a utility like Data Import, which has only a small number of features, this way - never bothering to read about them or even play with them - then your travels in the world of SQL Server are bound to be disastrous. Hopefully your clients will see it before your actions break their systems.

  • Using the defaults with the wizard I received the following error attempting to import the tblAlbum.txt file:

    - Executing (Error)

    Messages

    Error 0xc02020a1: Data Flow Task 1: Data conversion failed. The data conversion for column "MediaPath" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".

    (SQL Server Import and Export Wizard)

    Error 0xc020902a: Data Flow Task 1: The "Source - tblAlbum_txt.Outputs[Flat File Source Output].Columns["MediaPath"]" failed because truncation occurred, and the truncation row disposition on "Source - tblAlbum_txt.Outputs[Flat File Source Output].Columns["MediaPath"]" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.

    (SQL Server Import and Export Wizard)

    Error 0xc0202092: Data Flow Task 1: An error occurred while processing file "C:\Users\lapet\OneDrive\Documents\ssc\csvimport\tblAlbum.txt" on data row 3.

    (SQL Server Import and Export Wizard)

    Error 0xc0047038: Data Flow Task 1: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on Source - tblAlbum_txt returned error code 0xC0202092. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.

    (SQL Server Import and Export Wizard)

     

    Looks like user error with the tool. When using the import wizard, it helps to adjust the lengths of the columns that are left at their default values so they fit the data.
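If it helps anyone hitting the same truncation error: a quick way to size the destination columns is to scan the file for the widest value in each column first. A minimal Python sketch (the sample data here is made up, standing in for tblAlbum.txt):

```python
import csv
import io

# Hypothetical sample standing in for tblAlbum.txt.
data = 'AlbumID,MediaPath\n1,"C:\\music\\a.mp3"\n2,"C:\\music\\some\\much\\longer\\path\\b.mp3"\n'

reader = csv.reader(io.StringIO(data))
header = next(reader)
widths = [0] * len(header)

# Track the longest value seen in each column.
for row in reader:
    for i, field in enumerate(row):
        widths[i] = max(widths[i], len(field))

report = dict(zip(header, widths))
print(report)  # {'AlbumID': 1, 'MediaPath': 36}
```

The reported widths tell you the minimum varchar lengths the destination table (or the wizard's column settings) must allow.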

     

  • FYI, I made changes in the import wizard and the data imported just fine. I do suggest that when you post a problem, whether here or on another site, you post the error message(s) you are getting. It isn't a problem with the tool. It is a matter of learning how to use the tool.

     

  • To beat this dead horse a little more: CSV may be an industry standard - and I think everyone here knows that. But while the CSV format may include column headers, it does not include information about data types. And when storing something in a relational database, you need a data type for every column.

    Certainly, one could think of a tool that applies heuristics and figures out the data types, but this is iffy. Say the file is several gigabytes - do you read the entire file first just to guess the data types? Or do you read only a couple of lines - and then the load crashes because the guess was wrong? And even if you read the entire file, what about the next file? Because often you import files into a table many times.
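To illustrate why sampling is iffy, here is a minimal Python sketch (the `guess_type` helper and the sample data are hypothetical): a guess based on the first two rows differs from a guess based on the whole file, so a load driven by the sampled guess would fail on row 3.

```python
import csv
import io

def guess_type(values):
    """Guess a SQL-ish type from sample values: int, then float, else varchar."""
    try:
        for v in values:
            int(v)
        return "int"
    except ValueError:
        pass
    try:
        for v in values:
            float(v)
        return "float"
    except ValueError:
        return "varchar"

data = "id,price\n1,9.99\n2,10.50\nA100,free\n"   # row 3 breaks both guesses
rows = list(csv.reader(io.StringIO(data)))
header, body = rows[0], rows[1:]

# Guess from the first two rows only (what a sampling wizard might do)...
sample_guess = [guess_type(col) for col in zip(*body[:2])]
# ...versus scanning the whole file.
full_guess = [guess_type(col) for col in zip(*body)]

print(sample_guess)  # ['int', 'float']
print(full_guess)    # ['varchar', 'varchar']
```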

    No, in practice, you need to know your schema. The import wizard may be helpful and give you suggestions, but you need to babysit it.

    Had you had an XML file with an XSD included, you would have had an argument.

    [font="Times New Roman"]Erland Sommarskog, SQL Server MVP, www.sommarskog.se[/font]

  • VSCurtis wrote:

    Don't worry about it.  I won't be coming back here for help since all you guys want to do is make excuses for a tool that turns a simple data import unto a Charlie Foxtrot.  The tool has defeated the purpose of a CSV file plain and simple.  The files are properly formatted and contain ALL the necessary information to import the data.  Fields are separated by a comma, fields are enclosed in quotes and the file contains a header so that the fields in the file can be mapped to the fields in the table.   It should be as simple as select file, answer a few simple questions, quotes, no quotes, is the first row a field header, target table done.  But NO, server has to turn it into a science fair project.  Yes I blame the tool.  You clowns just keep making excuses for it

    Heh... I sense some frustration here, yes?

    Ok.  First of all, I absolutely agree.  The tool that you're trying to use and a lot of the other tools in SQL Server absolutely suck when it comes to CSV.  It even says in the documentation that things as powerful as BULK INSERT were (notice the were) NOT designed to handle CSVs.  The tools are absolutely awesome for a whole lot of things but not TRUE CSVs.  People like me have been frustrated by Microsoft's inattention to CSVs for more than two decades.

    Yep... you can go a whole lot of places and find tools that will work.  They work especially well for one- or two-time manual jobs.  They might, if you're lucky, even have a CLI (command line interface, to be sure) where you can automate such imports.  Still, it's extra junk and it's outside of SQL Server.  That may or may not be a problem for you, but I hate having to keep a whole bunch of extra tools around just to do what should be a simple import of even an inconsistently formatted file like the second file you posted in your original post.

    I did tell you that this is easy in SQL Server, even though I knew it didn't use to be, for one and only one reason.  Your previous post was asking how to Migrate SQL Server 2012 Database To Server 2014 and Subsequent Versions.  Grant Fritchey made the recommendation that you shouldn't update to 2014 because it's actually no longer supported.  You didn't reply, but the title of your post does contain the words "AND SUBSEQUENT VERSIONS".

    So, I was assuming (I know... really bad thing to do) that you had updated to something more recent, because Microsoft finally fixed the CSV junk starting in 2017.

    Now... if you want to stop all the damned name calling and IF you've updated to 2017 and IF you've updated SSMS to the latest, as well, the stupid tool (yep... I agree... it was massively stupid for CSVs before) you were using now works just fine (if you've done the updates... which you apparently haven't). 😀

    But, let's say for a minute that it didn't work yet.  I agree that anything working with CSV files should be able to handle the inconsistency that you have in your second file, but it just flat out didn't prior to 2017.  Are you stuck with SQL Server prior to 2017?  If so, are you going to have to import more than 1 or 2 of these files more than, say, once per week?  Then maybe you should let one of us "clowns" help you take that pork chop necklace off while you're cleaning crap out of the lions den. 😉  The question is, are you interested in how to do this in T-SQL both before and after the nice "fixes" they made in 2017?  Or do you want to "stay stupid" just because you think you've seen a herd of clowns. 😀

    --Jeff Moden


    RBAR is pronounced "ree-bar" and is a "Modenism" for Row-By-Agonizing-Row.
    First step towards the paradigm shift of writing Set Based code:
    ________Stop thinking about what you want to do to a ROW... think, instead, of what you want to do to a COLUMN.
    "Change is inevitable... change for the better is not".

    Helpful Links:
    How to post code problems
    How to Post Performance Problems
    Create a Tally Function (fnTally)
    Intro to Tally Tables and Functions

  • Lynn Pettis wrote:

    FYI made changes in the import wizard and the data imported just fine. I do suggest that when you post a problem, whether here or another site, that you post the error message(s) you are getting. It isn't a problem with the tool. It is a matter of learning how to use the tool.

    Which version of SQL Server and SSMS were you using and what changes did you make to import the second file from the original post, which has the format inconsistencies?

    I ask because I have 2017 on my personal laptop (and didn't have to make any changes) and I can't test this on my 2016 boxes at work.  It'll also help others that may read this thread.

    --Jeff Moden



  • SQL Server 2019 and the latest version of SSMS. I changed the length of the output for the media_path column in the import and ensured the length in the destination table matched that length. I posted the error messages I got when I did a straight default import, making no changes. I just did what someone who didn't know how to use the tool would do.

    I haven't had to import csv files in a while, been finding myself actually importing JSON files.

    And with PowerShell it is a simple matter to convert a CSV file to a JSON file.
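For example (a hedged sketch, not Lynn's actual script): in PowerShell that's roughly `Import-Csv in.csv | ConvertTo-Json`, and the equivalent in Python is just as short - `csv.DictReader` keys each row by the header, much like `Import-Csv` does:

```python
import csv
import io
import json

# Hypothetical sample file contents.
csv_text = 'AlbumID,MediaPath\n"1","C:\\music\\a"\n"2","C:\\music\\b"\n'

# DictReader turns each CSV row into a dict keyed by the header row.
rows = list(csv.DictReader(io.StringIO(csv_text)))
json_text = json.dumps(rows, indent=2)
print(json_text)  # a JSON array of objects, one per CSV row
```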

     

  • Jeff Moden wrote:

    Lynn Pettis wrote:

    FYI made changes in the import wizard and the data imported just fine. I do suggest that when you post a problem, whether here or another site, that you post the error message(s) you are getting. It isn't a problem with the tool. It is a matter of learning how to use the tool.

    Which version of SQL Server and SSMS were you using and what changes did you make to import the second file from the original post, which has the format inconsistencies?

    I ask because I have 2017 on my personal laptop (and didn't have to make any changes) and I can't test this on my 2016 boxes at work.  It'll also help others that may read this thread.

    The Import Data tool is SSIS for all that matters (restricted in functionality, but still SSIS) - it has worked the same way with CSV files since version 2008 (although more code-page handling has been added).

    As pointed out before, there is a need to define the sizes and data types of the columns being imported - if this is not done, errors will occur.

    And unlike bcp in previous versions of SQL Server, it does handle correctly built CSV files without any issues.

  • Understood on the SSIS thing, but the OP wasn't using SSIS.  For that matter, I typically try to make it so that I never have to use SSIS anywhere.

    Having had to overcome the kind of problem that occurs in the 2nd file in the past, I just make a call to PowerShell (through xp_CmdShell, which is NOT the security issue everyone makes it out to be) to do an Import-Csv followed by an Export-Csv (I think those are right... haven't had to do it in a while).  That "normalizes" the CSV so that all the "fields" in the file are double-quoted.  From there, it's a piece of cake to import using BULK INSERT without a format file (though a lot faster with one), without having to do any of the manual stuff at all and without having to use SSIS either.
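The "normalize first" idea can be sketched outside PowerShell as well. Assuming Python's csv module stands in for the `Import-Csv | Export-Csv` round trip, `QUOTE_ALL` forces double quotes around every field, turning an inconsistently quoted file into a uniform one (the sample data is hypothetical):

```python
import csv
import io

# An inconsistently quoted file: some fields quoted, some not.
raw = 'id,path\n1,"C:\\a,b"\n2,C:\\c\n'

reader = csv.reader(io.StringIO(raw))
out = io.StringIO()

# QUOTE_ALL writes double quotes around every field, similar to what
# round-tripping through Import-Csv | Export-Csv produces.
writer = csv.writer(out, quoting=csv.QUOTE_ALL, lineterminator="\n")
writer.writerows(reader)

print(out.getvalue())
```

Once every field is consistently quoted, a bulk loader no longer has to cope with mixed quoting on a field-by-field basis.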

    It works for everything no matter the version of SQL Server since and including 2005.

    --Jeff Moden



  • Lynn Pettis wrote:

    Reply, SQL Server 2019 and the latest version of SSMS. I changed the length of the output for the media_path column in the import and ensured the length of the destination table matched that length. I posted the error messages I got when I did a straight default import making no changes. I just did what someone who didn't know how to use the tool would do it.

    I haven't had to import csv files in a while, been finding myself actually importing JSON files.

    And with PowerShell it is a simple matter to convert a CSV file to a JSON file.

    That's kind of what I thought... using something > 2016.  In 2017, I didn't have to make any changes at all... it created the table and the whole 9 yards all by itself.  The tool is much improved for CSVs as of 2017.

    I wonder what the OP was using.  Guess we'll never know.

    --Jeff Moden



  • I'm not sure why Jeff keeps talking about the version of SQL Server. We are not talking about BULK INSERT here.

    As I understood the original post, VSCurtis right-clicked the database in Object Explorer, selected Tasks and then "Import Flat File". This is more or less an SSMS-only feature - I don't think SSIS is involved here. But a modern version of SSMS helps, obviously. (This feature was introduced during the 17.x time frame, if memory serves.) I tested with SQL 2016, but it should work with SQL 2008 as well.

    I was able to import the files without any adjustment to the files. I had to adjust the data types, though. I selected "Use Rich Data Type Detection", which did not really do a great job. For many of the ID columns it selected tinyint, which is a bad choice when the IDs are of the form XNNN, where X is a letter and the Ns are digits. To confuse matters, for some of these ID columns it opted for nvarchar(50). In this mode, the selection for the MediaPath column was nvarchar(50).

    Even if the wizard could have done a better job, the moral is still that you need to understand your data types. And, yes, you will have to dance the foxtrot with Charlie S Vanderbildt, because there is no type information in CSV files.

    [font="Times New Roman"]Erland Sommarskog, SQL Server MVP, www.sommarskog.se[/font]

  • Jeff Moden wrote:

    Understood on the SSIS thing but the Op wasn't using SSIS.  For that matter, I typically try to make it so that I never have to use SSIS anywhere.


    He is using "Import Data" with a Flat File Source as the means of importing the data - this is SSIS for all intents and purposes.

    It will even create an SSIS package that can be executed on the server.
