Excel upload problem (datatype)

  • Hi,

I need to upload data from an Excel sheet to SQL Server 2005. For this I have created an SSIS package. The data is getting uploaded, but there is a problem with how it is uploaded.

I have a column in the Excel sheet which contains both alphanumeric and numeric values. I have defined the corresponding column on the database side as NVARCHAR(50). The SSIS package is uploading only the alphanumeric values. Wherever it encounters a numeric value, it gets uploaded as NULL.

Info: The Excel sheet has around 15 columns & 50,000 rows. The column I am referring to has around 45,000 alphanumeric records & around 5,000 rows with numeric values.

Please help me in this regard: how can I upload data from Excel without getting those NULL values into the DB?

Because of this, my report is being generated with wrong values, as I am losing nearly 5,000 rows (because of the NULL values in that column).

    Thanks in advance.

    svcet_it.

Can we see an example of the values that are getting imported, and some that are not getting imported?

I would convert the Excel doc to a tab-delimited file and go from there. Also, you probably don't need NVARCHAR; VARCHAR will do just fine. NVARCHAR is for Unicode character support.

  • Hi,

    I have had this problem recently and found it quite frustrating. Check out the following KB.

    http://support.microsoft.com/kb/194124/EN-US/

I have also seen a solution where a user has modified the spreadsheet to add a derived column which appends a character to the front of the column, forcing every row in the column to be a varchar.
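That derived column can be an ordinary worksheet formula. Something like this (assuming here that the mixed column is column A, and "X" is an arbitrary prefix character):

```
="X" & A2
```

Fill it down a helper column, import the helper column instead of the original, and strip the leading character afterwards on the SQL side (e.g. with SUBSTRING(col, 2, LEN(col))).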

Hope this helps

    Daniel

  • Hi Daniel Forrester & Eric Cogen,

    Thanks for your reply.

    Dear Daniel,

I have checked this link and done exactly what is written in that article, but it didn't work out. I even tried modifying the registry entry from TypeGuessRows=8 to TypeGuessRows=0, and that didn't work out either.

TypeGuessRows=0 tells Jet to look through all the rows of that column in the Excel sheet before deciding on the datatype.
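For reference, the entry I modified is here (on a default 32-bit install, for the Jet 4.0 provider):

```
HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Jet\4.0\Engines\Excel
    TypeGuessRows     (DWORD)  = 0
    ImportMixedTypes  (String) = Text
```

As far as I understand, the ImportMixedTypes=Text setting only takes effect when the connection string also has IMEX=1 in its Extended Properties, e.g. Extended Properties="Excel 8.0;HDR=YES;IMEX=1".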

This solved the problem for most of the Excel sheets, which I would put at 95%. But for the remaining files the problem still persists.

    If you have any other solution please let me know.

    Dear Eric Cogen,

As requested by you, here is a sample of the data in that column:

    100L200609301234

    100L200609301321

    100L200609301876

    100L200609301972

    80200609302122

    80200609302345

    80200609304556

    Please help me in this regard.

    Note:

The format of the column in the Excel sheet is "General". I changed it to Text and checked as well, but it made no difference.

    Thanks in advance.

    Regards,

    SVCET.

    IT Department.

  • svcet_it,

The values that you are importing from the Excel spreadsheet are the ones below, correct?

If so, then of course you are getting NULL values in your column. My guess would be that NULL is the default value for that column in your CREATE TABLE statement, and the data below bears this out: the first four rows have data for your column_1 and the next few do not. Here is what you should do.

    Modify your reporting script to handle nulls with the "COALESCE" function.

i.e.:

    select blah, blah2 from blahtable -- change this to

    select COALESCE(blah, 0) as blah, blah2 from blahtable

    100l200609301234

    100l200609301321

    100l200609301876

    100l200609301972

    80200609302122

    80200609302345

    80200609304556


    Kindest Regards,

    EC

  • Does your source have to be Excel? Can you save the Excel file to csv?

    Might save you time

  • Instead of importing the table/view (actually the sheet), you could try to use a SQL statement, which you can build using SSIS. The trick would be to convert your data to char on the Excel side:

     

SELECT CStr(<fieldname>) FROM [<sheet name>$]
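For example, the source query in the SSIS Excel source could look something like this (the sheet and column names here are placeholders for whatever is in the actual workbook; with HDR=NO, Jet names the columns F1, F2, and so on):

```sql
SELECT CStr(F2) AS ItemCode
FROM [Sheet1$]
```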

     

    Hope this helps.


    Kindest Regards,

    DoktorBlue

  • I've done a lot of this over the past year and have come to the conclusion that the Excel importing that SQL does is crap. You can't trust the data that ends up in SQL.

    I ended up overcoming this hurdle by importing a comma separated file using

SELECT BulkColumn FROM OPENROWSET(BULK N'<file>', SINGLE_CLOB) AS a

This is inserted into a table <A>. I then parse through the table and divide this CLOB into separate lines (looking for the CRLF control characters) into table <B>. Then I grab the column headers from the first line and parse through them to make sure they are what I am expecting, parse through each of the data lines, and combine the two into an INSERT statement that is executed with EXEC.
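A minimal sketch of that approach (the file path and table names are made up; the real version also needs the header check, the column parsing, and error handling):

```sql
-- Pull the whole file into a single value
CREATE TABLE #RawFile (FileContents VARCHAR(MAX));

INSERT INTO #RawFile (FileContents)
SELECT BulkColumn
FROM OPENROWSET(BULK N'C:\import\data.csv', SINGLE_CLOB) AS a;

-- Split the CLOB into lines on CRLF
CREATE TABLE #Lines (LineNo INT IDENTITY(1, 1), LineText VARCHAR(MAX));

DECLARE @clob VARCHAR(MAX), @pos INT, @next INT;
SELECT @clob = FileContents FROM #RawFile;

SET @pos = 1;
SET @next = CHARINDEX(CHAR(13) + CHAR(10), @clob, @pos);

WHILE @next > 0
BEGIN
    INSERT INTO #Lines (LineText)
    VALUES (SUBSTRING(@clob, @pos, @next - @pos));

    SET @pos = @next + 2;  -- skip past the CRLF
    SET @next = CHARINDEX(CHAR(13) + CHAR(10), @clob, @pos);
END;

-- Keep any trailing text after the last CRLF
IF @pos <= LEN(@clob)
    INSERT INTO #Lines (LineText)
    VALUES (SUBSTRING(@clob, @pos, LEN(@clob) - @pos + 1));
```

From there, row 1 of #Lines holds the headers and the rest are data rows to be split on commas and inserted.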

My frustration with your method was that data would get left out with no indication. Say I had 1,000 rows of numbers and a couple of rows of strings. Even though I told it not to look at every row, this did not help.

I also had problems trying to determine the errors.

While this solution is slower, the results have been great, as I have complete control over the process.

Importing Excel files has given me more pain over the years than I care to remember. Save the Excel file to a csv and save yourself some pain.


    Cheers,

    Diane
