BCP in fails on files larger than 32 MB

  • I am currently experiencing the following problem. Has anyone else had it?

    I have a table with several char and integer fields, and with a single text field.

I use BCP to output the data in native format. When I attempt to load the data back in, if the source data file referenced by the BCP IN is larger than 32,767 KB, the last record fails to load, regardless of first-row or last-row settings.

I have tried using the BCP.EXE from both SQL 7 and SQL 2000, with the same erroneous results.
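For reference, the commands are of this general form (the table name here is a placeholder, and -n selects native format):

    bcp mydb..mytable out mytable.dat -n -U sa -S <server> -P <pword>
    bcp mydb..mytable in mytable.dat -n -U sa -S <server> -P <pword>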

  • Does it give any error? What exactly occurs?

    "Don't roll your eyes at me. I will tape them in place." (Teacher on Boston Public)

  • The error message is:

    ---

SQLState = 37000, NativeError = 4813

Error = [Microsoft][ODBC SQL Server Driver][SQL Server]Expected the text length in data stream for bulk copy of text, ntext, or image data.

BCP copy in failed

    ---

The text field is the last field in the table definition. There are multiple indexes on the table.

  • Here is a set of steps used to replicate the problem.

    It appears the problem only exists under the following conditions.

- The text field is the LAST field in the table definition. For comparison, swap the order of the row_id and pr_info fields in the create table definition.

- The last record in the out file does NOT have anything in the text field. For comparison, remove the update statement in step 1 of the script below.

/* start of script */

-- 1. create a large table, so the BCP out file is more than 32 MB
if exists (select 1 from sysobjects where name = 'pub_info_large')
    drop table pub_info_large

create table pub_info_large
( pub_id char(4),
  row_id integer identity(1,1),
  pr_info text
)

alter table pub_info_large add constraint PK1_PUB_INFO primary key (row_id)

-- seed from pubs..pub_info, then keep doubling the row count
insert pub_info_large (pub_id, pr_info)
select pub_id, pr_info from pub_info

while @@ROWCOUNT < 2048 begin
    insert pub_info_large (pub_id, pr_info)
    select pub_id, pr_info from pub_info_large
end

-- empty the text field on the last record
update pub_info_large
set pr_info = ''
where row_id = (select max(row_id) from pub_info_large)

/* end of script step 1 */

-- 2. bcp out the data
/*
bcp pubs..pub_info_large out pubinfo.out -n -U sa -S <server> -P <pword>
*/

-- 3. truncate the table to load it back in
truncate table pub_info_large

-- 4. run bcp to load the data back in
/*
bcp pubs..pub_info_large in pubinfo.out -n -U sa -S <server> -P <pword>
*/
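A quick check, not part of the original repro: if the error reproduces, comparing the row count after step 4 with the count before the truncate should show the table came back short.

    select count(*) from pub_info_large -- run before step 3 and again after step 4, then compare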

I would appreciate it if others could confirm that they get the same error using the code above.

    Robert

*smile - it's infectious*

  • Hi again,

I forgot to mention that the script should be run while in the pubs database; it grabs its seed data from the pub_info table there.

    Robert

  • OK, I'll try it tomorrow and see what happens and what I can figure out.

    "Don't roll your eyes at me. I will tape them in place." (Teacher on Boston Public)

  • Strangest thing I have ever seen. I can reproduce it from this, which ends up with 4095 rows, but the file outs with 4096. When I decreased the number of rows to 2000 things were fine, and when I increased to 4192 things were also fine. I looked at the bad file in binary format in C++ and found these extra items at the end:

    Hex "04 00 39 39 39 00 10 00 00 00 00 00 00"

    Text "9999"

    (You may not be able to see this.)

I will have to collect this and pass it to Microsoft, but I found a workaround. Do a -L with a number to specify the number of the last row (the number of rows in your table, since it added 1 for some reason) and it will work fine. It seems like the data stream ends abruptly in the wrong place.
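For example, against the repro above (assuming the table ended up with 4095 rows; substitute your own row count):

    bcp pubs..pub_info_large in pubinfo.out -n -L 4095 -U sa -S <server> -P <pword>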

    Hope this helps, and sorry it took so long.

    "Don't roll your eyes at me. I will tape them in place." (Teacher on Boston Public)


  • My workaround was to output the whole table, then do a separate output with only the last record using -L n.

    When loading it back, use -L n-1 on the first output, where n is the number of records.

    However, this requires me to know how many records are in the table before performing the action.
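As a sketch of that procedure (the file names are hypothetical, n is assumed to be 4096 here, and -F is bcp's first-row switch):

    select count(*) from pub_info_large -- find n, the number of records

    bcp pubs..pub_info_large out pubinfo_all.out -n -U sa -S <server> -P <pword>
    bcp pubs..pub_info_large out pubinfo_last.out -n -F 4096 -L 4096 -U sa -S <server> -P <pword>

    bcp pubs..pub_info_large in pubinfo_all.out -n -L 4095 -U sa -S <server> -P <pword>
    bcp pubs..pub_info_large in pubinfo_last.out -n -U sa -S <server> -P <pword>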

When you extended to 4192 records, did you remove the notes from the text field for the last record in the table?

  • I changed nothing; I just extended the number of rows, so every row output with data in all fields.

    "Don't roll your eyes at me. I will tape them in place." (Teacher on Boston Public)

  • I have now found a Microsoft KB article that describes a similar condition, but for DTS.

    Found it whilst scanning the kbAlertz.com newsletter of 25th Mar 2002.

    Q259304 - FIX: DTS Fails to Import Last Row If There is No Last Row Terminator

    http://support.microsoft.com/default.aspx?scid=kb;EN-US;Q259304
