Insert bulk failed due to a schema change of the target table.

  • haiao2000 (2013-07-15)


    I think you are absolutely right, especially on #4. I know for a fact that our application has a built-in auto-index feature that checks the table periodically and rebuilds the indexes if it finds it necessary.

    So, if you had said this from the start, we would have had the answer directly. This is exactly what kills you.

    In the constructor of the SqlBulkCopy class, add the option SqlBulkCopyOptions.TableLock. That will keep that nasty auto-indexer out of your table. (Or more precisely, it will be blocked until you are done.)

    Erland Sommarskog, SQL Server MVP, www.sommarskog.se

  • Erland Sommarskog (2013-07-15)


    haiao2000 (2013-07-15)


    I think you are absolutely right, especially on #4. I know for a fact that our application has a built-in auto-index feature that checks the table periodically and rebuilds the indexes if it finds it necessary.

    So, if you had said this from the start, we would have had the answer directly. This is exactly what kills you.

    In the constructor of the SqlBulkCopy class, add the option SqlBulkCopyOptions.TableLock. That will keep that nasty auto-indexer out of your table. (Or more precisely, it will be blocked until you are done.)

    Ah, sorry, I didn't know what kind of monster I was after :-). Does it take care of full-text index stuff as well? I will try this soon and let you know, because dropping the indexes does not seem to be a good solution here: during the import there are lots of instances where the app also runs SELECT statements, and without those indexes present, performance goes down the toilet. In other words, we gain on the insert but lose it back on the select.

    Thanks again!

  • I set both SqlBulkCopyOptions.TableLock and SqlBulkCopyOptions.CheckConstraints, but I am still getting the same issue.

    Anything else you can think of? Maybe I did not set TableLock correctly? Should it be:

    Dim sqlBulk As New SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock Or SqlBulkCopyOptions.CheckConstraints)

    instead of

    Dim sqlBulk As New SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock And SqlBulkCopyOptions.CheckConstraints)

    I used the bitwise And; I guess that may be why it didn't work. I have changed it to Or and will see if it works.

    System.Data.SqlClient.SqlException: Insert bulk failed due to a schema change of the target table.

    at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)

    at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)

    at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)

    at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)

    at System.Data.SqlClient.SqlBulkCopy.WriteToServerInternal()

    at System.Data.SqlClient.SqlBulkCopy.WriteRowSourceToServer(Int32 columnCount)

    at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table, DataRowState rowState)

    at System.Data.SqlClient.SqlBulkCopy.WriteToServer(DataTable table)

    ***UPDATED: Note: I added SqlBulkCopyOptions.TableLock but still run into the error.***

  • You need bitwise or, that is the | operator, if I remember the C# syntax correctly.

    Then again, if there are queries running against the table while you're loading, they will be blocked. And these queries generate autostats.

    I think you should investigate the BatchSize property of the SqlBulkCopy object instead. Set it to 10000 or so; then the data will be committed after every 10000 rows. This means that you will need to consider the case where you are reloading a file because the first load was interrupted halfway through.
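
    Something along these lines, as a rough sketch (dbo.TargetTable and sourceTable are placeholders for your own table name and DataTable):

    ' Assumes Imports System.Data and Imports System.Data.SqlClient.
    ' TableLock = 4 and CheckConstraints = 2, so Or gives both flags (6),
    ' while And gives 0 = Default, i.e. no options at all.
    Dim copyOptions As SqlBulkCopyOptions = SqlBulkCopyOptions.TableLock Or SqlBulkCopyOptions.CheckConstraints

    Using bulk As New SqlBulkCopy(connectionString, copyOptions)
        bulk.DestinationTableName = "dbo.TargetTable" ' placeholder target table
        bulk.BatchSize = 10000                        ' commit after every 10000 rows
        bulk.WriteToServer(sourceTable)               ' sourceTable is your DataTable
    End Using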

    Erland Sommarskog, SQL Server MVP, www.sommarskog.se

  • Erland Sommarskog (2013-07-16)


    You need bitwise or, that is the | operator, if I remember the C# syntax correctly.

    Then again, if there are queries running against the table while you're loading, they will be blocked. And these queries generate autostats.

    I think you should investigate the BatchSize property of the SqlBulkCopy object instead. Set it to 10000 or so; then the data will be committed after every 10000 rows. This means that you will need to consider the case where you are reloading a file because the first load was interrupted halfway through.

    Thank you, I will set the BatchSize property. Could you explain what you meant by reloading a file? If there are 20,000 records to copy and the batch size is set to 10,000, doesn't SQL automatically split it into 2 batches?

  • When 15000 rows have been loaded, someone trips on the network wire and the client is disconnected. With no BatchSize set, all 15000 rows are rolled back. With a batch size of 10000, only 5000 rows are rolled back. If you just redo the file load, you will either get a primary-key error, or worse: duplicate data.

    Erland Sommarskog, SQL Server MVP, www.sommarskog.se

  • Erland Sommarskog (2013-07-16)


    When 15000 rows have been loaded, someone trips on the network wire and the client is disconnected. With no BatchSize set, all 15000 rows are rolled back. With a batch size of 10000, only 5000 rows are rolled back. If you just redo the file load, you will either get a primary-key error, or worse: duplicate data.

    Thanks! I learned another new thing. In that case our application already has a way to take care of this issue, as it is not a straight file-to-database copy; it does lots of validation prior to inserting, so all records being inserted have already been validated.

    One more question: if I know for a fact that each bulk copy batch usually has no more than 100 records, is it still necessary to set the batch size?

  • Good morning!

    It seems the SQL bulk copy failure has a lot to do with my auto-reindex stored procedure. Our application calls this stored procedure every 20 minutes or so to detect fragmented indexes; if any are found, it then calls the same stored procedure to rebuild them.

    I put together this SP a while back, though I am unsure its performance is optimized, especially with the cursor being used. Per your suggestion, I set SqlBulkCopyOptions to TableLock in my SqlBulkCopy, so my assumption was that it would keep the index rebuilding away from the table until the table is released, but it didn't; it still seems to interfere with the SqlBulkCopy. Do you think an additional condition should be added to check for and avoid a locked table?

    Could you please shed some light on this? Could you also see if you can help optimize this SP? Our application is very data-intensive, so any improvement would be a great addition.

    Thanks a lot in advance!

    **Please download the attached stored procedure.**

  • haiao2000 (2013-07-17)


    One more question: if I know for a fact that each bulk copy batch usually has no more than 100 records, is it still necessary to set the batch size?

    No, but I'm getting confused. As I understood it, you were trying to load millions of rows, and now it's only 100?

    I put together this SP a while back, though I am unsure its performance is optimized, especially with the cursor being used.

    For that type of procedure, it's typical to use a cursor, since you can only rebuild/reorganise one table/index at a time.

    Per your suggestion, I set SqlBulkCopyOptions to TableLock in my SqlBulkCopy, so my assumption was that it would keep the index rebuilding away from the table until the table is released, but it didn't; it still seems to interfere with the SqlBulkCopy. Do you think an additional condition should be added to check for and avoid a locked table?

    You run with ONLINE=ON, but REBUILD takes a schema lock at the start and the end, so it should be blocked. In any case, since the table lock blocks other queries, and you indicated that users want to query the table while you are entering data, the table lock does not seem like a good idea. That was why I suggested setting the batch size. Have you tried this yet?

    Erland Sommarskog, SQL Server MVP, www.sommarskog.se

  • No, but I'm getting confused. As I understood it, you were trying to load millions of rows, and now it's only 100?

    Let me try to explain: the feed to our app consists of a project-type structure; it has directories, subdirectories and many files of various file types. Only the app can understand what to do with those files, based on a set of rules. That being said, there may be thousands of files, thousands of objects, and millions of properties associated with them to process in each file. The app reads these files and their relationships and determines which records should be written to which tables in the database.

    ...say there are 10,000,000 objects to process, and each object may contain around 100 properties. The .NET code extracts these 100 property records into a source DataTable, then for each object it uses SqlBulkCopy to write the source DataTable to SQL Server.
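
    Roughly, that loop looks like this (ImportObject, ExtractProperties, objectsToProcess and dbo.ObjectProperties are made-up names, just for illustration):

    For Each obj As ImportObject In objectsToProcess
        ' Pull the ~100 property rows for this object into a DataTable.
        Dim sourceTable As DataTable = ExtractProperties(obj)
        Using bulk As New SqlBulkCopy(connectionString, SqlBulkCopyOptions.TableLock Or SqlBulkCopyOptions.CheckConstraints)
            bulk.DestinationTableName = "dbo.ObjectProperties"
            bulk.WriteToServer(sourceTable)
        End Using
    Next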

    I did try regular SQL batch update queries before, but performance was still very poor and I kept getting SQL timeout errors as too much data got shoveled to the DB. That is the main reason why I was exploring SqlBulkCopy.

    I did try to set BatchSize as you suggested and let it run through the night; it will take more than 16 hours before I can determine if it is good or bad :-). Part of that is because the desktop it is running on isn't that powerful and only has 4 GB of RAM. 🙂

    ***UPDATED: I set the batch size, and it killed performance; it runs extremely slowly for some reason.***

  • Not sure if you are still having this issue. I recently ran into the same error, and it appears to be related to a DB maintenance job that was running while the bulk copy was executing. We disabled the maintenance job that was rebuilding indexes and updating statistics, and it seems to be working now.
