Problems with my bulk insert

  • Hi Professionals.

    I am running a PHP/HTML page that basically lets a user select a CSV file and then loads it into a dynamically created table based on the columns from the CSV file.

    When the bulk insert runs in the backend, the web browser just hangs. After doing some analysis with sp_who2 and similar tools, I have noticed that it is the bulk insert operation that seems to be causing the problem.

    There are a few million rows, but it only ever inserts exactly 500,000 of them, so my question is: does anyone know where I could be going wrong?
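
    For reference, what I ran while it was hanging was roughly this, from a second connection (the NOLOCK count is only a rough progress check, since it reads uncommitted rows):

    EXEC sp_who2;  -- check the Status and BlkBy columns for the SPID doing the bulk insert
    SELECT COUNT(*) FROM newtable WITH (NOLOCK);  -- approximate rows loaded so far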

    Here is my bulk insert code:

    USE [TestData]
    GO

    /****** Object: StoredProcedure [dbo].[bulkinsert] Script Date: 10/25/2013 11:36:12 ******/
    SET ANSI_NULLS ON
    GO
    SET QUOTED_IDENTIFIER ON
    GO

    /* Passed-in variable from the PHP script */
    ALTER PROCEDURE [dbo].[bulkinsert]
        @importedfile nvarchar(50) /* the CSV filename, e.g. test.csv */
    AS
    BEGIN
        DECLARE @cmd nvarchar(1000); /* holds the dynamic BULK INSERT statement */

        /* Build the BULK INSERT dynamically because the filename is only known at run time */
        SET @cmd = 'BULK INSERT newtable
                    FROM ''C:\inetpub\wwwroot\uploads\' + @importedfile + '''
                    WITH ( FIRSTROW = 2,
                           ROWS_PER_BATCH = 2500,
                           DATAFILETYPE = ''char'',
                           FIELDTERMINATOR = '','',
                           ROWTERMINATOR = ''\n'' )'; /* newline row terminator (assumed) */

        EXEC(@cmd);
    END
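
    For reference, with @importedfile = 'test.csv' (and assuming a newline row terminator), the statement that ends up being executed expands to:

    BULK INSERT newtable
    FROM 'C:\inetpub\wwwroot\uploads\test.csv'
    WITH ( FIRSTROW = 2,
           ROWS_PER_BATCH = 2500,
           DATAFILETYPE = 'char',
           FIELDTERMINATOR = ',',
           ROWTERMINATOR = '\n' )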

    thanks in advance

  • I think there is something incorrect in your text file at that particular row. I just created a table with half a billion rows, exported it into a text file, and used BULK INSERT to pump the data into another table using the code that you posted, and it worked well; all rows were pumped in. I did change ROWS_PER_BATCH to 10,000.
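
    If the file really does have a problem around that row, one way to pin it down (a sketch; the error-file path is just an example, must not already exist, and the SQL Server service account needs write access to it) is to let BULK INSERT log the rejected rows instead of stopping:

    -- Tolerate up to 10 bad rows; each rejected row is written to the ERRORFILE,
    -- and a companion .Error.Txt file describes why each one was rejected.
    BULK INSERT newtable
    FROM 'C:\inetpub\wwwroot\uploads\test.csv'
    WITH ( FIRSTROW = 2,
           FIELDTERMINATOR = ',',
           ROWTERMINATOR = '\n',
           MAXERRORS = 10,
           ERRORFILE = 'C:\temp\bulk_errors.log' );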

    Andrew SQLDBA
