DTS Nightmare!

  • I am having tons of problems importing data from a very large CSV file (64,000,000 records). I keep getting errors like "tempdb full" or "database full" (for the database I am trying to import into).

    I thought the maximum number of records per table in SQL Server was around 2 billion. I have also tried shrinking the transaction logs in both tempdb and the database in question, and that doesn't help.
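
    For reference, this is what I mean by shrinking the logs, as a sketch (the logical log file name 'database_log' is a guess; sp_helpfile shows the real names):

        -- Switch to the database, truncate the inactive part of the log
        -- (SQL Server 2000 syntax), then release the freed space to the OS.
        USE [database];
        BACKUP LOG [database] WITH TRUNCATE_ONLY;
        DBCC SHRINKFILE ('database_log');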

    Any thoughts?

    In addition, I am trying to import data from two mirrored tables in two other databases into the same database I am importing the CSV data into. I keep getting a "table lock" error, since the source databases are highly active, receiving around 100 INSERTs per minute, 24-7.

    Here is a rough text diagram of what I am trying to accomplish:

    machine1: database.dbo.table1

    machine2a: database.dbo.table1

    machine2b: database2.dbo.table1

    machine1 => machine2b

    [append and then delete records in machine1]

    machine2a => machine2b

    [append and then delete records in machine2a]
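
    In rough T-SQL, one leg of the move (machine1 => machine2b) would look something like the sketch below; MACHINE2B is an assumed linked-server name, and the tables are assumed to share a schema:

        -- Run on machine1: copy the rows across, then remove them locally.
        -- Done in one shot like this, it holds locks and grows the log,
        -- which is exactly the trouble described above.
        INSERT INTO MACHINE2B.database2.dbo.table1
        SELECT * FROM [database].dbo.table1;

        DELETE FROM [database].dbo.table1;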

    Hope this makes sense.

    If anyone has suggestions for accomplishing this, I'd really appreciate it.

    Thanks in advance!

  • Are you doing this in one transaction? You can set the batch size to commit (in DTS, the "Insert batch size" option on the Transform Data task), so rows are committed in chunks instead of one giant transaction.
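
    The same idea outside DTS, as a BULK INSERT sketch (the file path, terminators, and batch size here are assumptions; adjust them to your file):

        -- BATCHSIZE commits every 50,000 rows, so log space can be reused
        -- between batches instead of one 64M-row transaction filling it.
        -- TABLOCK takes a bulk-load table lock on the target for speed.
        BULK INSERT dbo.table1
        FROM 'C:\data\bigfile.csv'
        WITH (
            FIELDTERMINATOR = ',',
            ROWTERMINATOR = '\n',
            BATCHSIZE = 50000,
            TABLOCK
        );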

    Steve Jones

    sjones@sqlservercentral.com

    http://qa.sqlservercentral.com/columnists/sjones

  • This may be stating the obvious, but while SQL Server can handle very large databases, you are still limited by the available physical disk space. How big is your hard drive?
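
    A quick way to check from Query Analyzer, as a sketch (run it in the target database):

        EXEC sp_spaceused;        -- database size and unallocated space
        EXEC sp_helpfile;         -- data and log file sizes and paths
        EXEC sp_helpdb 'tempdb';  -- tempdb file sizes and locations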
