June 8, 2012 at 1:18 pm
This is more of a general question,
I just completed a 6-hour process of inserting 255,000 rows into a table. I had issues just trying to open the insert script file, getting out-of-memory and out-of-storage errors, and I had to break the file up into chunks of about 20k records apiece.
Can you not do an insert of that many records on SQL 2008? The issue I will face is that I have to create an SSIS package to import this many records into one table, and it will have to run every night. Will I have any issues, and if so, how can I get around them?
June 8, 2012 at 1:22 pm
I'd look at either BCP or Bulk Insert for this. It would be a lot faster also. Where is the data coming from?
June 8, 2012 at 1:24 pm
It's coming from Oracle. I'm importing it to SQL for dashboards that will be viewed via SharePoint.
How would Bulk Insert work via SSIS? I've used it via C# code from a web app, but I haven't used it in SSIS yet.
What is BCP?
June 8, 2012 at 1:29 pm
BCP is the command-line bulk insertion utility. It can be called like any other OS command-line executable in SSIS or a batch file. If you can pull the data from Oracle into a file with a predictable format (comma or '|' delimited, for example, or fixed-width columns), the BCP utility can pump it into a SQL Server table very efficiently.
Link to more info on the MS site:
http://msdn.microsoft.com/en-us/library/ms162802(v=sql.105).aspx
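A minimal sketch of what that could look like. The server, database, table, and file names below are placeholders, not anything from this thread:

```shell
REM Hypothetical example: bulk-load a pipe-delimited Oracle export
REM into a SQL Server table. All names here are placeholders.

REM BCP from the command line (callable from an SSIS Execute Process task
REM or a nightly batch file):
REM   -S = server, -T = trusted (Windows) auth, -c = character data,
REM   -t = field terminator, -b = commit in batches of 10,000 rows
bcp MyDatabase.dbo.DashboardData in C:\exports\oracle_dump.txt ^
    -S MYSQLSERVER -T -c -t "|" -b 10000

REM Roughly equivalent T-SQL BULK INSERT, runnable from an SSIS
REM Execute SQL task instead of shelling out to bcp:
REM   BULK INSERT dbo.DashboardData
REM   FROM 'C:\exports\oracle_dump.txt'
REM   WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '\n', BATCHSIZE = 10000);
```

The `-b`/`BATCHSIZE` option commits in smaller transactions, which avoids the kind of resource errors you hit when running one giant insert script.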
June 8, 2012 at 1:36 pm
Thanks, I'll check it out.