SSIS Buffer Manager problem

  • I'm developing and running a rather simple package in SSIS which imports data from a flat file, does some transformations, and loads the data to an Oracle table. All is well until I go above a certain threshold of approx 32,00 records, at which time the package stops loading the Oracle table and I receive warning/error messages as follows:

    [DTS.Pipeline] Information: The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 30 buffers were considered and 30 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked.

    [IMPORT_PRODUCT_200 [169]] Error: An OLE DB error has occurred. Error code: 0x8007000E. An OLE DB record is available.  Source: "Microsoft Cursor Engine"  Hresult: 0x8007000E  Description: "Out of memory.".

    [DTS.Pipeline] Error: The ProcessInput method on component "IMPORT_PRODUCT_200" (169) failed with error code 0xC0202009. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.

    [Flat File Source [186]] Error: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

    [DTS.Pipeline] Error: The PrimeOutput method on component "Flat File Source" (186) returned error code 0xC02020C4.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.

    Granted, the record lengths in the flat file are large (there are over 300 fields), but shouldn't SSIS be able to handle large volumes? I have 2 GB of RAM on my development machine. Can anyone offer a solution? I need to be able to process up to 3 or 4 million records.

     

  • What's the width of the column?

    How many is 32,00? Is it three thousand two hundred?

    -Jamie

     

  • Sorry, I meant it is dying after I process 32,000 records. The flat file is variable length and has 111 fields. Average record size is about 400 bytes. The Oracle table I'm loading has 206 columns, 200 of which are varchar2(256).

  • The average size of the record in the source file is irrelevant in this context. What is the width of a row when it is in the SSIS pipeline?

    If the average length is 400, then I'm making a wild guess that the length of a row in the SSIS pipeline is 500 bytes. A crude calculation (sketched in code after this post) indicates that you'll need about 16 MB of memory to process this ((32,000 * 500) / 1,000,000).

    In other words - something else is afoot here. Can you check memory utilization using Task Manager or Performance Monitor?

    Regards

    Jamie
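
    For reference, here is that back-of-the-envelope calculation written out as a tiny Python sketch (purely illustrative; the 500-byte pipeline row width is the guess from the post above, not a measured value):

    # Rough estimate of the data volume involved in the failing load, using the
    # guessed pipeline row width of 500 bytes. The real width depends on the
    # data types SSIS assigns to each column.
    rows = 32_000            # records processed before the failure
    row_width_bytes = 500    # guessed width of one row in the SSIS pipeline

    total_bytes = rows * row_width_bytes
    print(f"~{total_bytes / 1_000_000:.0f} MB of row data")   # ~16 MB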

     

  • Jamie, are you implying that SSIS reads in the entire source at one time and processes it all at once rather than bringing in the source in chunks? 

  • Carol,

    Absolutely not. SSIS processes data in the pipeline within memory buffers. The size of each buffer is determined at runtime, and as many buffers as are required are created in order to process all the data.

    Information here about how SSIS determines the size of the buffers: http://www.microsoft.com/technet/prodtechnol/sql/2005/ssisperf.mspx (a toy illustration of the buffered approach is sketched after this post).

    -Jamie

    P.S. That's easily the best whitepaper I've read in regard to SSIS. Everyone should read it.
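
    To make the "buffers, not a single whole-file load" point concrete, here is a toy Python sketch of the idea (not SSIS itself; the file name products.txt, the '|' delimiter, and the 10,000-row buffer size are made up for the example):

    # Toy illustration of pipeline-style buffering: rows stream through
    # fixed-size buffers instead of the whole file being read into memory.
    BUFFER_ROWS = 10_000     # made-up buffer size for the example

    def buffered_rows(path, buffer_rows=BUFFER_ROWS):
        """Yield lists of at most buffer_rows parsed rows at a time."""
        buffer = []
        with open(path, encoding="utf-8") as src:
            for line in src:
                buffer.append(line.rstrip("\n").split("|"))   # parse one row
                if len(buffer) == buffer_rows:
                    yield buffer      # hand a full buffer downstream
                    buffer = []
        if buffer:
            yield buffer              # final, partially filled buffer

    total = 0
    for chunk in buffered_rows("products.txt"):
        total += len(chunk)           # stand-in for the transformation/load steps
    print(f"processed {total:,} rows one buffer at a time")

    Only one buffer's worth of rows is ever materialised by this toy loop; SSIS itself juggles several buffers at once (the error above mentions 30 of them), but the principle that the whole source never has to fit in memory is the same.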

     

  • Jamie Thomson (5/1/2006)


    Carol,

    Absolutely not. SSIS processes data in the pipeline within memory buffers. The size of each buffer is determined at runtime, and as many buffers as are required are created in order to process all the data.

    Information here about how SSIS determines the size of the buffers:

    -Jamie

    P.S. That's easily the best whitepaper I've read in regard to SSIS. Everyone should read it.

    At the risk of getting flamed for reviving a dead thread, I've got a similar problem, although my scale is a lot larger.

    I have a stored procedure OLE DB source. I have calculated a row width of 340 bytes from the SP. I have 13.5 million source rows for THAT data flow. Based on my calculation, that equates to approximately 4.37 GB(?), which is a factor of about 43 too large, which implies my buffers will be sized for only 232 rows?

    Does that mean I should change my DefaultMaxBufferRows down from the default 10,000 to 232 to improve performance and reduce the number of 'DTS.Pipeline' buffer information warnings?

    "[DTS.Pipeline] Information: The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 32 buffers were considered and 32 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked."

    I've based this on the section from that SSISperfTuning.doc you pointed out:

    For example, if SSIS calculates an Estimated Row Size of 15,000 bytes per record, then the anticipated buffer size would be calculated as 15,000 bytes/record * 10,000 records, which is approximately 143 MB and about 1.5 times greater than the DefaultMaxBufferSize of 100 MB. Because the anticipated size exceeds the DefaultMaxBufferSize, SSIS reduces the number of records per buffer approximately by a factor of 1.5 to get below the 100 MB threshold. In this scenario, each buffer is sized to hold approximately 6,600 records.

    PS: I see above that you upped an estimated 400 bytes to 500 bytes, Jamie? Does that mean I should assume a 25% growth on my row size?

  • Regan Galbraith (2/29/2008)


    Jamie Thomson (5/1/2006)


    Carol,

    Absolutely not. SSIS processes data in the pipeline within memory buffers. The size of each buffer is determined at runtime, and as many buffers as are required are created in order to process all the data.

    Information here about how SSIS determines the size of the buffers:

    -Jamie

    P.S. That's easily the best whitepaper I've read in regard to SSIS. Everyone should read it.

    At the risk of getting flamed for reviving a dead thread, I've got a similar problem, although my scale is a lot larger.

    Don't worry, you won't get flamed for that!

    I have a stored procedure OLE DB source. I have calculated a row width of 340 bytes from the SP. I have 13.5 million source rows for THAT data flow. Based on my calculation, that equates to approximately 4.37 GB(?), which is a factor of about 43 too large, which implies my buffers will be sized for only 232 rows?

    Does that mean I should change my DefaultMaxBufferRows down from the default 10,000 to 232 to improve performance and reduce the number of 'DTS.Pipeline' buffer information warnings?

    "[DTS.Pipeline] Information: The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 32 buffers were considered and 32 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked."

    I've based this on the section from that SSISperfTuning.doc you pointed out:

    For example, if SSIS calculates an Estimated Row Size of 15,000 bytes per record, then the anticipated buffer size would be calculated as 15,000 bytes/record * 10,000 records, which is approximately 143 MB and about 1.5 times greater than the DefaultMaxBufferSize of 100 MB. Because the anticipated size exceeds the DefaultMaxBufferSize, SSIS reduces the number of records per buffer approximately by a factor of 1.5 to get below the 100 MB threshold. In this scenario, each buffer is sized to hold approximately 6,600 records.

    *Generally* speaking, upping the size of the buffer will be beneficial, but that's not always the case. The only thing I can suggest is to test and measure, and see what works best; the sizing rule from the passage you quoted is sketched in code after this post.

    PS: I see above that you upped an estimated 400 bytes to 500 bytes, Jamie? Does that mean I should assume a 25% growth on my row size?

    No, ignore that.
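
    For what it's worth, the sizing rule in the quoted whitepaper passage can be written out as a small Python sketch (an approximation only; it assumes the 10,000-row default and the 100 MB ceiling exactly as quoted, and treats the byte figures as rough):

    # Buffer sizing as described in the quoted passage: start from the default
    # row count and shrink it if the anticipated buffer would exceed the ceiling.
    DEFAULT_MAX_BUFFER_ROWS = 10_000       # default rows per buffer, per the quote
    MAX_BUFFER_BYTES = 100_000_000         # the 100 MB ceiling, per the quote

    def rows_per_buffer(estimated_row_bytes):
        anticipated = estimated_row_bytes * DEFAULT_MAX_BUFFER_ROWS
        if anticipated <= MAX_BUFFER_BYTES:
            return DEFAULT_MAX_BUFFER_ROWS
        return MAX_BUFFER_BYTES // estimated_row_bytes

    # Whitepaper example: 15,000-byte rows come out near the quoted "approximately 6,600".
    print(rows_per_buffer(15_000))   # 6666

    # Regan's case: 340-byte rows -> 10,000 rows per buffer is only ~3.4 MB,
    # so the default row count is never reduced.
    print(rows_per_buffer(340))      # 10000

    On this reading of the quoted passage, the 4.37 GB figure describes the whole 13.5-million-row data set spread across many buffers, not the size of any single buffer, so it is the per-buffer footprint rather than the total that the sizing rule constrains.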

  • Has anyone mentioned before that this is probably an issue with Microsoft changing something in a Service Pack or hotfix?

    I have the same issue, but I'm dealing with a server that has 72 GB of physical memory and virtual memory set to 110 GB. The amount of memory is irrelevant to me, since it isn't even being completely used when I get the error.

    The reason I say the problem has to do with a Service Pack is that I have a package we have been running since shortly after SQL Server 2005 was released. We never received this error until around 3 to 6 months ago, at which time we began getting it constantly. This is the exact same package, running on nearly the same data (and the exact same data structure).

    Something changed, and it wasn't any settings that we adjusted. By process of elimination, that leaves Hotfixes or Service Packs, since those are the only other changes that have been made.

    I realize that I have some checking to do myself on the MaxBuffer settings based on my row size, but since the package isn't even using all of the virtual memory when it craps out, something tells me this isn't a problem I'm going to be able to correct.

  • I have the same issue.

    If it stops after 32,000 records, then there is definitely something wrong with the data.

    Check your source table design and destination table design.

    Make sure the fields that are NOT NULL are not getting any NULL values from the source data (a quick way to check against the flat file is sketched below).
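
    A rough way to do that check against the source flat file is sketched below (Python, offered only as an illustration; the file name source_extract.txt, the '|' delimiter, and the required-column positions are placeholders, not details from this thread):

    # Scan a delimited flat file for empty values in the columns that the
    # destination table declares NOT NULL. The file name, delimiter and
    # column positions are placeholders for the example.
    import csv

    REQUIRED_COLUMNS = [0, 3, 7]   # zero-based positions of the NOT NULL fields
    DELIMITER = "|"

    def find_missing_required(path):
        """Return (line number, column index) pairs where a required field is empty."""
        problems = []
        with open(path, newline="", encoding="utf-8") as src:
            reader = csv.reader(src, delimiter=DELIMITER)
            for line_no, row in enumerate(reader, start=1):
                for col in REQUIRED_COLUMNS:
                    if col >= len(row) or row[col].strip() == "":
                        problems.append((line_no, col))
        return problems

    for line_no, col in find_missing_required("source_extract.txt"):
        print(f"line {line_no}: required column {col} is empty")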
