SSIS out of memory / buffer memory error. Server has 48 GB RAM.

  • I have an SSIS package that was running fine until about a week ago. Now it is failing with the error below.

    The 32-bit server has 48 GB of memory, with SQL Server and SSIS installed on the same box. boot.ini has the /PAE option enabled. SQL Server Memory Manager: Total Server Memory goes up to 24 GB.

    "Enable memory restriction" on the SSIS Lookup is unchecked. When I tried setting it to 100 MB the process completes, but it takes a whopping 43 hours, whereas earlier it used to complete in 5 hours (see the buffer-tuning sketch after the error output below).

    Any guess what the issue could be? Any help/links would be appreciated.

    Error: 2010-02-02 08:08:49.07

    Code: 0xC0047012

    Source: DF Manual Insert TB records DTS.Pipeline

    Description: A buffer failed while allocating 35192080 bytes.

    End Error

    Error: 2010-02-02 08:08:49.07

    Code: 0xC0047011

    Source: DF Manual Insert TB records DTS.Pipeline

    Description: The system reports 33 percent memory load. There are 51538235392 bytes of physical memory with 34129588224 bytes free. There are 2147352576 bytes of virtual memory with 630755328 bytes free. The paging file has 72360525824 bytes with 55499239424 bytes free.

    End Error
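
    If the failing allocations are the data flow's own pipeline buffers (as the 0xC0047012 error suggests) rather than the Lookup cache, another knob worth trying is the data flow task's DefaultBufferSize (10,485,760 bytes by default) and DefaultBufferMaxRows (10,000 by default). A minimal sketch of overriding them at run time with dtexec /Set; the package file name and task name here are placeholders, not taken from this thread:

        REM Hypothetical names; substitute your own package path and data flow task name.
        dtexec /File LoadTBRecords.dtsx ^
            /Set \Package\DFT_InsertTBRecords.Properties[DefaultBufferSize];5242880 ^
            /Set \Package\DFT_InsertTBRecords.Properties[DefaultBufferMaxRows];5000

    Halving the buffer size means more, smaller buffers and usually only a modest slowdown, which is a different trade-off from throttling the Lookup cache to 100 MB.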

  • Ananda-292708 (2/2/2010)

    I'm puzzled: what's the point of having 48 GB if you use a 32-bit OS? /PAE only lets the operating system see the extra physical memory; a 32-bit application still cannot address more than 4 GB, and by default each process gets only 2 GB of user-mode virtual address space, which is exactly the 2,147,352,576 bytes your error log reports (see the quick breakdown of the numbers below). So my question is: why not install a 64-bit OS? You will be able to use all the memory, and this issue will most probably disappear.

    ---
    SSIS Tasks Components Scripts Services | http://www.cozyroc.com/
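
    A quick sanity check on the numbers in the original error output (straight unit conversion; the only outside assumption is the standard 2 GB user-mode address space of a 32-bit Windows process without /3GB):

        Physical memory:  51,538,235,392 bytes ≈ 48 GB, of which 34,129,588,224 bytes ≈ 31.8 GB free
        Virtual memory:    2,147,352,576 bytes ≈  2 GB, of which    630,755,328 bytes ≈ 601 MB free
        Failed request:       35,192,080 bytes ≈ 33.6 MB for a single pipeline buffer

    So the box is not short of RAM at all; the 32-bit SSIS process is running out of room (or out of a large enough contiguous block) inside its own 2 GB virtual address space, which is why the "33 percent memory load" message looks so harmless.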

  • Hi,

    I've been having a similar issue on our production box recently, with an ETL solution that uses SQL Server 2008 R2 SSIS/RDBMS to load data into an Oracle 10g database. In production the daily load process was working fine for about 3 weeks, then suddenly, on every load, one package would fail trying to load 27,000 records into a particular table in Oracle with "A buffer failed while allocating 10485184 bytes", and the system reported 61 percent memory load, etc.

    All troubleshooting and web searching pointed to SSIS running out of memory on the server. We were confused, however, because when we ran the same process on the UAT boxes with the same data load of 27,000 records, it worked.

    We then spent a lot of time troubleshooting differences between the production ETL box and the UAT ETL box. Both are 4 GB Windows 2003 32-bit, and the troublesome load worked out at only about 40 MB in total. The Perfmon memory-usage traces showed production just kept taking more and more memory until failure, while UAT took about a third of that and then levelled off. Also, running this one package on its own from dtexecui on the production box worked (slowly, but it worked), instead of being called from a parent controller package during the full load process.

    In our situation it turns out the problem was caused by the size of the destination table in the Oracle database. Once the table got over approximately 500,000 rows (18 loads; UAT only had 15), this error appeared. We tried optimizing the table in Oracle by removing indexes and adding others, all to no avail.

    We resolved the problem by switching from Provider=MSDAORA.1 in the OLE DB connection string to the Oracle-provided one, Provider=OraOLEDB.Oracle.1, in our Oracle OLE DB destination (example connection strings below).

    We are now running performance tests with the new driver to make sure it isn't going to flake out at some point too!
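
    For anyone else making the same switch, the change is just the Provider keyword in the OLE DB connection string. A sketch with placeholder values (the data source, user and password below are made up; only the two provider names come from this post):

        Old: Provider=MSDAORA.1;Data Source=MyTnsAlias;User ID=etl_user;Password=********;
        New: Provider=OraOLEDB.Oracle.1;Data Source=MyTnsAlias;User ID=etl_user;Password=********;

    MSDAORA is Microsoft's own (long-deprecated, 32-bit-only) OLE DB provider for Oracle; OraOLEDB.Oracle is the provider installed with the Oracle client, which generally behaves better against 10g and later.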
