'The buffer manager cannot write 33554032 bytes to file "". There was insufficient disk space or quota.'

  • I'm looking for any insight into this error message I'm getting.

    I have an SSIS package developed in BIDS 2008, executing on SQL Server 2008 64-bit Enterprise edition, residing on a Windows Server 2003 64-bit, 24 core, 64 GB RAM platform.

    I'm getting the exact error message seen in the subject, hundreds of times while the package executes. Rarely, there is a file name between the quotes; when there is, it shows the BufferTempStoragePath designated in the package configuration, which points to a SAN-based drive with 30-50 GB of free space.

    The package reads from various tables via two SQL sources using SQL statements. Each of the two SQL statements returns about 15M rows. Each row in one of the statements is roughly 600 bytes, the rows from the other statement are roughly 300 bytes each. The query that returns the wider rows executes between 2x and 5x as fast as the other, according to the progress indicators in SSIS.

    The package does a merge join (inner join) on a single key field that is unique per record. Between the two row sets there are typically 14.8M matches.
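    For context on what that transform is doing: SSIS's Merge Join requires both inputs sorted on the join key and walks them in lockstep. A minimal sketch of the inner-join case on a unique key (the `key` field name here is a placeholder, not the package's actual column):

    ```python
    def merge_join_inner(left, right, key):
        """Inner merge join over two inputs already sorted by `key`.

        Conceptually what SSIS's Merge Join does: advance the side
        with the smaller key, emit a combined row on a match.
        Assumes `key` is unique per record on both sides.
        """
        li, ri = 0, 0
        while li < len(left) and ri < len(right):
            lk, rk = left[li][key], right[ri][key]
            if lk == rk:
                yield {**left[li], **right[ri]}  # matched pair
                li += 1
                ri += 1
            elif lk < rk:
                li += 1  # left key smaller: advance left
            else:
                ri += 1  # right key smaller: advance right

    # Example: two small sorted inputs joined on "id"
    a = [{"id": 1, "x": "a"}, {"id": 2, "x": "b"}, {"id": 4, "x": "d"}]
    b = [{"id": 2, "y": 20}, {"id": 3, "y": 30}, {"id": 4, "y": 40}]
    rows = list(merge_join_inner(a, b, "id"))  # ids 2 and 4 match
    ```

    The relevant point for buffering: when one source runs 2x-5x faster than the other (as described above), the faster side's rows must wait in buffers until the slower side catches up, which is what drives spooling to disk.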

    Out of the merge join, I multicast every record to two different outputs. One output does a little data type conversion and sends most fields of all records to a table in an Oracle 11g database via the Attunity connector. The other output does complete data type conversion in order to write to most fields of all records to a pair of fixed-width text files local to the server using the EBCDIC code page. Total output across the three destinations is about 13 GB.

    I have sized all the columns in my SQL statements so as to use the least number of bytes possible. DefaultBufferMaxRows = 50000, DefaultBufferSize = 33554032.
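    Note that the failing write size in the error message, 33554032 bytes, is exactly DefaultBufferSize, so each spooled buffer file is one full buffer. A quick back-of-the-envelope check of the rows-per-buffer math (SSIS roughly takes the smaller of the row cap and what fits in the buffer size):

    ```python
    def rows_per_buffer(default_buffer_size, default_buffer_max_rows, row_bytes):
        """Approximate SSIS sizing: at most DefaultBufferMaxRows rows,
        further capped by how many rows fit in DefaultBufferSize."""
        return min(default_buffer_max_rows, default_buffer_size // row_bytes)

    SIZE, MAX_ROWS = 33554032, 50000

    wide = rows_per_buffer(SIZE, MAX_ROWS, 600)    # ~600-byte rows
    narrow = rows_per_buffer(SIZE, MAX_ROWS, 300)  # ~300-byte rows

    print(wide, narrow)  # both sources hit the 50000-row cap
    print(SIZE // 600)   # 55923 wide rows would fit by size alone
    ```

    So with these settings each buffer tops out at the row cap rather than the byte cap, yet the writer still attempts the full 33554032-byte allocation when spooling.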

    I can execute the package successfully on my dual core Windows XP 32-bit laptop with 2 GB RAM, storing all the buffers etc. on the local disk. In this case it does buffer, reaching a high point of about 7 GB of buffer files which grow in number as the queries return rows and recede as the rows are written to the outputs. At no time does local disk usage exceed 10 GB (which is buffers + output files about 2/3 through execution).

    Any ideas?

    I noticed that despite the nice writeup you had no responses so far. I am no expert in this area of SSIS but figured I might try to lend a hand anyway by asking some troubleshooting questions; maybe it will uncover something or jog another thought that yields a solution:

    - Is all this happening in one Data Flow Task? If not, check your temp storage paths on all data flow tasks. If you missed setting the property on one, SSIS will write to the TEMP or TMP path, which would likely be on C:\. On your workstation this may not have exposed the issue, but on the server it might.

    - Are you processing any BLOB data (text, ntext, image) or anything that would qualify as a BLOB after the data destined for the Oracle or EBCDIC file is converted? If so, and you did not set the BLOBTempStoragePath, then it's the same issue as above, with SSIS possibly writing to C:\.

    - Have you tried running the package using the 32-bit runtime on the server? I am not sure why it would make a difference to buffer allocations but you never know, it may yield a different result.

    - When you run the package on the server are you seeing buffer files get created on your SAN volume?

    - Have you checked that the account running the SSIS package on the server has full control to the buffer path on the SAN volume?
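    To help rule out the environment side of these questions, here's a rough Python sketch (the path is a placeholder, not the actual package configuration) that prints the fallback TEMP/TMP locations and checks whether a candidate buffer path is writable and has headroom:

    ```python
    import os
    import shutil
    import tempfile

    def check_buffer_path(path):
        """Report write access and free space for a candidate
        BufferTempStoragePath. Returns (writable, free_bytes)."""
        writable = os.access(path, os.W_OK)
        free = shutil.disk_usage(path).free if os.path.isdir(path) else 0
        return writable, free

    # Where SSIS would fall back to if the property were left unset
    print("TEMP:", os.environ.get("TEMP"))
    print("TMP:", os.environ.get("TMP"))

    # Placeholder path -- substitute the SAN volume from the package config
    writable, free = check_buffer_path(tempfile.gettempdir())
    print(f"writable={writable}, free={free / 2**30:.1f} GiB")
    ```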

    There are no special teachers of virtue, because virtue is taught by the whole community.
    --Plato

  • opc.three (4/2/2012)


    I noticed that despite the nice writeup you had no responses so far. I am no expert in this area of SSIS but figured I might try to lend a hand anyway by asking some troubleshooting questions; maybe it will uncover something or jog another thought that yields a solution:

    - Is all this happening in one Data Flow Task? If not, check your temp storage paths on all data flow tasks. If you missed setting the property on one, SSIS will write to the TEMP or TMP path, which would likely be on C:\. On your workstation this may not have exposed the issue, but on the server it might.

    This is all happening in one Data Flow Task.

    - Are you processing any BLOB data (text, ntext, image) or anything that would qualify as a BLOB after the data destined for the Oracle or EBCDIC file is converted? If so, and you did not set the BLOBTempStoragePath, then it's the same issue as above, with SSIS possibly writing to C:\.

    No BLOB data, the longest field is a 30-byte DT_STR. Total record length on the outputs is 272 bytes to the flat files and about 500 bytes to the Oracle destination.

    However, the BLOBTempStoragePath setting is the same as the other temp storage path.

    - Have you tried running the package using the 32-bit runtime on the server? I am not sure why it would make a difference to buffer allocations but you never know, it may yield a different result.

    I have not. I'll see whether I can; I don't have much control over the environment.

    - When you run the package on the server are you seeing buffer files get created on your SAN volume?

    Don't know; I wasn't watching last time. We're executing it now, and I'll post results in an hour or so when it finishes.

    - Have you checked that the account running the SSIS package on the server has full control to the buffer path on the SAN volume?

    Yes, it does.

    Thanks very much for your reply, good questions.

  • It's executing now.

    It's writing buffer files to the designated location. A total of 27 files, about 650MB thus far (free space is nearly 1 TB on the disk). It's chewing through the files pretty fast, writing a new one and deleting an old one every few seconds. RAM usage is under 1 GB.
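    For anyone else watching spill behavior like this: a small sketch (the directory is a placeholder; point it at the BufferTempStoragePath) that samples the spool directory's file count and total size, the same numbers observed above:

    ```python
    import os

    def spool_snapshot(path):
        """Count and total the size of files in a spool directory,
        e.g. an SSIS BufferTempStoragePath. Skips files that vanish
        between listing and stat, since buffers are deleted rapidly."""
        count, total = 0, 0
        for entry in os.scandir(path):
            try:
                if entry.is_file():
                    count += 1
                    total += entry.stat().st_size
            except FileNotFoundError:
                continue  # buffer file deleted mid-scan
        return count, total

    count, total = spool_snapshot(os.getcwd())  # placeholder directory
    print(f"{count} files, {total / 2**20:.0f} MiB")
    ```

    Sampling this in a loop every few seconds would show the grow-and-recede pattern described earlier in the thread.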

    And it's throwing the errors.

    I should take this opportunity to note 2 things:

    1) The output last time I ran was good, despite all the error messages. It threw 474 buffer errors but wrote all its output correctly.

    2) This is in my Production environment. In my QA environment we run the 32-bit dtexec and it throws no errors. Hardware and software are identical otherwise.

    So after this run completes we're going to run the 32-bit dtexec in Prod and see what happens.

    Dubya Tee Eff?!

  • william.l.gann (4/3/2012)


    Dubya Tee Eff?!

    Ditto. Interested to see if the 32-bit runtime runs clean. Thanks for posting back.


  • The 32-bit dtexec gives the same errors in Production.

    Given that there are no errors in QA with the same make/model of hardware, I'm asking our Windows admins to look at SAN driver versions.

