
We have 60 or more SSAS cubes on one server, all with the same structure. Each cube is a different size because each has a different underlying data warehouse; the warehouse schemas are identical, and only the data differs.

Each data warehouse is populated using the same ETL job in SSIS. A parameter is passed into the SSIS package that changes the source and destination.
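For illustration only, the parameter handoff amounts to something like this via the SSIS runtime object model; the package path and variable name here are made up, not the real ones.

    using System;
    using Microsoft.SqlServer.Dts.Runtime;   // SSIS runtime object model

    class RunOneWarehouse
    {
        static void Main()
        {
            Application app = new Application();

            // Path and variable name are placeholders for illustration only.
            Package pkg = app.LoadPackage(@"D:\Packages\PKG_LoadWarehouse.dtsx", null);

            // The same package is re-pointed at a different source/destination per run.
            pkg.Variables["User::WarehouseName"].Value = "Customer042_DW";

            DTSExecResult result = pkg.Execute();
            Console.WriteLine("Execution result: " + result);
        }
    }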

To speed up processing, we created partitions in the cubes. The cubes are processed and the ETL jobs are run individually throughout the day, with some overlap.

I am looking for advice about how to debug the following error messages we get from SSIS:

The buffer manager cannot write 8 bytes to file "C:\Users\SQLSER~1\AppData\Local\Temp\DTS{A627C703-7524-4906-84D5-6D239B9745FF}.tmp". There was insufficient disk space or quota

Failed to load task "DFT Load Target Stage", type "SSIS.Pipeline.2". The contact information for this task is "Performs high-performance data extraction, transformation and loading;Microsoft Corporation; Microsoft SQL Server v10; (C) 2007 Microsoft Corporation; All Rights Reserved;http://www.microsoft.com/sql/support/default.asp;1".

Unable to load XML data into a local DOM for processing.

Error 0x800705AF while loading package file "D:\eTrigue Library\eTrigue\eTrigue Deploy\BETA\Exec Packages\PKG_EventActivity_Staging.dtsx". The paging file is too small for this operation to complete.

Why am I not getting the errors every time the SSIS packages are run?

How can I go about finding the root of the problem?

The errors started after I added the partitions to the cubes. The partitions are processed by an SSIS script task that is called from the ETL.
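A stripped-down sketch of the kind of AMO processing loop such a script task runs is below; the server, database, and cube names are placeholders, not the real ones.

    using Microsoft.AnalysisServices;   // AMO (Analysis Management Objects)

    public static class PartitionProcessor
    {
        // Names are placeholders -- the real values come in from package variables.
        public static void ProcessAllPartitions(string ssasServer, string databaseName, string cubeName)
        {
            Server server = new Server();
            try
            {
                server.Connect("Data Source=" + ssasServer);

                Database db = server.Databases.GetByName(databaseName);
                Cube cube = db.Cubes.GetByName(cubeName);

                foreach (MeasureGroup mg in cube.MeasureGroups)
                {
                    foreach (Partition p in mg.Partitions)
                    {
                        p.Process(ProcessType.ProcessFull);
                    }
                }
            }
            finally
            {
                // Disconnect/dispose even on failure -- an AMO Server object left
                // connected inside a script task holds memory and connections
                // across repeated runs.
                if (server.Connected)
                {
                    server.Disconnect();
                }
                server.Dispose();
            }
        }
    }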

Is there a way to find memory leaks in SSIS script tasks?

Anybody want to see the code?

Hannah Vernon
asked Sep 7, 2012 at 23:31
  • How much free space is on the drive? Are you running a FS with a 2Gb file limit? Commented Sep 8, 2012 at 1:18

1 Answer


You don't have sufficient disk space for everything you're trying to do concurrently. Data flow operations that spill to disk and the temp files that get created along the way are just some of the many ways you can run out of space.

You should be able to run Perfmon during one of these runs to see how disk space and activity play out.
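If you want to capture the same numbers programmatically (for example, to log them alongside your ETL runs), a rough C# sketch like the one below will do it; the drive letter and sampling interval are just examples, and you'd point it at the drive that holds the SSIS temp files from the error message. The SSIS pipeline performance object also exposes a "Buffers spooled" counter worth watching, though its exact category name depends on your SSIS version.

    using System;
    using System.Diagnostics;
    using System.Threading;

    class DiskMonitor
    {
        static void Main()
        {
            // Drive letter is an example -- use the drive that hosts the
            // SSIS temp files named in the buffer manager error.
            var freeMb   = new PerformanceCounter("LogicalDisk", "Free Megabytes", "C:");
            var diskTime = new PerformanceCounter("LogicalDisk", "% Disk Time", "C:");

            diskTime.NextValue();   // first sample of a rate counter is always 0

            while (true)
            {
                Console.WriteLine("{0:u}  free MB = {1,8:F0}   % disk time = {2,6:F1}",
                                  DateTime.Now, freeMb.NextValue(), diskTime.NextValue());
                Thread.Sleep(TimeSpan.FromSeconds(15));
            }
        }
    }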

It's next to impossible for us to recommend anything specific without intimate knowledge of your system. Because you're running several processes concurrently, and the speed and state of each process can vary widely from run to run, there's no way to predict when or why they will hit the "perfect storm" of starving each other of disk space. That variability is also why the errors don't show up on every run.

You will have to understand the resource usage of each of your processes independently; only then will you be able to tell whether you should even try to run them concurrently.

Hannah Vernon
answered Sep 8, 2012 at 15:51
