Analysis Services Crashes with a Large Parent-Child Dimension

  • I am having trouble processing a parent-child dimension in SQL Server Analysis Services.

    I use the following table:

    CREATE TABLE [dbo].[Desc_brief_dat] (
        [Parent_ID] [int] NULL ,
        [Child_ID] [int] NULL ,
        [Level_Name] [varchar] (200) COLLATE SQL_Latin1_General_CP1_CI_AS NULL ,
        [Key] [int] NULL
    ) ON [PRIMARY]
    GO

    There are 3,444,766 rows in the table.

    It is loaded from a text file that is 175,919 KB (roughly 172 MB) in size.

    When I process the dimension, it runs for a few minutes and then just stops.

    I have left it as long as six hours before killing the process.

    I am well under the 3 GB RAM limit on a 4 GB machine.

    The machine has eight 750 MHz processors.

    I have this working on some very similar dimensions, and they process in about 5 minutes with roughly 10,000 fewer rows and 10,000 fewer KB of source data.

    That is a huge gap in behavior for a small difference in data size.

    What is the physical limitation that causes this?

    Is there anything I can do to get this dimension to process?

    Is there a limit on Parent-Child Relationships in Analysis Services?

    Can that limit be tuned, based on data size or something else?

    What is going on here?

    Kelly Baber

    http://www.PolyVista.com

    kelly.baber@polyvista.com

    713.825.5997

  • Addendum to post:

    By "stops" I mean it just stops: no errors, no I/O, no processor time. It simply stops doing anything.

    Could migrating the repository to SQL Server help with this? I am grasping at straws.
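
    For anyone hitting the same symptom: one thing worth ruling out before tuning the server is bad hierarchy data (orphaned parents, members that are their own parent, or duplicate keys), which can make parent-child processing spin with no error. Here is a rough check against the table above; it is only a sketch in SQL Server 2000-era T-SQL using the column names from the CREATE TABLE, and longer cycles (A -> B -> A) would need a loop and are not covered:

    -- Parent_IDs that never appear as a Child_ID (orphaned parents).
    -- Rows using a deliberate root sentinel (e.g. Parent_ID = 0) can be ignored.
    SELECT d.Child_ID, d.Parent_ID
    FROM dbo.Desc_brief_dat AS d
    LEFT JOIN dbo.Desc_brief_dat AS p ON d.Parent_ID = p.Child_ID
    WHERE d.Parent_ID IS NOT NULL
      AND p.Child_ID IS NULL

    -- Members that are their own parent (a one-row cycle).
    SELECT Child_ID
    FROM dbo.Desc_brief_dat
    WHERE Child_ID = Parent_ID

    -- Duplicate member keys, which also confuse parent-child processing.
    SELECT Child_ID, COUNT(*) AS dup_count
    FROM dbo.Desc_brief_dat
    GROUP BY Child_ID
    HAVING COUNT(*) > 1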

  • Some questions...

    Are any processes accumulating CPU? Analysis Services? SQL Server?

    Is the DB server on a different box than Analysis Services?

    When does the processing "stop"? Is it after the query is visible in the processing window but no rows have been retrieved (no progress numbers at the bottom)? If yes, take the SQL visible in the processing window and see if you can get it to run in Query Analyzer (a rough sketch of what that query usually looks like is at the end of this post).

    We ran into some similar (but not 6 hour) problems with complex cubes with 20+ dimensions (several moderately large) and tens of millions of fact rows. Under 32-bit processing, the cube query appeared to be stuck on the database server, usually during query optimization. Reducing dimension size and cube complexity helped, but moving to the raw power of a 64-bit box with 64-bit software really helped.

    I am not conversant enough with the repository to comment there.

    Good luck,

    Larry

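    Following up on the Query Analyzer suggestion above: the dimension-processing query is normally just a SELECT DISTINCT over the key, parent, and member-name columns. The sketch below is only an approximation of that query's shape, built from the columns in the first post; the SQL Analysis Services actually generates will differ, so copy the real text out of the processing window when you can.

    -- Approximate shape of the parent-child dimension processing query.
    -- SET STATISTICS TIME ON reports how long SQL Server itself takes,
    -- which helps separate a database-side stall from an Analysis Services one.
    SET STATISTICS TIME ON

    SELECT DISTINCT
        [Child_ID],    -- member key
        [Parent_ID],   -- parent key
        [Level_Name],  -- member name
        [Key]          -- extra column, e.g. a member property
    FROM dbo.Desc_brief_dat

    SET STATISTICS TIME OFF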

  • We found that moving the memory conservation threshold to 2500 fixed the problem.

    It was automatically set to 1026 (or somewhere around 1000).

    This seems to be one of those grey areas for me. How does that property act? I have no clue as to why raising it worked; it just did.
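
    For what it is worth, a back-of-the-envelope way to see why the threshold matters: Analysis Services 2000 keeps the whole dimension in memory while it processes, so 3.4 million members add up fast. The query below gives a rough size estimate; the 125 bytes of per-member overhead is only an assumed ballpark, not a documented figure:

    -- Rough estimate of memory needed to hold the dimension during processing.
    -- 125 bytes per member is an assumed overhead, plus the member name bytes.
    SELECT
        COUNT(*) AS member_count,
        AVG(DATALENGTH(Level_Name)) AS avg_name_bytes,
        COUNT(*) * (125.0 + AVG(DATALENGTH(Level_Name))) / (1024 * 1024) AS approx_mb
    FROM dbo.Desc_brief_dat

    If that estimate lands anywhere near the old threshold of roughly 1 GB, the fix at least stops looking mysterious.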

  • Do you know if you activated Very Large Dimension Management (VLDM)? This could be what needed the memory. VLDM dimensions actually lock space in memory to hold the dimension.

    Maybe this KB Article will help:

    http://support.microsoft.com/default.aspx?kbid=821103

    Hope that helps,

    Steve Hughes

    Magenic Technologies

    steve.hughes@magenic.com
