How to represent date range in FACT Table

How can I represent a date range in a FACT table?

In an OLTP application I have client holdings represented as:

    clientcode  valid_from  valid_till  stock  value
    123         1/1/2008    20/1/2008   100    500
    123         21/4/2008   25/6/2008   150    1000
    123         26/6/2008   31/12/2099  200    1500

    The total stock holding at a given date can then easily be found.

    How should I design the FACT table for the above scenario?

    Deepak

Depends on the date granularity you require, but assuming daily:

    FKClientKey

    FKDateKey

    Stock

    Value

    And your ETL package would load one Stock & Value record per day, deriving the daily balance from your OLTP From and To dates.

    Kind Regards, Will
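A minimal sketch of the daily-snapshot ETL step described above, assuming hypothetical OLTP rows of the form (clientcode, valid_from, valid_till, stock, value); the row data and function name are illustrative, not from any real package:

```python
from datetime import date, timedelta

# Hypothetical OLTP rows: (clientcode, valid_from, valid_till, stock, value)
oltp_rows = [
    (123, date(2008, 1, 1), date(2008, 1, 20), 100, 500),
    (123, date(2008, 1, 21), date(2008, 1, 25), 150, 1000),
]

def explode_to_daily(rows):
    """Expand each validity range into one fact row per day,
    as the daily-grain ETL load described above would."""
    for client, start, end, stock, value in rows:
        day = start
        while day <= end:
            # One snapshot row per client per day:
            # (FKClientKey, FKDateKey, Stock, Value)
            yield (client, day, stock, value)
            day += timedelta(days=1)

daily_facts = list(explode_to_daily(oltp_rows))
print(len(daily_facts))   # 20 + 5 = 25 daily rows
print(daily_facts[0])
```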

  • Hi,

So that means the three records have to be represented as 170+ records, one record for each day:

    1st record in FACT table ---> 123 1/1/2008 100 500

    2nd record in FACT table ---> 123 2/1/2008 100 500

    3rd record in FACT table ---> 123 3/1/2008 100 500

    4th record in FACT table ---> 123 4/1/2008 100 500

    ...

    110th record in FACT table ---> 123 20/4/2008 100 500

    111th record in FACT table ---> 123 21/4/2008 150 1000

    112th record in FACT table ---> 123 22/4/2008 150 1000

    113th record in FACT table ---> 123 23/4/2008 150 1000

    ...

    176th record in FACT table ---> 123 25/6/2008 150 1000

    177th record in FACT table ---> 123 26/6/2008 200 1500

    1st record of OLTP ---> 123 1/1/2008 20/4/2008 100 500

    2nd record of OLTP ---> 123 21/4/2008 25/6/2008 150 1000

    3rd record of OLTP ---> 123 26/6/2008 31/12/2099 200 1500

    Is there a better way of handling it?

    Deepak

It depends on the type of your fact table. If it's a snapshot (periods don't overlap) and you have plenty of rows in the production db, then what about:

    dateValidFromId --date dim in 2 roles

    dateValidToId

    timeValidFromId --if required, time dim in 2 roles

    timeValidToId

    clientId

    stock

    value

    + a decision about what to do at the meeting points (if there are any)

    Happy ETL!
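A quick sketch of the "meeting points" check mentioned above: for each client, consecutive validity periods should neither overlap nor leave gaps. The row data and function name are hypothetical, for illustration only:

```python
from datetime import date, timedelta

# Hypothetical snapshot-style fact rows keyed by validity range:
# (clientId, dateValidFrom, dateValidTo, stock, value)
fact_rows = [
    (123, date(2008, 1, 1), date(2008, 4, 20), 100, 500),
    (123, date(2008, 4, 21), date(2008, 6, 25), 150, 1000),
    (123, date(2008, 6, 26), date(2099, 12, 31), 200, 1500),
]

def check_meet_points(rows):
    """Verify that consecutive periods for each client neither overlap
    nor leave gaps: each period must start the day after the previous ends.
    Returns the offending (previous, current) pairs, if any."""
    problems = []
    ordered = sorted(rows, key=lambda r: (r[0], r[1]))
    for prev, curr in zip(ordered, ordered[1:]):
        if prev[0] != curr[0]:
            continue  # different client, no adjacency to check
        if curr[1] != prev[2] + timedelta(days=1):
            problems.append((prev, curr))
    return problems

print(check_meet_points(fact_rows))  # [] -> periods meet cleanly
```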

  • dkhattar (11/15/2008)


    Hi,

    So that means the three records have to be represented as 170+ records, one record for each day:

    ...snip...

    Is there a better way of handling it?

    Deepak

    Absolutely... that's exactly the way you handle it. It takes a completely different mindset from relational OLTP schema design, so don't worry if it looks crazy to you at first.

    If you don't believe me, buy a good book on dimensional modelling by someone like Ralph Kimball...

    Oh, and I'd recommend surrogate keys, as opposed to natural keys, in the fact table (representing your date and customer).

    Kind Regards, Will

  • Alternatively, take a look at this article: Handling Begin and End Times in SQL Server Analysis Services 2008 (http://www.sqlserveranalysisservices.com/OLAPPapers/Handling%20Begin%20and%20End%20Times%20in%20SQL%20Server%20Analysis%20Services%202008.htm)

  • So my fact table looks like this (excluding date fields)

    clientcode  valid_from  valid_till  stock  value  ValidFromId  ValidToID
    123         1/1/2008    20/1/2008   100    500    900          920
    123         21/4/2008   25/6/2008   150    1000   2300         2330
    123         26/6/2008   31/12/2099  200    1500   88888        888899

    Once the cube is ready and a user wants to see the stock value as of a date (between the two dates), how will the system give me the stock value?

    Deepak

  • You will need to create a new calculated measure to represent the available stock value on any given day. You already have your date dimension linked to the fact table twice, once as the from date and once as the to-date. Now add your date dimension to the cube again as yet another role-playing dimension, the "as-of" dimension, but do NOT link the dimension to any measure group on the Dimension Usage tab. This dimension will be the selector for the end-user but it will only be referenced by your calculated measure, it won't be hardwired to any of the measure groups.

    To create your calculated measure, use the LinkMember function to map the key value of your "as-of" dimension onto your from and to-date dimensions. Use the range operator : along with the NULL value to specify the ranges of dates that you want on the from and to-date dimensions. For example, the set of all dates from the beginning of time up to and including your to-date is NULL:ToDate, while the set of all dates from your from date to the end of time is FromDate:NULL. CrossJoin the two sets to get the intersection, i.e. the set of dates between the from date and to-date. You can then do a SUM across that set to get your stock value. Your final expression should look something like:

    SUM(
        CROSSJOIN(
            NULL : LinkMember( [AsOfDateDimension].[DateHierarchy].CurrentMember, [ToDateDimension].[DateHierarchy] ),
            LinkMember( [AsOfDateDimension].[DateHierarchy].CurrentMember, [FromDateDimension].[DateHierarchy] ) : NULL
        ),
        [Measures].[YourStockMeasure]
    )

    For the original inspiration for this approach and another discussion of the problem see Mosha's blog entry http://sqlblog.com/blogs/mosha/archive/2007/06/01/counting-in-flight-events-in-mdx.aspx.

    John Wilder
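The set logic the MDX encodes can be sketched relationally: a row is "in flight" as of a date when that date falls in FromDate:NULL and in NULL:ToDate, i.e. inside the row's validity range. A minimal Python sketch with hypothetical rows (the data and function name are illustrative only):

```python
from datetime import date

# Hypothetical range-keyed fact rows: (clientId, valid_from, valid_to, stock, value)
fact_rows = [
    (123, date(2008, 1, 1), date(2008, 4, 20), 100, 500),
    (123, date(2008, 4, 21), date(2008, 6, 25), 150, 1000),
    (123, date(2008, 6, 26), date(2099, 12, 31), 200, 1500),
]

def stock_as_of(rows, as_of):
    """Sum stock over rows whose validity range contains as_of: the
    relational equivalent of CROSSJOIN(NULL:ToDate, FromDate:NULL)
    followed by SUM in the MDX calculated measure above."""
    return sum(stock for _, frm, to, stock, _ in rows
               if frm <= as_of <= to)  # FromDate:NULL intersected with NULL:ToDate

print(stock_as_of(fact_rows, date(2008, 5, 1)))   # 150
```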

  • dkhattar (11/18/2008)


    So my fact table looks like this (excluding date fields)

    clientcode  valid_from  valid_till  stock  value  ValidFromId  ValidToID
    123         1/1/2008    20/1/2008   100    500    900          920
    123         21/4/2008   25/6/2008   150    1000   2300         2330
    123         26/6/2008   31/12/2099  200    1500   88888        888899

    Once the cube is ready and a user wants to see the stock value as of a date (between the two dates), how will the system give me the stock value?

    Deepak

    You should use integers for the date fields, like 20080101, and use a date dimension to look up the real date values. You should also use a client dimension and include an FK to it in your fact table!

    Example:

    StockID  ClientID  ValidFromDateID  ValidToDateID  StockAmount  ValueStock
    1        12        20080101         20080120       100          500
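The yyyymmdd integer keys suggested above can be generated with simple arithmetic during ETL. A minimal sketch (the helper names are hypothetical; in a real warehouse the decode would be a join to the date dimension rather than arithmetic):

```python
from datetime import date

def to_date_key(d):
    """Encode a date as an integer surrogate of the form yyyymmdd,
    e.g. 1 Jan 2008 -> 20080101."""
    return d.year * 10000 + d.month * 100 + d.day

def from_date_key(key):
    """Decode a yyyymmdd integer key back to a date; shown here for
    illustration, since normally the date dimension supplies the real date."""
    return date(key // 10000, key // 100 % 100, key % 100)

print(to_date_key(date(2008, 1, 20)))   # 20080120
print(from_date_key(20080101))          # 2008-01-01
```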
