Advice on database architecture for reporting application

  • Hi There

    I have been tasked with rewriting a reporting and GIS application. The application imports data into about two dozen tables, which vary in size from 1 million to 20 million records. A lot of queries are run against this data, mostly sums and percentage calculations, and most of them differ only by a few parameters. Each query joins several of the tables together, so for performance reasons the imported tables cannot be queried directly.

    The new database will be SQL Server 2008. I am planning to use SSIS to import the data from the external data source. I was then thinking of using SSIS to extract the data from the imported tables into another database, with a separate table for each distinct type of query, and letting the reporting application run its more specific queries against only those tables. I would need maybe 50 such tables, each holding the sums for a different query. The tables would be small, just a few thousand rows each, with a few key columns plus the sum or percentage.
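
    For example, something along these lines is what I had in mind (the table and column names below are made up just to show the shape, not my real schema):

        -- One small summary table per distinct query shape, e.g. totals by region and month
        CREATE TABLE dbo.SummarySalesByRegionMonth
        (
            RegionID    int           NOT NULL,
            ReportMonth date          NOT NULL,
            TotalAmount decimal(18,2) NOT NULL,
            PctOfTotal  decimal(9,4)  NOT NULL,
            CONSTRAINT PK_SummarySalesByRegionMonth PRIMARY KEY (RegionID, ReportMonth)
        );

        -- Rebuilt by the SSIS package after each import, so the big imported tables are
        -- joined once here rather than in every report query
        INSERT INTO dbo.SummarySalesByRegionMonth (RegionID, ReportMonth, TotalAmount, PctOfTotal)
        SELECT  st.RegionID,
                DATEADD(month, DATEDIFF(month, 0, s.SaleDate), 0),
                SUM(s.Amount),
                SUM(s.Amount) * 100.0 / SUM(SUM(s.Amount)) OVER ()
        FROM    dbo.ImportedSales  AS s
        JOIN    dbo.ImportedStores AS st ON st.StoreID = s.StoreID
        GROUP BY st.RegionID, DATEADD(month, DATEDIFF(month, 0, s.SaleDate), 0);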

    I don't think this is the best way of doing this, but I am a programmer and am not really aware of all the options the business intelligence tools give me.

    If you have any pointers for me, they would be much appreciated.

  • It may depend on a lot of factors, but have you looked at using Analysis Services instead?

    ETL -> Datastore -> Cube

    Then query the cube using MDX through Reporting Services. I won't lie and say that this is a simple route to take, since it could require a lot of planning up front; it may, however, be more appropriate.
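
    To give a flavour of what a percentage-of-total report could look like against the cube, here is a rough MDX sketch (the cube, measure and dimension names are invented for the example):

        -- Total sales plus each region's share of the overall total, for one month
        WITH MEMBER [Measures].[Pct Of Total] AS
            [Measures].[Sales Amount] / ( [Region].[Region].[All], [Measures].[Sales Amount] ),
            FORMAT_STRING = 'Percent'
        SELECT  { [Measures].[Sales Amount], [Measures].[Pct Of Total] } ON COLUMNS,
                [Region].[Region].[Region].Members ON ROWS
        FROM    [Sales]
        WHERE   ( [Date].[Calendar Month].[January 2009] )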

    Cheers, Jode

  • Google "Dimensional Modelling" (specifically, "A dimensional modelling manifesto" for a kick off) and Kimball.

    That's your starting point

  • Thanks for the pointers, guys. I guess I have some reading to do on SSAS. :-)
