SSRS 2005 vs 2008 R2 Report Performance Difference

  • We are currently using SQL Server 2005 in production, and we also have a test server running SQL Server 2008 R2, with the same set of reports deployed on both. The problem I am seeing is that some of the complex reports run slower on the 2008 R2 server, and the main difference is in processing time (the TimeProcessing column in the ExecutionLog table).

    One report I am running has a single dataset whose query returns around 86K rows, and the report contains nine matrices (each with a Sum aggregation) and a chart.

    2005 Server : TimeDataRetrieval : 6 sec , TimeProcessing : 51 sec, TimeRendering : 0 sec

    2008 R2 Server: TimeDataRetrieval : 13 sec , TimeProcessing : 234 sec, TimeRendering : 0 sec

    Note: The index structures are not the same on the two servers, which explains the difference in data retrieval; I am not worried about that for now.
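    For reference, the timings above come from the execution log. A query like the following pulls them for a given report (the report name is hypothetical; on SSRS 2005 the log is the dbo.ExecutionLog table joined to dbo.Catalog, while on 2008 R2 the dbo.ExecutionLog2 view exposes the same timing columns):

    ```sql
    -- Recent executions with their timing breakdown (values in milliseconds).
    -- SSRS 2005: ReportServer.dbo.ExecutionLog joined to Catalog for the report name.
    -- SSRS 2008 R2: the dbo.ExecutionLog2 view exposes the same columns plus ReportPath.
    SELECT TOP 20
           c.Name   AS ReportName,
           e.TimeStart,
           e.TimeDataRetrieval,
           e.TimeProcessing,
           e.TimeRendering,
           e.[RowCount]
    FROM   ReportServer.dbo.ExecutionLog AS e
           JOIN ReportServer.dbo.Catalog AS c ON c.ItemID = e.ReportID
    WHERE  c.Name = 'MyComplexReport'   -- hypothetical report name
    ORDER BY e.TimeStart DESC;
    ```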

    Here is the server comparison:

    2005 Server: Quad-Core AMD Opteron, 16 GB memory, Windows Server 2008

    2008 R2 Server: Quad-Core AMD Opteron, 32 GB memory, Windows Server 2008 R2

    The ReportServerTempDB is almost the same size (~600 MB) on both servers, and both servers are used for reporting only.

    The report uses Globals!TotalPages, so on-demand processing does not apply on the 2008 R2 server (referencing the total page count forces the whole report to be processed up front).

    So my question is: what am I missing here? Why is there such a huge difference in processing time? I know I could do all the aggregation on the SQL side and reduce the number of rows; my only concern is the difference in processing time between the two servers for the same report.
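    As a sketch of the pre-aggregation idea mentioned above (table and column names are made up for illustration), the point is to let SQL Server compute the Sum so the matrices receive far fewer detail rows:

    ```sql
    -- Instead of returning ~86K detail rows and letting the nine matrices
    -- aggregate them, group in the dataset query so SSRS only receives the
    -- totals it actually displays. dbo.Sales and its columns are hypothetical.
    SELECT Region,
           ProductCategory,
           SaleYear    = YEAR(SaleDate),
           TotalAmount = SUM(Amount),
           OrderCount  = COUNT(*)
    FROM   dbo.Sales
    GROUP BY Region, ProductCategory, YEAR(SaleDate);
    ```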

  • I got the answer from the MSDN forum.

  • I cannot view the response at the provided link. Can someone post the information here so it can be seen? Thank you.
