large report / large XML file delivery to web users (any date range)

  • Not really a SQL question, sorry, but it is very closely related.

    We deliver reports to end users by creating XML output from the DB, which is then passed to an IIS worker process (w3wp.exe, IIS7) for an XSLT transform into HTML or CSV files.

    We are finding the queries execute quickly, but the XSLT transform and delivery mechanism can slow other web users.
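The actual transform here runs as XSLT inside w3wp.exe, but the memory-pressure problem is generic: building the whole output document in RAM is what hurts the other users. As a hypothetical sketch (the `<report>`/`<row>` element names are assumed, not from the real schema), here is the streaming idea in Python — emit CSV rows incrementally while parsing, so the full XML is never held in memory:

```python
import csv
import io
import xml.etree.ElementTree as ET

def rows_to_csv(xml_stream, csv_stream):
    # Stream <row> elements out of a large XML report and write CSV
    # incrementally, clearing each element after use so the whole
    # document is never resident in memory at once.
    writer = None
    for _, elem in ET.iterparse(xml_stream, events=("end",)):
        if elem.tag == "row":
            if writer is None:
                writer = csv.DictWriter(csv_stream, fieldnames=sorted(elem.attrib))
                writer.writeheader()
            writer.writerow(elem.attrib)
            elem.clear()  # release the parsed element

xml = io.BytesIO(b'<report><row id="1" total="70"/><row id="2" total="30"/></report>')
out = io.StringIO()
rows_to_csv(xml, out)
```

The same principle applies in .NET: a forward-only reader feeding a streamed writer keeps the working set flat regardless of report size, whereas a full DOM-based XSLT transform scales memory with the 70 MB+ document.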

    Just wondering how other developers deliver large report files to end users?

    - As an example, a 100,000-row report in HTML format could be 70 MB.

    - We have a strategic aim of no date limits on reports, which could easily create 200 MB - 500 MB report files.

    - To do this we need to separate the user/login app experience from the delivery of large reports, so we have the grandiose idea of a separate web service (worker process) that can be moved to another hardware node if needed (or at least managed separately).
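The separate-service idea boils down to a job queue: the web app enqueues a report request and returns immediately, and a worker elsewhere does the heavy lifting. A minimal sketch of that hand-off, assuming a hypothetical `build_report` pipeline and using a thread in place of the real separate process/node:

```python
import queue
import threading
import uuid

jobs = queue.Queue()
finished = {}  # job id -> location of the generated file (in-memory for this sketch)

def build_report(params):
    # Placeholder for the real pipeline: query -> transform -> zip.
    return f"/reports/{params['name']}.zip"

def report_worker():
    # In production this loop would live in a separate worker process
    # (or hardware node); a daemon thread stands in for it here.
    while True:
        job_id, params = jobs.get()
        try:
            finished[job_id] = build_report(params)
        finally:
            jobs.task_done()

def submit(params):
    # Called from the web app: enqueue and return a token immediately,
    # so the request never ties up an IIS worker on report generation.
    job_id = uuid.uuid4().hex
    jobs.put((job_id, params))
    return job_id

threading.Thread(target=report_worker, daemon=True).start()
token = submit({"name": "sales"})
jobs.join()  # the web app would poll or email a link instead of blocking
```

The token is what the user polls (or receives a link for) later, which is exactly what decouples the interactive app pool from report generation.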

    I'm taking the stance that the web user experience should not be influenced by a large report that can be executed at random, hence we need to separate the processes.

    Just curious to understand other possible options that you may have considered.

    Thank you

    Scott

    PS: I totally understand that too much info could be too much for end users too.

  • Aside from the totally ridiculous notion that a user may want a 500 MB report, I am surprised you are worried about how long the transformation takes. The download time is going to be so long that nobody will wait for a report with more information than they would find useful. Even if you can transform it in a minute or so, the user is still looking at 30-40 minutes at a minimum to get this data to the screen. For such a rare occurrence I would just tell the person(s) that want this much data that they need to request it as a special report from IT. Then you can just get your data and do whatever you want with it.

    _______________________________________________________________

    Need help? Help us help you.

    Read the article at http://www.sqlservercentral.com/articles/Best+Practices/61537/ for best practices on asking questions.

    Need to split a string? Try Jeff Moden's splitter http://www.sqlservercentral.com/articles/Tally+Table/72993/.

    Cross Tabs and Pivots, Part 1 – Converting Rows to Columns - http://www.sqlservercentral.com/articles/T-SQL/63681/
    Cross Tabs and Pivots, Part 2 - Dynamic Cross Tabs - http://www.sqlservercentral.com/articles/Crosstab/65048/
    Understanding and Using APPLY (Part 1) - http://www.sqlservercentral.com/articles/APPLY/69953/
    Understanding and Using APPLY (Part 2) - http://www.sqlservercentral.com/articles/APPLY/69954/

  • Thank you for the reply, Sean. I agree it's ridiculous to a degree; indeed that has always been my argument: what the heck would a user do with a 70 MB report with 100,000 rows that they can currently spin?

    I guess part of the answer is that reports can be exported in CSV format. It's possible end users utilise this data for other purposes.

    Another part of the answer is that one object may send one packet per day, while another could send 10 per second. In this environment, if a date range is allowed, how do you legislate for the amount of data the potential reports could create when the same date range is applied to both?

    Thanks for the perspective.

    PS (regarding the download time: the idea was to write a report service to commit and manage the report and its delivery via a different application pool, i.e. send the user a link and have them download a zip file from another location, away from the web app. It is entirely possible to build a stable, scalable solution to deliver large quantities of data.)
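On the download-time objection: report output (HTML/CSV) is highly repetitive text, so zipping before staging it for the link typically shrinks it dramatically. A small sketch of that staging step, with hypothetical file names standing in for the real report pipeline:

```python
import os
import tempfile
import zipfile

def stage_report(report_path, staging_dir):
    # Zip the finished report into a staging directory served by a
    # separate application pool; the user only receives a link to the zip.
    zip_path = os.path.join(staging_dir, os.path.basename(report_path) + ".zip")
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as z:
        z.write(report_path, arcname=os.path.basename(report_path))
    return zip_path

with tempfile.TemporaryDirectory() as staging:
    report = os.path.join(staging, "big_report.csv")
    with open(report, "wb") as f:
        f.write(b"id,total\n" + b"1,70\n" * 100_000)  # repetitive rows compress well
    zipped = stage_report(report, staging)
    ratio = os.path.getsize(zipped) / os.path.getsize(report)
```

With compression like this, a 70 MB HTML report often travels as a small fraction of its raw size, which makes the link-to-a-zip delivery model far less painful than streaming the raw file through the interactive app pool.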
