
How to reuse datasets with Reporting Services

Rather than continually creating, destroying and recreating the same dataset to run a report repeatedly, use this trick to create the dataset once and retain it until it's no longer needed.

When you run a report in Reporting Services, the dataset for that report is created, used and then discarded. If you're running the same report many times against the same dataset for different departments or users, that dataset is recreated and destroyed again and again. This is wasted work, especially for a particularly long-running report, which can tie up server resources and slow operations to a crawl.

One of the tricks that I've learned as a SQL programmer is that any large amount of data that is being constantly recreated should be created once and held until it's been rendered obsolete. The best way to do this is to create a stored procedure that's used to collate the needed data, store it in a table and then provide the data from that table when necessary. Whenever the stored procedure runs in the future, it checks the table to see if the needed data is there and up-to-date; if it is, it simply provides that static data instead of recreating everything from scratch.
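As a minimal sketch of that holding table, assume a hypothetical sales-summary report whose cached rows carry a date stamp indicating when they were compiled; all table and column names here are illustrative, not taken from any particular report:

    -- Hypothetical holding table for pre-compiled report data.
    -- CompiledOn carries the date stamp used to decide whether
    -- the cached rows are still current.
    CREATE TABLE dbo.SalesReportCache
    (
        CompiledOn   datetime     NOT NULL,  -- when this batch of rows was built
        Department   varchar(50)  NOT NULL,  -- report parameter
        TotalSales   money        NOT NULL,
        OrderCount   int          NOT NULL
    );

    CREATE INDEX IX_SalesReportCache_CompiledOn
        ON dbo.SalesReportCache (CompiledOn);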

For datasets that run to tens of thousands of rows or more, this can be a good way to optimize back-end processing and avoid redoing the same work. The exact way to do this varies enormously depending on the kind of data you need to obtain, but a typical stored procedure written in this fashion might work as follows (a T-SQL sketch appears after the list):

    1. Check to see if there is existing data that matches the needs of the user, based on any passed parameters. For instance, if you're dealing with a report that would be updated once a day, look for the most recent version of the report data, which would have some form of date stamp. (One way to speed this up is just to look for a single matching record by using SELECT TOP 1 or the SET ROWCOUNT command.)

    2. If there is no data that matches, or if the existing data is old, flush the existing data (if needed) and compile the new data to the table in question, all in the context of a transaction.

    3. Return the compiled data as the result of the stored procedure.

    4. Use the returned dataset to format the report.
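
To make the outline above concrete, here is a minimal T-SQL sketch of such a stored procedure. It builds on the hypothetical SalesReportCache table shown earlier and assumes a one-day freshness window; the source table dbo.Orders and the @Department parameter are likewise illustrative assumptions:

    CREATE PROCEDURE dbo.GetSalesReport
        @Department varchar(50)
    AS
    BEGIN
        SET NOCOUNT ON;

        -- 1. Check for up-to-date data by looking for a single matching record.
        --    Here "up to date" means compiled within the last day; adjust as needed.
        IF NOT EXISTS (SELECT TOP 1 1
                       FROM dbo.SalesReportCache
                       WHERE Department = @Department
                         AND CompiledOn >= DATEADD(day, -1, GETDATE()))
        BEGIN
            -- 2. Flush the stale rows and compile fresh data,
            --    all in the context of a transaction.
            BEGIN TRANSACTION;

            DELETE FROM dbo.SalesReportCache
            WHERE Department = @Department;

            INSERT INTO dbo.SalesReportCache (CompiledOn, Department, TotalSales, OrderCount)
            SELECT GETDATE(), @Department, ISNULL(SUM(o.Amount), 0), COUNT(*)
            FROM dbo.Orders AS o              -- hypothetical source table
            WHERE o.Department = @Department;

            COMMIT TRANSACTION;
        END;

        -- 3. Return the compiled data as the result of the stored procedure.
        SELECT CompiledOn, Department, TotalSales, OrderCount
        FROM dbo.SalesReportCache
        WHERE Department = @Department;
    END;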

As you can guess, returning static data from a table (especially if you use NOLOCK or other read-acceleration mechanisms) is much faster than compiling it anew from a query. This approach might not work for all queries -- for instance, queries that involve sophisticated multiple joins or child reports might not map easily to a flat table -- but it's an approach worth keeping in mind.
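
For instance, the report's dataset query (or the final SELECT in the sketch above) could read the cached rows with a NOLOCK hint. Bear in mind that NOLOCK permits dirty reads, which is tolerable here only because the cached rows change at most once a day:

    -- Read the pre-compiled rows without taking shared locks.
    -- NOLOCK allows dirty reads, acceptable only because this data is effectively static.
    SELECT CompiledOn, Department, TotalSales, OrderCount
    FROM dbo.SalesReportCache WITH (NOLOCK)
    WHERE Department = 'Sales';               -- hypothetical parameter value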

About the author: Serdar Yegulalp is editor of the Windows Power Users Newsletter. Check it out for the latest advice and musings on the world of Windows network administrators -- and please share your thoughts as well!


More information from SearchSQLServer.com

  • Tip: Speed CSV imports into Excel
  • Book Excerpt: Installing and configuring Reporting Services
  • Tip: Unreadable snapshots in Reporting Services

