
Is there a way to optimize a report for use with large datasets?
What would you recommend as the maximum number of records to include in a report?
22 Answers, 1 is accepted
How exactly do you bind the report? I've just performed a couple of tests with 1000+ records on our end and the retrieval time was more than satisfactory - around 15 seconds. Slowdown usually comes when using filtering, grouping, sorting, and cascading parameters along with subreports. If this is your case, you might want to try moving these operations to the database level so that you do not have to retrieve the whole data set and then work with it.
Regards,
Steve
the Telerik team
Check out Telerik Trainer, the state of the art learning tool for Telerik products.

I'm not using filtering, grouping, sorting, cascading parameters, or subreports. I'm binding the report dynamically using code similar to this:
Dim dt As DataTable = GetData()
Dim rpt As New Report1()
rpt.DataSource = dt
ReportViewer1.Report = rpt
I tried a larger dataset (> 5,000 records) and it still takes a long time and causes a huge CPU hit.
Well, 5000 or even 2000 records are quite a few, and it takes the rendering engine some time to calculate the layout and render the report. It is normal for the processor to take a hit during report processing. What is not normal is the time - I've just run a test with 5000 records and did not have any timeout problems. On the contrary, the report was rendered in a little over a minute. I've attached it here for your convenience.
Kind regards,
Steve
the Telerik team

I am facing the same problem in my web app.
I have 88,000 records in my database and I am binding them without any filtering, sorting, or other operations.
I have used only one field of my table in the report, and it takes 5+ minutes to display.
I think the reporting tool is caching records and showing them again when the user navigates between pages/records!
And I think this happens because a large amount of data is kept in memory for the user's further requests (record navigation).
I have also tried the Crystal Reports tool; it takes barely 20 seconds to display my data.
For your information, I am binding a DataTable object to my report.
Regards,
Divyesh Chapaneri
As you have noticed correctly, every time the report is previewed, paged, or exported through the viewer, it is processed and rendered from scratch. Obviously, this is not the best way to do it, but we currently do not have any caching mechanism. Every report instantiation will cache the whole report in the server's memory (because we do not know at design time how many items will fit on a single page) - this is, of course, per session. This cache is not only the data but the whole report being processed - and every report item has numerous properties.
Our current goal is to cover some reporting functionality we still lack, and then we will move on to performance optimizations and productivity. Anyway, this task is already on our TODO list and we hope to have the chance to work on it soon.
Generally, a report's purpose is to transform information into "knowledge", and rendering thousands of pages is not that kind of transformation; rather, it produces a huge document that one might (or might not) go through. So if your main goal is to show this data for web presentation only, we can suggest a grid control instead; moreover, our RadGrid control exposes an API for PDF, CSV, Word, and Excel exporting.
Sincerely yours,
Steve
the Telerik team
Instantly find answers to your questions on the new Telerik Support Portal.
Check out the tips for optimizing your support resource searches.

I am having a similar issue.
I keep coming back to look at Telerik Reporting, as I like some of the features and love the other tools Telerik provides, but I keep going back to SQL Reporting.
This is due to the speed. It's simply unusable with any sort of real data, i.e. 10k+ records.
While Telerik's approach is to load it all, SQL Reporting seems to load only the page it needs and uses SQL logic such as counts to maximize performance.
I'm not sure if Telerik plans to optimize and do a real-world comparison of Telerik Reporting against SQL Server Reporting. I think you'd be surprised at how much quicker SQL is - e.g. for 30k records: 25 seconds versus 2 seconds for SQL.
Andrei

Has there been any progress on the performance of Telerik Reporting since my last post?
I love the solution, but it was too slow compared to SSRS the last time I checked.
Andrei
We're making improvements to the product with each version, so using the latest version is always recommended. Specifically, if you are using the Table item to show a large number of records, you will notice a significant improvement in the Q3 2010 SP1 version, namely in "Table body generation performance." If you have not tried the Table item, we highly recommend that you change your report layout to take advantage of it.
Additionally the Data Source Components introduced in Q1 2010 allow you to have greater control over your data which brings performance benefits.
Greetings,
Steve
the Telerik team

Unfortunately it is still too slow for real-world use with our databases. SSRS is still 10x quicker than Telerik Reporting on the same data and the same output format.

You don't have a reporting system, you have a reporting toy.
<Admin: removed offensive comments>
First, let's note that this thread was opened in 2008 and a lot has changed since then; namely, we've introduced our own Data Source Components that are optimized to work with Telerik reports, and we have made various memory-footprint and performance improvements, especially in the latest Q2 2011 version.
We would like to ask for clarification/elaboration:
- define what 5000 lines means - is that 5000 database records?
- what is the report layout i.e. how many report items have you used?
- how many pages does the report generate and do you observe horizontal paging?
- what is the format in which you encounter problems?
- do you receive any errors?
Regards,
Steve
the Telerik team
Explore the entire Telerik portfolio by downloading the Ultimate Collection trial package. Get it now >>

I'm currently binding using a DataSet, as that is what our data access layer was already returning. What type of performance gains can I expect if I rework things to use Telerik.Reporting.SqlDataSource instead of the DataSet?
The export capabilities of Telerik Reporting are top notch and the number one reason I am trying to push through with it. It will enable us to easily set up a scheduling service around the reports, whereas using the RadGrid limits the export smoothness and proves more complicated to run when not on a web server (think Worker Role in Azure).
The benefit of switching to the SqlDataSource component would mainly be that you work with it directly in the report designer and that your whole data set would not be serialized in the case of an out-of-proc session.
There would be little to no performance gain, as in the first case you fill a DataSet, while in the second we fill our internal objects.
The good news is that we continually march forward optimizing the supported export formats, so expect those times to change with subsequent versions of the product.
Best wishes,
Steve
the Telerik team

I have the same problem with a huge amount of data in a Telerik report.
I use Q3 2011 and it is not efficient at loading huge data sets.
I have a very simple report layout, but loading about 13,000 records across 385 pages is very time consuming - about 5 minutes.
I use a report parameter as a criterion to filter my data, but sometimes I need to load all the data.
Now my question is:
Is there any way to warn the user before rendering a report with a huge amount of data?
That is, when I hit the preview button: check whether the data is huge before rendering, alert the user, and ask for confirmation?

- I allow the users to define a MAX number of rows to return per query.
- I gave them the option to either clip the results at MAX rows or be notified that too many rows were being returned.
I then handle data binding in the NeedDataSource event handler. If the query returns more than MAX rows and the user chose clipping, I truncate the query results to MAX rows. Otherwise, if they chose the warning, I throw a ToManyRowsException custom exception that notifies the user that they should apply a filter to reduce the result set further. The report viewer handles displaying the exception message to the user.
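A minimal sketch of this pattern in VB.NET (GetData, MaxRows, ClipResults, and the custom ToManyRowsException are placeholders modeled on the description above; binding the processing report instance in NeedDataSource is standard Telerik Reporting usage):

```vb
' Sketch only - GetData, MaxRows, ClipResults, and ToManyRowsException
' are hypothetical members standing in for the poster's own code.
Private Sub Report1_NeedDataSource(ByVal sender As Object, ByVal e As EventArgs) Handles MyBase.NeedDataSource
    Dim dt As DataTable = GetData()

    If dt.Rows.Count > MaxRows Then
        If ClipResults Then
            ' Clip: keep only the first MaxRows rows.
            While dt.Rows.Count > MaxRows
                dt.Rows.RemoveAt(dt.Rows.Count - 1)
            End While
        Else
            ' Warn: the report viewer displays the exception message.
            Throw New ToManyRowsException(
                "Too many rows returned - apply a filter to reduce the result set.")
        End If
    End If

    ' In NeedDataSource, the sender is the processing instance of the report.
    DirectCast(sender, Telerik.Reporting.Processing.Report).DataSource = dt
End Sub
```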

Hi Steve,
I have 16,000 records in a table, and loading those records into the report takes almost 9 minutes. There is no subreport, no filter, no calculation - nothing; just fetching records from the database and assigning them to the report.
I am using this report in ASP.NET, so I attached the debugger to check whether my code was taking the time - but no, after all my code executes, the report itself takes the 9 minutes.
It shows only a loading symbol for 9 minutes. The client wants to load even more data than this - how can I improve performance?
Is there any suggestion that will improve my report's performance?
Thanks

I guess the definition of "large" is somewhat relative. I have several reports that are 800-900 pages long, consisting of detailed billing statements for a local municipality. One would expect formatting hundreds of pages to take some time; however, I have found the Telerik suite to be quite responsive.
How are you binding the data to your report? Instead of having my report go out and talk to the database, I provide a business object which holds the data required to print a given report. I use List<BusinessObjectName>-type objects and simply bind them to my report's data source. That method provides all the data to the report in one whack. On complex reports (with multiple subreports) you can use the various events to bind to the current row of data as required. I mentally think of this as equivalent to a dataset.
This allows you to use whatever technologies you want to retrieve the data, with whatever optimized method you have at your disposal. The report simply binds to your supplied list of business objects. If it becomes necessary to rebind the data, such as when the operator chooses to export to PDF, the report already has all the data in memory.
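A minimal sketch of this approach (BillingLine and LoadBillingLines are hypothetical names; assigning a generic list to the report's DataSource is standard Telerik Reporting usage):

```vb
' Sketch: bind a pre-fetched list of business objects to the report.
' BillingLine and LoadBillingLines() are hypothetical placeholders.
Public Class BillingLine
    Public Property Account As String
    Public Property Amount As Decimal
End Class

' Retrieve the data with whatever optimized method you have,
' then hand the whole list to the report in one whack.
Dim lines As List(Of BillingLine) = LoadBillingLines()

Dim rpt As New Report1()
rpt.DataSource = lines
ReportViewer1.Report = rpt
```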
NOTE: I found exporting directly to PDF to be extremely fast using the report book options. I have a budget application that renders thousands of pages to PDF in minutes rather than hours.

Hi David,
Thanks for the reply - yes, I see what you mean.
Is there any way to set a page limit for the data displayed on screen, while exporting still includes all the data? That way I could save rendering time.
Thanks

Hi guys, I am having a similar problem. My report doesn't show any data when there are more than around 2,000 rows. How do I fix this? Also, is there any option for paging a table, e.g. 100 rows per page? I can't see any paging property for a table.
Please reply with guidance.
Please check the last update in the "split large table on multiple report pages" forum thread.
I hope this information is helpful.
Regards,
Stef
Telerik
It's amazing how long these reports take. Our application is 64-bit, so the memory limit is no longer an issue like it was before, but it still takes forever to load data into a report. Just as someone else stated: 9-10 minutes sitting at "Generating Report...", and when the report finally comes up, it still has to render all the pages. I know the query is quick, because I can run the exact same query in LINQPad and it takes only about 13 seconds to run and download the data into LINQPad's data grid.
So, my data source is a var from a LINQ query. Should I be using some other type?
Thanks
Data retrieval is only the first step in the document's generation - see Report Life Cycle. The retrieved data causes the report template to be processed for each data record, which requires additional resources (memory, CPU).
To avoid processing and rendering the report for the whole data set in preview, we suggest filtering the data on retrieval (server-side) via report parameters and previewing smaller chunks. If the user needs the whole document, it can be exported programmatically with all data records (see Exporting Report Programmatically), which saves the resources needed to display the whole document.
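A minimal sketch of the programmatic export mentioned above (Report1 is a placeholder for your report class; ReportProcessor and InstanceReportSource are part of the Telerik.Reporting API, though exact members can vary by product version):

```vb
' Sketch: render the full report straight to a PDF file,
' without paying the cost of displaying it in a viewer.
Dim processor As New Telerik.Reporting.Processing.ReportProcessor()

Dim source As New Telerik.Reporting.InstanceReportSource()
source.ReportDocument = New Report1()   ' hypothetical report class

Dim result As Telerik.Reporting.Processing.RenderingResult =
    processor.RenderReport("PDF", source, Nothing)

System.IO.File.WriteAllBytes("Report1.pdf", result.DocumentBytes)
```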
Depending on the viewer, you can try enabling the reporting cache mechanisms to reduce memory usage - see Cache Management. More suggestions can be found in Performance Considerations.
I hope this information is helpful.
Regards,
Stef
Telerik