Large datasets

23 posts, 0 answers
  1. kelly moore
    7 posts
    Member since:
    Dec 2005

    Posted 18 Nov 2008 Link to this post

    I've noticed when binding a report to a large dataset (>1000 records) that the response time of the report creation in the browser is extremely slow and the CPU takes a big hit. Most of the time the browser times out and the report never loads. Retrieving the data from the server is not the problem, as it is very fast and efficient. I would really like to use the reporting tool for large datasets but I don't know if it's going to be possible. My questions are below:

    Is there a way to optimize a report for use with large datasets?
    What would you recommend as the max amount of records to include on a report?
  2. Steve
    Admin
    10941 posts

    Posted 18 Nov 2008 Link to this post

    Hi Kelly,

    How exactly do you bind the report? I've just performed a couple of tests with 1000+ records on our end and the retrieval time was more than satisfactory - around 15 seconds. Slowdown usually comes from filtering, grouping, sorting, and cascading parameters, along with subreports. If this is your case, you might want to move these operations to the database level so that you do not have to retrieve the whole dataset and then work with it.
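    Moving these operations to the database level, as suggested above, can be sketched in plain SQL (the Orders table, its columns, and the @StartDate parameter are hypothetical); the report then binds to a small, pre-aggregated result set instead of the raw rows:

```sql
-- Aggregate and filter on the server so the report only receives summary rows
SELECT   CustomerId,
         COUNT(*)      AS OrderCount,
         SUM(TotalDue) AS TotalAmount
FROM     Orders
WHERE    OrderDate >= @StartDate      -- filter before the data leaves the database
GROUP BY CustomerId
ORDER BY TotalAmount DESC;
```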

    Regards,
    Steve
    the Telerik team

    Check out Telerik Trainer, the state of the art learning tool for Telerik products.
  4. kelly moore
    7 posts
    Member since:
    Dec 2005

    Posted 18 Nov 2008 Link to this post

    I'm not using filtering, grouping, sorting, cascading parameters, or subreports. I'm binding the report dynamically using code similar to the below:

    Dim dt As DataTable = GetData()
    Dim report1 As New Report1()
    report1.DataSource = dt
    ReportViewer1.Report = report1

    I tried a larger dataset (>5000 records) and it still takes some time and causes a huge CPU hit.

  5. Steve
    Admin
    10941 posts

    Posted 19 Nov 2008 Link to this post

    Hi Kelly,

    Well, 5000 or even 2000 records are quite a few, and it takes the rendering engine some time to calculate the layout and render the report. It is normal for the processor to take a hit from the report processing. What is not normal is the time - I've just run a test with 5000 records and did not have any timeout problems. On the contrary, the report was rendered within a minute or so. I've attached it here for your convenience.

    Kind regards,
    Steve
    the Telerik team

  6. Gourangi
    83 posts
    Member since:
    Mar 2008

    Posted 23 Feb 2009 Link to this post

    Hi,

    I am facing the same problem in my web app.
    I have 88,000 records in my database and I am binding them without any filtering, sorting, or other operations.
    I have used only one field of my table (in the report), and it takes 5+ minutes to display the report.
    I think the reporting tool is caching records and showing them again when the user navigates between pages/records!
    And I think this is happening because a large amount of data is kept in memory for the user's further requests (navigating records).
    I have also used the Crystal Reports tool; it takes hardly 20 seconds to display my data.

    For your information, I am binding a DataTable object to my report.

    Regards,
    Divyesh Chapaneri
  7. Steve
    Admin
    10941 posts

    Posted 24 Feb 2009 Link to this post

    Hello Divyesh,

    As you have noticed correctly, every time the report is previewed, paged, or exported through the viewer, it is processed and rendered from scratch. Obviously, this is not ideal, but we currently do not have any caching mechanism for rendered output. Every report instantiation keeps the whole processed report in the server's memory (because we do not know at design time how many items will fit on a single page) - this is, of course, per session. This in-memory state is not only data but the whole report being processed - and every report item has numerous properties.
    Our current goal is to cover some reporting functionality we still lack, and then we will move on to performance optimizations and productivity. This task is already on our TODO list and we hope to have the chance to work on it soon.
    Generally, a report's purpose is to transform information into "knowledge", and rendering thousands of pages is not that kind of transformation; rather, it produces a huge document that one might (or might not) read through. So if your main goal is to show this data on the web only, we suggest a grid control instead; moreover, our RadGrid control exposes an API for PDF, CSV, Word, and Excel export.

    Sincerely yours,
    Steve
    the Telerik team

    Instantly find answers to your questions on the new Telerik Support Portal.
    Check out the tips for optimizing your support resource searches.
  8. Andrei
    7 posts
    Member since:
    Jan 2010

    Posted 02 Jul 2010 Link to this post

    Hi

    I am having a similar issue.

    I keep coming back to look at Telerik Reporting, as I like some of the features and love the other tools Telerik provides, but I keep going back to SQL Reporting.
    This is due to the speed. It's simply unusable with any sort of real data, i.e. 10k+ records.
    While Telerik's approach is to load everything, SQL Reporting seems to load only the page it needs and uses SQL logic, such as SQL counts, to maximize performance.
    I'm not sure if Telerik plans to optimize and run a real-world comparison of SQL Server Reporting against Telerik Reporting. I think you'd be surprised at how much quicker SQL is... e.g. for 30k records: Telerik 25 seconds, SQL 2 seconds.

    Andrei
  9. Andrei
    7 posts
    Member since:
    Jan 2010

    Posted 23 Feb 2011 Link to this post

    Hi

    Has there been any progress on the performance of Telerik Reporting since my last post?

    I love the solution, but it was too slow compared to SSRS the last time I checked.

    Andrei
  10. Steve
    Admin
    10941 posts

    Posted 23 Feb 2011 Link to this post

    Hello Andrei,

    We're making improvements to the product with each version, so using the latest version is always recommended. Specifically, if you are using the Table item to show a large number of records, you will notice a significant improvement in the Q3 2010 SP1 version, namely "Table body generation performance." If you have not tried the Table item, we highly recommend that you change your report layout to take advantage of it.
    Additionally, the Data Source Components introduced in Q1 2010 allow you greater control over your data, which brings performance benefits.

    Greetings,
    Steve
    the Telerik team
    Do you want to have your say when we set our development plans? Do you want to know when a feature you care about is added or when a bug fixed? Explore the Telerik Public Issue Tracking system and vote to affect the priority of the items
  11. Andrei
    7 posts
    Member since:
    Jan 2010

    Posted 28 Feb 2011 Link to this post


    Unfortunately, it is still too slow for real-world use with our databases. SSRS is still 10x quicker than Telerik Reporting on the same data and the same output format.
  12. mitchell
    5 posts
    Member since:
    Jun 2009

    Posted 22 Sep 2011 Link to this post

    5000 lines is a big report? Are you kidding?

    You don't have a reporting system, you have a reporting toy.

    <Admin: removed offensive comments>
  13. Steve
    Admin
    10941 posts

    Posted 28 Sep 2011 Link to this post

    Hi Mitchell,

    First, let's note that this thread was opened in 2008 and a lot has changed since then: we've introduced our own Data Source Components that are optimized to work with Telerik reports, and we have made various memory-footprint and performance improvements, especially in the latest Q2 2011 version.

    We would like to ask for clarification/elaboration:
    • define what 5000 lines means - is that 5000 database records?
    • what is the report layout i.e. how many report items have you used?
    • how many pages does the report generate and do you observe horizontal paging?
    • what is the format in which you encounter problems?
    • do you receive any errors?
    I've just made a quick test with ~20,000 database records in a simple report and did not encounter any problems. The sample is attached here for your convenience.

    Regards,
    Steve
    the Telerik team

    Explore the entire Telerik portfolio by downloading the Ultimate Collection trial package. Get it now >>

  14. Marcus Kellermann
    56 posts
    Member since:
    Jul 2004

    Posted 03 Oct 2011 Link to this post

    In my case, a large dataset means ~2,200 rows, which results in about 35-55 pages depending on the layout, but it could be as many as 17,000 rows and 250 pages. There is one grouping on the page, and the group header is repeated for each group and on each page. Each row in the detail section has 10 text boxes, with two conditional formatting statements: one does the alternating-row logic and the other changes the color of the row font if a value is exceeded. According to the trace statement in VS 2010's output window, it takes 0.4 seconds to retrieve the data and 45 seconds to render the report. That's a pretty poor ratio.

    I'm currently binding using a DataSet, as that is what our data access layer was already returning. What kind of performance gains can I expect if I rework things to use Telerik.Reporting.SqlDataSource instead of the DataSet?

    The export capabilities of Telerik Reporting are top notch and the number one reason I am trying to push through with it. It will enable us to easily set up a scheduling service around the reports, whereas using the RadGrid limits the export smoothness and proves more complicated to run when not on a web server (think a Worker Role in Azure).



  15. Steve
    Admin
    10941 posts

    Posted 06 Oct 2011 Link to this post

    Hello Marcus,

    The benefit of switching to the SqlDataSource component would mainly be that you work with it directly in the report designer and that your whole dataset would not be serialized in the case of an out-of-proc session.
    There would be little to no performance gain, as in the first case you fill a DataSet, while in the second we fill our internal objects.
    The good news is that we continually march forward optimizing the supported export formats, so expect those times to improve with subsequent versions of the product.

    Best wishes,
    Steve
    the Telerik team


  16. Alirzea Fatehi
    2 posts
    Member since:
    Nov 2009

    Posted 27 Dec 2011 Link to this post

    Hi
    I have the same problem with a huge amount of data in a Telerik report.
    I use Q3 2011 and it is not efficient at loading huge data.
    I have a very simple report layout, but loading about 13,000 records across 385 pages is very time consuming - about 5 minutes.
    I use a report parameter as a criterion to filter my data, but sometimes I need to load all of the data.
    Now my question is:
    Is there any way to show an alert before rendering a report with huge data?
    That is, when I hit the preview button, can I check whether the data is huge before rendering, alert the user, and ask for confirmation?

  17. Marcus Kellermann
    56 posts
    Member since:
    Jul 2004

    Posted 28 Dec 2011 Link to this post

    My solution for this is actually twofold.
    • I allow the users to define a MAX number of rows to return per query.
    • I give them the option to either clip the results at the MAX row or be notified that too many rows were being returned.

    I then handle data binding in the NeedDataSource event handler. If the report returns more than MAX rows and the user chose to clip, I truncate the query results to MAX rows. Otherwise, if they chose the warning, I throw a ToManyRowsException custom exception that tells the user to apply a filter to reduce the result set further. The report viewer handles displaying the exception message to the user.
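    A minimal VB.NET sketch of this clip-or-warn approach (the _userSettings object and GetData helper are hypothetical, and ToManyRowsException is the custom exception mentioned above; assigning to the processing report's DataSource in NeedDataSource follows the documented Telerik pattern):

```vbnet
' Hypothetical NeedDataSource handler implementing the clip-or-warn strategy
Private Sub Report1_NeedDataSource(sender As Object, e As EventArgs)
    Dim maxRows As Integer = _userSettings.MaxRows   ' user-defined row limit
    Dim dt As DataTable = GetData()                  ' your existing data call

    If dt.Rows.Count > maxRows Then
        If _userSettings.ClipResults Then
            ' Keep only the first MAX rows
            Dim clipped As DataTable = dt.Clone()
            For i As Integer = 0 To maxRows - 1
                clipped.ImportRow(dt.Rows(i))
            Next
            dt = clipped
        Else
            ' The report viewer surfaces the exception message to the user
            Throw New ToManyRowsException("More than " & maxRows & " rows returned; please apply a filter.")
        End If
    End If

    ' sender is the processing report instance
    DirectCast(sender, Telerik.Reporting.Processing.Report).DataSource = dt
End Sub
```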



  18. Rahul
    5 posts
    Member since:
    Dec 2014

    Posted 16 Jun 2015 in reply to Steve Link to this post

    Hi Steve,

    I have 16,000 records in a table, and loading those records into the report takes almost 9 minutes. There are no subreports, no filters, no calculations, nothing - just fetching records from the database and assigning them to the report.

    I am using this report in ASP.NET, so I attached the debugger and checked whether my code was taking the time - but no: after all my code has executed, the report itself takes the 9 minutes.

    It shows only the loading symbol for 9 minutes. The client wants to load even more data than this - how can we improve performance?

    Is there any suggestion that will improve my report's performance?

    Thanks

  19. David
    90 posts
    Member since:
    Jan 2011

    Posted 16 Jun 2015 in reply to Rahul Link to this post

    I guess the definition of "large" is somewhat relative. I have several reports that are 800-900 pages long, consisting of detailed billing statements for a local municipality. One would expect formatting hundreds of pages to take some time; however, I have found the Telerik suite to be quite responsive.

    How are you binding the data to your report? Instead of having my report go out and talk to the database, I provide a business object which holds the data required to print a given report. I use List<BusinessObjectName>-type objects and simply bind them to my report's data source. That method provides "all" the data to the report in one whack. On complex reports (with multiple subreports) you can use the various events to bind to the current row of data as required. I mentally think of this as an equivalent to a dataset.

    This allows you to use whatever technologies you want to retrieve the data, using whatever optimized method you have at your disposal. The report simply binds to your supplied list of business objects. If it becomes necessary to rebind the data, such as when the operator chooses to export to PDF, it already has all the data in memory.
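    A minimal sketch of this business-object binding in VB.NET (the BillingLine class, billingService, statementId, and BillingReport names are made up for illustration):

```vbnet
' A plain business object holding one row of report data
Public Class BillingLine
    Public Property Account As String
    Public Property Description As String
    Public Property Amount As Decimal
End Class

' Retrieve the data however you like, then hand the whole list to the report
Dim lines As List(Of BillingLine) = billingService.GetLines(statementId)
Dim report As New BillingReport()
report.DataSource = lines
```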

     NOTE: I found exporting directly to PDF to be extremely fast using the report book options.  I have a budget application that renders thousands of pages to PDF in minutes rather than hours.

  20. Rahul
    5 posts
    Member since:
    Dec 2014

    Posted 19 Jun 2015 in reply to David Link to this post

    Hi David,

    Thanks for the reply - yes, I see what you mean.

    Is there any way to set a page limit for displaying data, while exporting still exports all the data? That way I can save rendering time.

    Thanks

  21. Yogi
    4 posts
    Member since:
    Sep 2015

    Posted 30 Oct 2015 in reply to Rahul Link to this post

    Hi guys, I am having a similar problem. My report doesn't show any data when there are more than around 2,000 rows. How do I fix this? Also, is there any option for paging a table, e.g. 100 rows per page? I can't see any paging property on the table.

    Please reply with guidance.

  22. Stef
    Admin
    3044 posts

    Posted 30 Oct 2015 Link to this post

    Hi Yogi,

    Please check the latest update in the "split large table on multiple report pages" forum thread.

    I hope this information is helpful.

    Regards,
    Stef
    Telerik
    Do you want to have your say when we set our development plans? Do you want to know when a feature you care about is added or when a bug fixed? Explore the Telerik Feedback Portal and vote to affect the priority of the items
  23. Mark
    35 posts
    Member since:
    Oct 2014

    Posted 14 Apr Link to this post

    It's amazing how long these reports take. Our application is 64-bit, so the memory limit is no longer an issue like it was before, but it takes forever to load data into a report. Just like someone else stated: 9-10 minutes sitting at "Generating Report...", and then when the report comes up, it still has to render all the pages. I know the query is quick, because I can use LINQPad to run the exact same query, and it only takes about 13 seconds to run the query and download the data into LINQPad's data grid.

    So, my data source is a var from a LINQ query. Should I be using some other type?

    Thanks

  24. Stef
    Admin
    Stef avatar
    3044 posts

    Posted 15 Apr Link to this post

    Hello Mark,

    Data retrieval is only the first step in the document's generation - see Report Life Cycle. The retrieved data causes the report template to be processed for each data record, which requires additional resources (memory, CPU).

    To avoid processing and rendering the report for the whole dataset in preview, we suggest filtering the data on retrieval (server-side) via report parameters and previewing smaller subsets. If the user needs the whole document, it can be exported programmatically with all data records - see Exporting Report Programmatically - which saves you the resources of displaying the whole document.
    Depending on the viewer, you can also test enabling the reporting cache mechanisms to reduce memory usage - see Cache Management. More suggestions can be found in Performance Considerations.
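    Programmatic export, as suggested above, might look roughly like this in VB.NET (the MyLargeReport class and output path are placeholders); RenderReport produces the finished document in memory, so no viewer is involved:

```vbnet
' Render the full report straight to PDF without a viewer
Dim processor As New Telerik.Reporting.Processing.ReportProcessor()
Dim source As New Telerik.Reporting.InstanceReportSource()
source.ReportDocument = New MyLargeReport()   ' placeholder report class

Dim result = processor.RenderReport("PDF", source, Nothing)
System.IO.File.WriteAllBytes("report.pdf", result.DocumentBytes)
```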


    I hope this information is helpful.

    Regards,
    Stef
    Telerik