We've reached the point where we need to display very large sets of data. Currently our reports don't exceed a few hundred pages at most. When I say larger, I'm not sure exactly how large yet; all I know is that a few of the reports I've tried to test ended with the server running out of memory during "Generate Report".
My question here is: what are the practical stress limits of the Telerik Reporting tool? How many pages of data, with say 10 columns, can the viewer show before it simply can't? I know a lot of that depends on the server's hardware specs, but there has to be a ballpark number somewhere.
I've also been reading over the knowledge base article, trying to tweak some performance settings. That hasn't helped much, though, since subreports won't work in my particular situation: some of these larger reports are just flat forms of data that can't be broken out.
So from all this, I have two major questions:
1) Is there a way to put a timeout on "Generating Report" so that the server doesn't sit there and spin, bogging everyone else down?
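To be clear about what I mean by a timeout: if the reporting tool doesn't offer one built in, I'd expect to have to wrap the blocking generation call myself. Conceptually, something like this generic sketch (plain Python, not the Telerik API; `fast_report`/`slow_report` are hypothetical stand-ins for the real generation call):

```python
import concurrent.futures
import time

def generate_with_timeout(generate, timeout_seconds):
    """Run a blocking report-generation callable with a hard time limit."""
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    future = pool.submit(generate)
    try:
        return future.result(timeout=timeout_seconds)
    except concurrent.futures.TimeoutError:
        # Give up so the request doesn't hang; the worker is abandoned
        # rather than left to bog down every other user.
        return None
    finally:
        pool.shutdown(wait=False)

# Hypothetical stand-ins for a small report and a runaway one.
fast_report = lambda: "small report"
slow_report = lambda: time.sleep(3) or "huge report"
```

That at least caps how long a single request can spin, although the abandoned worker still burns memory until it finishes, which is why a real built-in timeout/cancellation would be preferable.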
2) Is there a way to avoid loading the entire report into memory? Is there a built-in way to not have every page created and stored at initial run time?
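What I'm hoping exists is some form of on-demand paging, where pages are materialized lazily as the viewer requests them instead of all at once up front. A generic sketch of the idea (plain Python generator, not anything Telerik-specific; `render_pages` and `rows_per_page` are names I made up):

```python
def render_pages(rows, rows_per_page=50):
    """Lazily yield one page of rows at a time, so only the page
    currently being viewed has to exist in memory."""
    page = []
    for row in rows:
        page.append(row)
        if len(page) == rows_per_page:
            yield page
            page = []
    if page:
        # Final, partially filled page.
        yield page
```

If the viewer worked this way, a flat report of hundreds of thousands of rows would cost memory proportional to one page rather than the whole document.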
Basically, I'm looking for any and all performance-tuning practices I could use to make these very large reports run.
Also, the database returns the datasets fairly quickly; the database has never been the problem in any of this.