Crazy report layout - huge memory

6 Answers 92 Views
General Discussions
This is a migrated thread and some comments may be shown as answers.
Steven
Steven asked on 21 Feb 2013, 09:52 PM
I have this crazy, crazy report that was ported over from SSRS, and I have a lot of memory issues due to the number of elements on the report.

There is an unbounded number of accounts with a user-defined order.
Each account is a different type and has a very different content layout.
Each account can span multiple output tables (AccountType #3 is output in Table03 then Table07, #4 = Table02, 05 and 06).

The report works, but it consumes a lot of processing memory.
Each of the tables has a binding "DataSource=ReportItem.Parent.DataObject" and has Table.Filters as well as Row.Filters.
Each table has on average 20 columns and 10 rows.
Within the table ItemDataBound event I can see that each Processing.ReportSection of Group2 contains a full stack even though most of the tables are empty.  Using reflection I can purge empty tables from the ReportSections.items collection in the Group2 ItemDataBound event.  Doing this during the Table ItemDataBound event instead causes an error in the "ReportSectionBase.ProcessItem().items" enumeration.
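Roughly, the purge looks like this - a sketch only, not supported API. The handler and class names are placeholders, the "items" field is a non-public member reached via reflection (so it may move or be renamed between versions), IsEmptyTable is a stub for whatever "empty" means for your data, and it assumes the processing counterpart of a table is Telerik.Reporting.Processing.Table.

using System;
using System.Collections;
using System.Reflection;

public partial class CrazyReport : Telerik.Reporting.Report   // hypothetical report class
{
    // Wired to the Group2 header section's ItemDataBound event.
    private void group2Header_ItemDataBound(object sender, EventArgs e)
    {
        // sender is the processing section built for this group instance.
        var itemsField = FindField(sender.GetType(), "items");
        var items = itemsField != null ? itemsField.GetValue(sender) as IList : null;
        if (items == null)
            return;

        // Walk backwards so removals don't shift the indexes still to visit.
        for (int i = items.Count - 1; i >= 0; i--)
        {
            var table = items[i] as Telerik.Reporting.Processing.Table;
            if (table != null && IsEmptyTable(table))
                items.RemoveAt(i);
        }
    }

    // Private fields are not returned for derived types, so search the hierarchy.
    private static FieldInfo FindField(Type type, string name)
    {
        for (; type != null; type = type.BaseType)
        {
            var field = type.GetField(name, BindingFlags.Instance | BindingFlags.NonPublic);
            if (field != null)
                return field;
        }
        return null;
    }

    // Stub: what counts as "empty" is report-specific, e.g. nothing left after
    // the Table.Filters / Row.Filters have been applied.
    private static bool IsEmptyTable(Telerik.Reporting.Processing.Table table)
    {
        return false;
    }
}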

The size and complexity of the tables create an enormous number of processing elements.
The processed report size is 48 MB, but the processing memory is around 450 MB.
If I remove the PageCount, the report drops to 35 MB and the processing memory drops to 350 MB.



[PageHeader]
[Group1] - static group "=True"
  - SubReport - used for static subHeader
[Group2] - loops over about 20-40 account types
 - Table01
 - Table02
 - Table03
 - Table04
 - Table05
 - Table06
 - Table07
 - Table08
 - Table09
 - Table10
[DetailSection] - empty/hidden to push iteration of group content
[Group2-footer] - hidden
[Group1-footer] - hidden
[ReportFooter] - FootNotes
[PageFooter]

6 Answers, 1 is accepted

Chavdar
Telerik team
answered on 26 Feb 2013, 11:29 AM
Hello,

Thank you for the detailed explanation and the screenshots.

According to your description, in a single report you are using between 200 and 400 Tables with around 200 cells each, which is not a small amount. It is quite normal in this case to experience an increase in the memory used, given the functionality that the Table item provides.

Nevertheless, we have short-term plans to review the performance and the memory usage of the reporting engine and make optimizations wherever possible.

Kind regards,
Chavdar
the Telerik team

Steven
answered on 27 Feb 2013, 03:28 AM

I was able to recover some of the processing memory by deleting/purging empty tables from the Telerik.Reporting.Processing.ReportSectionBase.items collection through reflection in the ItemDataBound event, and I was able to reclaim even more in the Table's ItemDataBound event by iterating the table cells and setting empty textboxes' visibility to false.
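The cell iteration boils down to hiding any textbox whose evaluated value ends up empty. The sketch below shows the simpler per-textbox variant (handling ItemDataBound on the textboxes themselves rather than walking the table from its own ItemDataBound, which is what I actually do - the effect is the same), and it assumes the processing TextBox still exposes Value and Visible at that point.

using System;

public partial class CrazyReport : Telerik.Reporting.Report   // hypothetical report class
{
    // Attached to the ItemDataBound event of each textbox that may end up empty.
    private void textBox_ItemDataBound(object sender, EventArgs e)
    {
        var textBox = (Telerik.Reporting.Processing.TextBox)sender;

        // Value already holds the evaluated expression result here.
        if (textBox.Value == null || string.IsNullOrEmpty(textBox.Value.ToString()))
            textBox.Visible = false;
    }
}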


With this I was able to shave the rendering thread's processing memory from 450 MB down to 380 MB.


My next plan is to consolidate everything into one table with extensive row hide/show logic.
Then, by binding to objects, I can run through the set and create markers that will allow me to use simple filters in the table, kind of like 2-pass video encoding (sketched below).

This would eliminate all the generation of the extra tables that are never used.
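By "markers" I mean a pre-pass over the business objects that stamps each one with a simple key, so the table rows only ever need a plain equality filter. Very roughly - all class, property and method names here are made up for illustration:

using System.Collections.Generic;
using System.Linq;

public class AccountRow
{
    public string SchedulType { get; set; }   // the bound type field
    public string RowKey { get; set; }        // marker computed in the first pass
    // ... remaining bound fields
}

public static class ReportDataPrep
{
    // First pass: decide once per object which row layout applies, so the
    // second pass (the table itself) can filter on a single simple key.
    public static List<AccountRow> AddMarkers(IEnumerable<AccountRow> rows)
    {
        var list = rows.ToList();
        foreach (var row in list)
        {
            // Collapse whatever layout decision used to live in per-table
            // filters into one string the row filters can compare against.
            row.RowKey = row.SchedulType;
        }
        return list;
    }
}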

It would be nice if you could apply or reuse global Eval Functions across multiple objects rather than repeating them 10-20 times.
Like with a StyleRule: when I go to add an expression, I could just select one that already exists on the form and re-attach it to the new item rather than re-creating it, including TableGroup Filters.  I often have very complex TableGroup Filters which need to be applied to multiple rows.
Stef
Telerik team
answered on 02 Mar 2013, 07:23 AM
Hello Steven,

Thank you for your update. Consolidating data into one table is a good approach because, as you have said, you can use filter expressions in a simpler way instead of reassigning the same expression over and over again. In general, as you have noticed, modifying definition declarations and items leads to unexpected errors, which is why we do not recommend a scenario requiring modifications of the report's ItemCollection once processing has begun - when processing has reached one item, changing the definition affects the processing of the following items. Needless to say, the additional hiding and showing of Table items takes time and memory for objects which will never appear on screen.
As my colleague mentioned, we are going to revise our reporting API and investigate possible improvements, so any suggestions are welcome.

Thank you again, and let us know if we can be of any further help.

Greetings,
Stef
the Telerik team


Steven
answered on 06 Mar 2013, 10:26 PM
I've refactored the crazy report from 

[Group1]
   [Group2]
      [Tables 1-10]


To the following

[Table]
  [Panel1 with 5-10 textboxes]
  [Panel2 with 5-10 differently located textboxes]
  .... 60 rows = 60 panels with 5-10 textboxes each

Every old table layout now equates to 4 panels (table rows) with 5-10 textboxes each.
Each table row now has a very simple TableGroup filter expression based on the bound type: "=Fields.SchedulType = {Name}".
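In code, the per-row filters end up looking roughly like this - just a sketch to show the shape (I actually set them up in the designer). The class, table and group names and the compared value are placeholders, and it assumes TableGroup exposes its filters through a Filters collection the same way the designer serializes them.

using Telerik.Reporting;

public static class TableGroupSetup
{
    // Adds one row group filtered down to a single account type.
    public static void AddTypedRowGroup(Table table, string typeName)
    {
        var rowGroup = new TableGroup();
        rowGroup.Name = typeName + "Rows";
        rowGroup.Filters.Add(new Filter(
            "=Fields.SchedulType",           // the bound type field
            FilterOperator.Equal,
            "=\"" + typeName + "\""));       // the {Name} for this row's type
        table.RowGroups.Add(rowGroup);
    }
}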

In theory, I thought it would iterate the table groups and, if the TableGroup expression evaluated to true, append a processing item for that row definition, which would result in far fewer items being processed.

According to the JustTrace memory profiler, the opposite is happening: there seems to be a leak with the Panel's DataMember / DataMemberSlots.

Now I'm assuming that it creates the processing row from the row definition, checks the TableGroup filters, and then simply doesn't process the inner items, even though they have already been created and the row is bound.


What are my options to strip down the excessive "Total Allocated Instances" and clean up the memory retained by the DataMember of the Panel within the Table?


Thanks

Attached are the screen captures from JustTrace:
01_{name} = Crazy Table Layout
04_{name} = New Table Panel Layout


Chavdar
Telerik team
answered on 11 Mar 2013, 03:44 PM
Hello,

It is not possible to give advice only by looking at screenshots. Please open a support ticket and send us the report definition along with some sample data, so we can profile the report and try to figure something out. Thank you in advance.

All the best,
Chavdar
the Telerik team

Steven
answered on 12 Mar 2013, 01:37 AM
I was able to clean up a lot of the memory by purging the DataMember in the ItemDataBound event with reflection.

I figured that in the ItemDataBound event the report item is already populated, the data is no longer relevant to the rendering engine, and exporting from the report viewer should execute against the already processed item.

A refresh of the form refreshes the report from the data source and would then rebuild the data groups.
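For anyone needing the same band-aid, it is roughly this - a sketch only. The field names are guesses based on what JustTrace shows, not documented API, so they may differ or disappear between versions, and the whole thing relies on reflecting non-public members.

using System;
using System.Reflection;

public partial class CrazyReport : Telerik.Reporting.Report   // hypothetical report class
{
    // Wired to the ItemDataBound event of the panels (and any other data-bound
    // items that keep more data alive than rendering still needs).
    private void panel_ItemDataBound(object sender, EventArgs e)
    {
        // At this point the item's values are already evaluated, so the data
        // held by these private members should no longer be needed.
        PurgeField(sender, "dataMember");        // field names are guesses taken
        PurgeField(sender, "dataMemberSlots");   // from the profiler output
    }

    private static void PurgeField(object item, string fieldName)
    {
        // Private fields are not returned for derived types, so walk the hierarchy.
        for (var type = item.GetType(); type != null; type = type.BaseType)
        {
            var field = type.GetField(fieldName, BindingFlags.Instance | BindingFlags.NonPublic);
            if (field != null && !field.FieldType.IsValueType)
            {
                field.SetValue(item, null);
                return;
            }
        }
    }
}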


Attached are some before and afters.

I know this isn't the right way to do it, but it will work as a band-aid until we get a better solution.
We will be running integration tests tomorrow and will reply with any issues.