I have a question about what happens "behind the scenes" with a report filter.
Scenario: I have a web app with a report that uses a SQL Server view as its data source. The view contains data for all clients. For the sake of an example, let's say there are 10,000 clients. The report has 1 ReportParameter, the ClientID, and that parameter is used to filter the results shown on the report.
What happens behind the scenes to pull data onto the report? I can imagine two possibilities:
1) The report viewer opens the view, and 10,000 rows of data are passed from SQL Server to the web server. The report viewer filters the data down to 1 row by ClientID and displays the report to the end user.
OR
2) The report viewer reads the ReportParameter and passes a query string to SQL Server like "SELECT * FROM View WHERE ClientID = 'XYZ'". Then SQL Server passes just 1 row of data to the web server, and the report viewer displays the report.
I expect the answer is #1, but I hope for #2. Clearly, #2 results in a LOT less network traffic.
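To make the distinction concrete, here is a sketch of how I understand the two setups would look in the report's dataset definition (names like ClientView are illustrative; the Filter expression syntax is the usual SSRS Fields!/Parameters! form):

```sql
-- Approach #1: the dataset query pulls everything,
-- and a dataset/region Filter narrows it on the web server:
SELECT * FROM ClientView
-- Filter (evaluated after the data is fetched):
--   Expression: =Fields!ClientID.Value
--   Operator:   =
--   Value:      =Parameters!ClientID.Value

-- Approach #2: the parameter is pushed into the query itself,
-- so SQL Server returns only the matching row(s):
SELECT * FROM ClientView WHERE ClientID = @ClientID
```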