Thank you for the files.
1. My previous explanation referred to an action step, where a built-in "wait for exists" always runs first. In this case, step 5 is a Wait step itself, not an action step. The time taken to find the LogOutGrid element is charged to step 4, and the 1.298 seconds reported for step 5 is the interval between when the element was found and when it became visible.
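To make the timing attribution concrete, here is a small illustrative sketch (plain Python, not Test Studio's actual implementation): the element search, including the built-in "wait for exists", is charged to the preceding action step, while a Wait step only measures the interval from when the element was found until the waited-for condition (e.g. visibility) holds. The function names and structure are hypothetical.

```python
import time

def run_action_step(find_element, act):
    """Run an action step; the element search time is part of this step."""
    start = time.monotonic()
    element = find_element()          # built-in "wait for exists" happens here
    found_at = time.monotonic()
    act(element)
    step_time = time.monotonic() - start
    return element, found_at, step_time

def run_wait_step(element, condition, found_at, poll=0.05):
    """Run a Wait step; only the time from 'found' until the condition
    holds (e.g. the element becomes visible) is reported for this step."""
    while not condition(element):
        time.sleep(poll)
    return time.monotonic() - found_at
```

In this model, the 1.298 seconds for step 5 corresponds to the value returned by `run_wait_step`, while the element-location cost is already inside step 4's `step_time`.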
Keep in mind that Wait and Verification steps don't really belong in performance testing, as explained here, so the times reported for those step types are hard to interpret in this context.
2. I did not see a dramatic difference in times between running your test with and without Test Studio.
I first ran through the steps manually in IE, outside of Test Studio, and saw CPU usage spike as high as 80% at times. When run through Test Studio, the average CPU usage was slightly higher, which I expected given the additional automation overhead; the maximum usage, however, never exceeded what I saw during the manual run.
3. The Size column represents the incoming data size received from the server. No client information is included.
4. Set RecycleBrowser to True on the Web tab of Test List Settings.
the Telerik team