How are performance metrics collected for my tests?
Sauce Labs collects history for performance metrics based on the combination of test name and platform. The browser version is currently ignored.
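As an illustrative sketch of this grouping (the key structure and field names here are assumptions for demonstration, not Sauce Labs' actual schema), metric history can be thought of as bucketed by a (test name, platform) key, with the browser version deliberately left out of the key:

```python
from collections import defaultdict

def history_key(test_name, platform, browser_version=None):
    # Hypothetical grouping rule: browser_version is deliberately ignored,
    # so runs on Chrome 120 and Chrome 121 share one history bucket.
    return (test_name, platform)

history = defaultdict(list)
runs = [
    {"name": "checkout-flow", "platform": "Windows 11", "browser_version": "120", "load_ms": 1200},
    {"name": "checkout-flow", "platform": "Windows 11", "browser_version": "121", "load_ms": 1150},
    {"name": "checkout-flow", "platform": "macOS 13", "browser_version": "121", "load_ms": 980},
]
for run in runs:
    key = history_key(run["name"], run["platform"], run["browser_version"])
    history[key].append(run["load_ms"])

# Both Windows runs land in the same bucket despite different browser versions.
print(history[("checkout-flow", "Windows 11")])  # → [1200, 1150]
```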
Can everyone in my organization view the front end application performance stats?
Because performance metrics are part of the job details, visibility of the performance data follows the visibility of the job.
Why aren't performance metrics captured for all of the URLs that were accessed during the test?
We currently capture performance only for page transitions (hard or soft) that are triggered, for example, by the navigate command or by a click. There are many ways to trigger such a page transition, and we might not catch all of them. Please file a support ticket if we fail to capture performance metrics for a specific page load on your page.
Which browsers are supported?
As of now, performance metrics can only be captured on Google Chrome.
How are baselines defined?
Baselines are determined by calculating a confidence interval over the prior runs of the same test name for each URL visited during the test. The confidence interval is calculated to estimate the range of values which are expected in future runs, based on past observed performance. Observations outside of the baseline range are statistically unlikely to be observed in the absence of some fundamental change in the application’s performance.
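As an illustrative sketch of this idea (not Sauce Labs' exact algorithm; the z-multiplier and metric values are assumptions), a baseline range can be computed from the mean and standard deviation of prior runs, and a new observation flagged when it falls outside that range:

```python
import statistics

def baseline(prior_values, z=3.0):
    # Confidence-interval-style range estimated from past observations:
    # values outside (mean - z*stdev, mean + z*stdev) are treated as
    # statistically unlikely without a real change in performance.
    mean = statistics.mean(prior_values)
    stdev = statistics.stdev(prior_values)
    return mean - z * stdev, mean + z * stdev

def is_regression(value, prior_values, z=3.0):
    low, high = baseline(prior_values, z)
    return not (low <= value <= high)

# Hypothetical page-load times (ms) from prior runs of the same test name.
prior_loads = [1010, 990, 1005, 995, 1000, 1002]
print(is_regression(1004, prior_loads))  # inside the baseline range → False
print(is_regression(1500, prior_loads))  # far outside the range → True
```

The wider the multiplier `z`, the fewer false alarms but the slower the detection of genuine regressions; a real implementation would tune this trade-off per metric.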