
Capturing the page load performance of an application behind a specific URL covers many use cases and catches the majority of performance problems. However, some problems only surface after the user interacts with the application in a certain way. Because Sauce Labs has integrated performance capturing into WebDriver, you can run arbitrary automation steps to set up the state of your application before asserting its performance.

As a reminder, a best practice for performance testing is to keep your performance tests separate from your existing functional tests because:

  • front end performance testing and functional testing are separate concerns in the testing process and should not be mixed
  • if a combined test fails, it is harder to tell whether the root cause is a functional or a performance problem

Sauce Performance will automatically detect page refreshes to execute a measurement of the page performance metrics. You only need to add the necessary capabilities to your test to enable the performance measurements.

Chrome Only

This feature is available only for testing web applications with the Google Chrome browser.

Enterprise Plans Only

This feature is available for Enterprise plans only. For more information about other benefits that are included with Enterprise plans, check out our Pricing page.

See the following sections for more information: 


Enable Your Script

Adding the Necessary Capabilities

To enable performance capturing, you need to set the extendedDebugging and capturePerformance flags in your capabilities as follows:

PyTest Example
from os import environ

from selenium import webdriver
from selenium.webdriver.remote.remote_connection import RemoteConnection

desired_caps = {
    "platformName": "Windows 10",
    "browserName": "chrome",
    "browserVersion": "latest",
    "sauce:options": {
        "extendedDebugging": True,
        "capturePerformance": True
    }
}

desired_caps['name'] = request.node.name
username = environ.get('SAUCE_USERNAME', None)
access_key = environ.get('SAUCE_ACCESS_KEY', None)

selenium_endpoint = "http://{}:{}@ondemand.saucelabs.com:80/wd/hub".format(username, access_key)
executor = RemoteConnection(selenium_endpoint, resolve_ip=False)
# yielded from a pytest fixture so the session can be cleaned up after the test
browser = yield webdriver.Remote(
    command_executor=executor,
    desired_capabilities=desired_caps
)
Selenium Java Example
DesiredCapabilities capabilities = new DesiredCapabilities();

// set desired capabilities to launch appropriate browser on Sauce
capabilities.setCapability(CapabilityType.BROWSER_NAME, browser);
capabilities.setCapability(CapabilityType.VERSION, version);
capabilities.setCapability(CapabilityType.PLATFORM, os);
capabilities.setCapability("name", methodName);
capabilities.setCapability("extendedDebugging", true);
capabilities.setCapability("capturePerformance", true);

// Launch remote browser and set it as the current thread
webDriver.set(new RemoteWebDriver(
        new URL("https://" + username + ":" + accesskey + "@ondemand.saucelabs.com:443/wd/hub"),
        capabilities));
JavaScript (WebDriverIO)
const { remote } = require('webdriverio')
browser = await remote({
    user: process.env.SAUCE_USERNAME,
    key: process.env.SAUCE_ACCESS_KEY,
    capabilities: {
        browserName: 'chrome',
        platformName: 'macOS 10.13',
        browserVersion: 'latest',
        'sauce:options': {
            extendedDebugging: true,
            capturePerformance: true
        }
    }
})
Watir Example
opt = {
    browser_name: 'chrome',
    name: "Performance Test",
    url: "https://ondemand.saucelabs.com:443/wd/hub",
    username: ENV['SAUCE_USERNAME'],
    accessKey: ENV['SAUCE_ACCESS_KEY'],
    extended_debugging: true,
    capture_performance: true }

@browser = Watir::Browser.new opt.delete(:browser_name), opt

Adding Assertions

The sauce:performance command acts as a connecting point between your tests and our data analytics engine. Every time you run a test on Sauce Labs with Extended Debugging enabled, we collect performance metrics and establish expected baselines from the historical data. When the command is called, the system checks whether there is a significant change in performance and returns an appropriate value.

Performance metrics can be asserted by using a custom Sauce Labs command in your WebDriver script. Make sure that performance information is being collected for your jobs.

Use Case

By using this command, you can discover performance regressions as part of a WebDriver test. Sauce Labs checks whether a new performance regression was introduced and returns a result of pass if everything is intact, or fail if there is a new performance regression. The feature works by learning from historically collected information and checking whether the latest runs exceed the established baselines.

Command

sauce:performance

Arguments

name
Required

The name of the test as it appears on the Sauce Labs UI dashboard.

metrics
Optional

Specifies one or more metrics you want to assert. If no metric is specified, performance is checked against all of them.

Metrics options include load, speedIndex, pageWeight, firstMeaningfulPaint, firstPaint, timeToFirstByte, timeToFirstInteractive, firstContentfulPaint, domContentLoaded


Sample Response

// Response from our API when Page Load Time is above the expected baseline
{
  "reason": "Metrics [load] are significantly out of the baseline",
  "result": "fail",
  "details": {
    "load": {
      "actual": 12178,
      "lowerLimit": 813.6839391929948,
      "upperLimit": 2910.759098781689
    }
  }
}
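The response fields can be unpacked however your test framework prefers. The following Python sketch shows one way to turn such a response into a readable failure message; the helper name summarize_regressions is our own illustration, not part of the Sauce Labs API:

```python
def summarize_regressions(outcome):
    """Build a human-readable summary from a sauce:performance response."""
    if outcome.get("result") == "pass":
        return "no regressions"
    lines = [outcome.get("reason", "performance regression detected")]
    for metric, values in outcome.get("details", {}).items():
        lines.append(
            "  {}: actual {} outside [{:.0f}, {:.0f}]".format(
                metric, values["actual"],
                values["lowerLimit"], values["upperLimit"]))
    return "\n".join(lines)

# Applied to the sample response above:
sample = {
    "reason": "Metrics [load] are significantly out of the baseline",
    "result": "fail",
    "details": {
        "load": {
            "actual": 12178,
            "lowerLimit": 813.6839391929948,
            "upperLimit": 2910.759098781689,
        }
    },
}
print(summarize_regressions(sample))
```

In a real test, the dictionary would come from driver.execute_script("sauce:performance", ...) rather than a hard-coded sample.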

Script Examples

Here are a few examples of how to use our custom command across Selenium tests written in various languages (Java, Python, JavaScript). Note that in Java you must place the captured metrics in a HashMap before asserting against them.

Full code is available in this repo https://github.com/saucelabs/extended-debugging-Python-examples

PyTest Example
def test_performance_page_load(self, driver, request):
    self.setUpClass(driver)
    performance = driver.execute_script("sauce:performance", {"name": request.node.name, "metrics": ["load"]})
    assert performance["result"] == "pass"

Full code is available in this repo https://github.com/saucelabs/extended-debugging-JAVA-examples

Examples are available for Cucumber, JUnit, and TestNG

Full code is available in this repo https://github.com/saucelabs/extended-debugging-JS-examples

NightWatch JS
//full code sample https://github.com/saucelabs/extended-debugging-JS-examples/blob/master/ProtactorJs/specs/performance_spec.js
'Performance Testing': (browser) => {
		browser
			.url('https://www.saucedemo.com/')
			.waitForElementVisible('body', 1000)
			.setValue('input[data-test="username"]', process.env.PERF_USERNAME || 'standard_user')
			.setValue('input[data-test="password"]', 'secret_sauce')
			.click('.login-button')
			.url('https://www.saucedemo.com/inventory.html')
			.waitForElementVisible('body', 1000)
			.getLog('sauce:network', (network) => {
				const isRequestExists = network.some(req => req.url.includes('main.js'));
				assert.strictEqual(isRequestExists, true);
			})
			.execute('sauce:performance', {
				name: browser.currentTest.name,
				metrics: ['load'],
			}, ({ value }) => {
				assert.equal(value.result, 'pass', value.reason);
			})
	},

	afterEach(client, done) {
		client.customSauceEnd();
		setTimeout(() => {
			done();
		}, 1000);
	},
};
Protractor JS
// full code sample https://github.com/saucelabs/extended-debugging-JS-examples/blob/master/ProtactorJs/specs/performance_spec.js
it('(sauce:performance) custom command should assert performance has not regressed', async () => {
		const output = await browser.executeScript('sauce:performance', {
			name: config.capabilities.name,
			metrics: ['load'],
		});
		assert.equal(output.result, 'pass', output.reason);
	});
WebdriverJS-Mocha
// full code sample https://github.com/saucelabs/extended-debugging-JS-examples/blob/master/WebdriverJs-Mocha/tests/performance.js 
it('(sauce:performance) custom command should assert pageLoad has not regressed', async () => {
    const output = await driver.executeScript('sauce:performance', {
        name: title,
        metrics: ['load'],
    });
    assert.equal(output.result, 'pass', output.reason);
});
WebDriverIO
// full code sample https://github.com/saucelabs/extended-debugging-JS-examples/blob/master/WebDriver.io/tests/performance.js
it('(sauce:performance) custom command should assert pageload has not regressed', () => {
    const output = browser.execute('sauce:performance', {
        name: title,
        metrics: ['load'],
    });
    assert.equal(output.result, 'pass');
});

Full code is available in this repo https://github.com/saucelabs/extended-debugging-Ruby-examples

Cucumber Watir Example
Then(/^I assert the sauce:performance custom command identifies pageWeight regressions/) do
  performance = @browser.execute_script("sauce:performance", {"name":@feature_name, "metrics": ["pageWeight"] })
  expect(performance['result'] == "pass").to be true
end
Cucumber Selenium
Then(/^I assert the sauce:performance custom command identifies pageWeight regressions/) do
  performance = @driver.execute_script("sauce:performance", {"name":@feature_name, "metrics": ["pageWeight"] })
  expect(performance["result"] == "pass").to be true
end


Additional examples in Ruby, Java, JavaScript, and Python are also available.

Discovering Regressions

Once a performance regression is discovered, you can either revert the change or accept the new performance values as the new baselines. If new baselines are accepted, the command will continue to return pass until another regression is introduced.

Since the command can be called throughout the test, we recommend creating tests that would check for performance regressions across core business flows and screens. For example, pages that are behind a login or require multiple steps.

sauce:performance is only aware of the performance metrics of the get URL command that was called before it; it cannot capture metrics for views reached through WebDriver actions (e.g., button clicks). In this example, the custom command returns performance metrics for the /inventory.html URL.

describe('Performance Demo Test', function () { 
    const { title } = this;

    it('Asserts for regressions on the inventory page', () => {
        // log into the app
        browser.url('https://www.saucedemo.com');
        const username = $('[data-test="username"]');
        username.setValue('standard_user');
        const password = $('[data-test="password"]');
        password.setValue('secret_sauce');
        const loginButton = $('.login-button');
        loginButton.click();
        browser.url('/inventory.html');
        // assert performance of pageLoad has not regressed.
        const performanceOutput = browser.execute('sauce:performance', {
            name: title,
            metrics: ['load']
        });
        assert.equal(performanceOutput.result, 'pass', performanceOutput.reason);
    });
});

Logging Performance Metrics

Instead of asserting performance metrics, you can also simply retrieve the data, either to log it directly in your test or to add a second assertion ensuring that metrics stay within your defined performance budget. Because the log command supported in Selenium is not part of the WebDriver specification and is therefore not supported by all frameworks, Sauce Labs provides a custom command to retrieve performance log information.

Full code is available in this repo https://github.com/saucelabs/extended-debugging-Python-examples

PyTest Example
def test_speed_index(self, driver):
    self.setUpClass(driver)
    performance = driver.execute_script("sauce:log", {"type": "sauce:performance"})
    # other available metrics include load, pageWeight, pageWeightEncoded,
    # timeToFirstByte, timeToFirstInteractive, firstContentfulPaint,
    # perceptualSpeedIndex, and domContentLoaded
    assert performance["speedIndex"] < 1000

Full code is available in this repo https://github.com/saucelabs/extended-debugging-JS-examples

NightWatch JS
browser.getLog('sauce:performance', (performance) => {
	assert.ok(performance.speedIndex < 1000);
});
Protractor JS
it('(sauce:performance) should check speedIndex', async () => {
	const performance = await browser.executeScript('sauce:log', { type: 'sauce:performance' });
	assert.ok(performance.speedIndex < 1000);
});
WebdriverJS-Mocha
it('(sauce:performance) should check speedIndex', async () => {
	const performance = await driver.executeScript('sauce:log', { type: 'sauce:performance' });
	assert.ok(performance.speedIndex < 1000);
});
WebDriverIO
it('logs (sauce:performance) should check speedIndex', () => {
	const performance = browser.getLogs('sauce:performance');
	assert.ok(performance.speedIndex < 1000);
});

Full code is available in this repo https://github.com/saucelabs/extended-debugging-Ruby-examples

Cucumber Watir Example
Then(/^I check for sauce:performanceLogs/) do
	performance = @browser.execute_script("sauce:log", {"type": "sauce:performance"})
	expect(performance["speedIndex"] < 1000).to be true
end
Cucumber Selenium
Then(/^I check for sauce:performanceLogs/) do
	performance = @driver.execute_script("sauce:log", {"type": "sauce:performance"})
	expect(performance["speedIndex"] < 1000).to be true
end


Sample Response

{ 
   "estimatedInputLatency": 16,
   "timeToFirstByte": 687,
   "domContentLoaded": 7,
   "firstVisualChange": 214,
   "firstPaint": 186,
   "firstContentfulPaint": 186,
   "firstMeaningfulPaint": 326,
   "lastVisualChange": 13042,
   "firstCPUIdle": 2388,
   "firstInteractive": 2388,
   "load": 3271,
   "speedIndex": 972
}

Assert Metrics Using a Performance Budget

A performance budget is a set of limits imposed on metrics that affect site performance. You can define these limits for various pages in a separate file. For example:

budget.json
{
  "https://saucelabs.com/": {
    "speedIndex": 2300,
    "lastVisualChange": 2200,
    "load": 4200
  },
  "https://saucelabs.com/platform/analytics-performance/sauce-performance": {
    "score": 0.78
  }
}

You'll import this file in your test script and dynamically assert against the values:

Assert Performance Budget (WebdriverIO Example)
const budgets = require('./budget.json')

for (const [url, budget] of Object.entries(budgets)) {
    await browser.url(url)
    const performanceLogs = await browser.execute(
        'sauce:log',
        { type: 'sauce:performance' })

    for (const [metric, value] of Object.entries(budget)) {
        assert.ok(performanceLogs[metric] < value,
            `metric ${metric} is over the performance budget`)
    }
}
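The same budget check can be sketched in Python for PyTest-style scripts. This is an illustration only: over_budget is a hypothetical helper, and the values below are hard-coded samples standing in for a live sauce:log call:

```python
import json

def over_budget(logs, budget):
    """Return descriptions of metrics whose measured value exceeds the budget."""
    return [
        "{}={} (budget {})".format(metric, logs.get(metric), limit)
        for metric, limit in budget.items()
        if logs.get(metric, 0) >= limit
    ]

# In a real test, `logs` would come from:
#   driver.execute_script("sauce:log", {"type": "sauce:performance"})
logs = {"speedIndex": 972, "lastVisualChange": 13042, "load": 3271}
budget = json.loads("""{
    "speedIndex": 2300,
    "lastVisualChange": 2200,
    "load": 4200
}""")
for failure in over_budget(logs, budget):
    print("over budget:", failure)
```

Loading the budget with json.loads mirrors reading a budget.json file from disk, so the same function works for any number of pages.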

This way you can dynamically automate your performance tests to cover performance quality across many pages.

The Performance Report

When your test completes, it generates a set of metrics that you can access through the Performance tab of the Test Details page. If you detect a regression in your website's performance, you can download a Full Trace report, or you can use Chrome DevTools, which records JavaScript method signatures in a hierarchical view for each thread in each process, to get a better idea of how the Chrome browser is interacting with your website.

The full Performance Report provides you access to all performance metrics over time:

The Lighthouse report contains useful information and recommendations that can help make your website faster.

Performance Metrics

Please see Page Load Metrics Reference for more details.
