xk6-browser for pages behind authentication

I’m evaluating xk6-browser to collect web performance metrics such as FCP and page load time, and to check the application for performance regressions introduced by new code.

I will use facebook.com to explain my scenario.

I want to get web performance metrics like browser_dom_content_loaded, browser_loaded, etc. for

(1) facebook.com/
(2) facebook.com/marketplace

Before that, I need to go to the Facebook login page and log into the application.

My code is as below:

import launcher from "k6/x/browser";

function login(page){
    page.goto('https://app/login');
    //login to application
    page.$('input[name="username"]').type('sashika.wijesinghe');
    page.$('input[name="password"]').type('welcome');
    page.$('a[id="loginSubmit"]').click();

    // Wait for next page to load
    page.waitForLoadState('networkidle');
    page.waitForNavigation();
    // Wait for a selector in the very first page after login to make sure it loaded
    page.waitForSelector('div[class="newsfeed-item-container"]');
}


export default function() {
    const browser = launcher.launch('chromium', { headless: false });
    const context = browser.newContext();
    const page = context.newPage();
    // This is for my authentication
    login(page);
    // This is the actual page that I want to get the metrics for
    page.goto('https://app/marketplace');
    page.waitForNavigation();
    // this is to wait until the required page loads
    page.waitForSelector('ul[data-testid="carousel-container"]');
    page.close();
    browser.close();
}

My questions are:

(1) When I run the code I get some metrics, but I’m not sure how those metrics are collected.
(2) Do they include only the web performance metrics for the page that I actually want to verify (which is /marketplace), or are they averaged over all the pages I go through in the process (like /login, /mainpage, /marketplace)?

Hi @sashi1, welcome to the forum 🙂

  1. Metrics are emitted by xk6-browser, and collected and processed by any k6 output you have enabled, so they work the same way as with plain k6. Let me know if you have a specific question about this.

  2. The end-of-test summary shows an aggregated view of all metric data. But you can see individual metric samples by enabling a k6 output.

    For example, if you enable the JSON output with xk6-browser run --out json=result.json script.js, you’ll see all raw metric data in the created result.json file. There you can see DOM-related metrics for any loaded pages, and HTTP-related metrics for any URLs loaded by the page, such as static assets. There will be no averages here, just raw metric samples for each URL.

    So to answer your question, you’ll see metric data separately for all pages loaded in your script (/login, /marketplace, etc.).
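To see how the raw samples separate by page, you could group them yourself with a few lines of plain JavaScript. This is just an illustration of the JSON output structure; the two sample lines below are stand-ins for result.json content, not real measurements:

```javascript
// Group raw browser_dom_content_loaded samples from the JSON output by URL.
// Illustrative sample lines; a real result.json has one JSON object per line.
const lines = [
  '{"type":"Point","data":{"time":"2022-06-06T14:17:00Z","value":0.259,"tags":{"url":"https://app/login"}},"metric":"browser_dom_content_loaded"}',
  '{"type":"Point","data":{"time":"2022-06-06T14:17:18Z","value":981.501,"tags":{"url":"https://app/marketplace"}},"metric":"browser_dom_content_loaded"}',
];

const byUrl = {};
for (const line of lines) {
  const sample = JSON.parse(line);
  // Keep only raw data points for the metric we care about.
  if (sample.type !== 'Point' || sample.metric !== 'browser_dom_content_loaded') continue;
  const url = sample.data.tags.url;
  (byUrl[url] = byUrl[url] || []).push(sample.data.value);
}

// Each URL keeps its own list of samples; nothing is averaged across pages.
console.log(byUrl);
```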

Hope this helps,

Ivan

Hi Ivan,

Thank you for your answer. I got the JSON output and analyzed how the summary value is calculated. I noticed from the log that there are some records without a proper URL:

{"type":"Point","data":{"time":"2022-06-06T14:17:00.969412-04:00","value":0.259,"tags":{"scenario":"default","url":"","group":""}},"metric":"browser_dom_content_loaded"}
{"type":"Point","data":{"time":"2022-06-06T14:17:01.178794-04:00","value":0.143,"tags":{"scenario":"default","url":"","group":""}},"metric":"browser_dom_content_loaded"}
{"type":"Point","data":{"time":"2022-06-06T14:17:18.066766-04:00","value":981.501,"tags":{"group":"","scenario":"default","url":"https://xxx/catalog"}},"metric":"browser_dom_content_loaded"}

In my scenario, there were 3 records with the URLs I mentioned, plus some other records where the URL is empty, as above. So the summary is calculated over all the instances (3 URL instances + 4 empty instances, divided by 7). The summary therefore doesn’t actually reflect what I’m looking for.
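The skew is easy to see with the three sample values from the log lines above (plain JavaScript, just arithmetic over the values quoted from the log):

```javascript
// Two near-zero samples with an empty URL plus the /catalog sample:
// the empty-URL records drag the overall mean far below the page's own value.
const samples = [0.259, 0.143, 981.501];
const avg = samples.reduce((a, b) => a + b, 0) / samples.length;
console.log(avg.toFixed(3)); // "327.301" — nowhere near the 981.501 for /catalog
```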

Is it possible to set thresholds (Thresholds) on the results in the result.json file?

For example: I want to check whether browser_dom_content_loaded for 'https://xxx/catalog' exceeds some given value.

Hi @sashi1

If you create a threshold like this:

export const options = {
  thresholds: {
    'browser_dom_content_loaded{url:https://xxx/catalog}': ['p(95)<500'],
  },
};

You will see the metrics for browser_dom_content_loaded for the URL specified in the end-of-test summary. Here’s some example output:

browser_dom_content_loaded...............: avg=745.4ms  min=57µs     med=652.46ms max=1.58s p(90)=1.39s   p(95)=1.49s
     ✗ { url:http://ecommerce.test.k6.io/ }...: avg=1.58s    min=1.58s    med=1.58s    max=1.58s p(90)=1.58s   p(95)=1.58s
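Conceptually, a tagged threshold like this filters samples by their tags first and aggregates afterwards. Here is a rough filter-then-aggregate sketch in plain JavaScript (not k6’s actual implementation; the sample values and URLs are made up, and the percentile uses a simplified nearest-rank method):

```javascript
// Each sample carries a value and tags, as in the JSON output.
const samples = [
  { value: 120, tags: { url: 'https://xxx/login' } },
  { value: 480, tags: { url: 'https://xxx/catalog' } },
  { value: 510, tags: { url: 'https://xxx/catalog' } },
];

function p95(values) {
  const sorted = [...values].sort((a, b) => a - b);
  // Nearest-rank percentile, a simplification of k6's interpolation.
  const idx = Math.ceil(0.95 * sorted.length) - 1;
  return sorted[idx];
}

// Only samples whose url tag matches contribute to the checked aggregate.
const catalog = samples
  .filter((s) => s.tags.url === 'https://xxx/catalog')
  .map((s) => s.value);

const passed = p95(catalog) < 500; // mirrors 'p(95)<500'
console.log(passed); // false: p95 of [480, 510] is 510
```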

I noticed from the log that there are some records without a proper URL

That’s certainly unusual and shouldn’t happen. I was able to reproduce this with a more complex script we have, so I created issue #381 to track this. Feel free to subscribe to it for updates.

In any case, the metric values and the summary calculation should be correct. This seems to be a bug in resolving the right frame URL, so you’ll have difficulty correlating samples to pages, but the aggregated metric values themselves should be fine.

If you can share a runnable script that reproduces the issue, we can take a closer look.

And like Tom says, thresholds should work in the same way as they do in k6.


Thank you Tom. This solved my issue.