K6 json output not correct json format

Hi,
Sorry for the long post :pray:. I’m trying to figure out how to send all points and metrics as event strings (one event string for each point and metric) to Splunk. I was thinking about doing the following:

  1. k6 run MyLoadtest.js --out json='LoadTestResult.json'
  2. In a JavaScript to be executed one time by k6 (to make use of the k6 built in http-functionality to post event-string to Splunk) doing the following:
    let resFileJson = JSON.parse(open('LoadTestResult.json'));
  3. In loops, traverse the resFileJson object point by point and metric by metric, and for each one create and post a “Splunk event string” to the Splunk collector: a string containing only the point and metric data that I need for further processing in Splunk.
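For step 3, the event string I have in mind would be built from each point roughly like this (a sketch only; the exact envelope fields for the Splunk HTTP Event Collector are my guess, not tested against Splunk):

```javascript
// Rough sketch of step 3 (field names are my assumption): turn one k6
// "Point" object into a Splunk HTTP Event Collector event string,
// keeping only the data I need for further processing.
function buildSplunkEvent(point) {
  return JSON.stringify({
    time: Date.parse(point.data.time) / 1000, // HEC expects epoch seconds
    event: {
      metric: point.metric,
      value: point.data.value,
      tags: point.data.tags,
    },
  });
}
```

The resulting string would then be posted to the collector with k6’s built-in http.post().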

But to my dismay, the “json” output from k6 is not proper JSON (the parser gives an error) but something else, even though the k6 command says json. So now I don’t know how to get the points and metrics to Splunk in a way that doesn’t require me to go all “code crazy” (I’m only a hobby coder) writing super advanced plugins for k6, etc.

  1. First, I wonder if you know whether a fix is coming soon for the incorrect JSON formatting? I read in the Git comments that this isn’t considered a bug (even though the command output says json), but that a flag might be implemented to switch the output to proper JSON. Unfortunately I beg to differ on it not being a bug, and I hope it will be possible to use that switch sometime soon.

  2. If the bug won’t be fixed anytime soon, I’d be very grateful for any alternative solution or suggestions on how to get the k6 data to Splunk in a somewhat convenient way.

I tried the Influx-Grafana way, but it doesn’t give me enough design possibilities for dashboards (aggregating at the transaction/group level, choosing a specific execution without having to set a time period, comparing two executions side by side, etc.). Plus we don’t want to introduce yet another technology in the company for this purpose (AppDynamics and Splunk should be enough, hehe).

Thank you very much for any feedback and help with this.

Have a great day

/Fredrik

Hi Fredrik,

the JSON output file is indeed JSON, but of the JSON Lines variety, meaning each line is a valid JSON object on its own.

It’s implemented this way to enable reading and writing the data as a stream: k6 outputs a lot of metric data, and it’s more efficient for readers to process the file one line at a time instead of loading the entire file into memory for parsing.
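To illustrate (the values below are invented), a couple of lines from such a file look like this, and each one parses independently of the rest:

```javascript
// Two lines as the k6 JSON output might emit them (illustrative values only):
// a "Metric" line declares a metric, a "Point" line carries one sample of it.
const lines = [
  '{"type":"Metric","metric":"http_req_duration","data":{"type":"trend","contains":"time"}}',
  '{"type":"Point","metric":"http_req_duration","data":{"time":"2021-01-01T00:00:00Z","value":123.4,"tags":{"status":"200"}}}',
];

// Because every line is a complete JSON document, a reader can handle
// one line at a time without ever parsing the whole file.
const parsed = lines.map((line) => JSON.parse(line));
```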

Because of this you won’t be able to do JSON.parse(open('file.json')) directly, but you can still read the file with k6 using split() and map(), like so:

// slice removes the blank line at the end
const json = open('test.json').split('\n').slice(0, -1).map(JSON.parse);

export default function () {
  for (const el of json) {
    if (el.type === 'Metric') {
      // do something with metric
    }
  }
}

This is not the most efficient of approaches, but if your file is relatively small you might be able to make it work like this.

Though I do wonder whether k6 is the right tool for the job of reading the file and uploading to Splunk, since you’re not really load testing at that point. You might find it easier to use an external tool or script for this, e.g. a plain Node.js script, where you could use libraries like node-jsonlines (which may or may not work directly with k6) or other helpers.

If you do want to try out an advanced approach, this could be done with an output extension, in which case you could push metrics to Splunk in real time without having to rely on an intermediate JSON file.

Good luck!

Hi,
Thank you very much for the great input, and I agree with you that k6 isn’t a suitable tool for reading the large JSON output files that a longer performance test produces. I will rethink my strategy and look for a more suitable solution along the lines you mentioned.

Have a great day!