Reading data from a file without repetition

Hi, I have a file with test data that looks something like this:

    [
     { "username": "tes1t", "hash": "1cwgh47589546687df3cd0af81aab8a2852316206ccb7f4" },
     { "username": "test2", "hash": "2cwgh47589546687df3cd0af81aab8a2852316206ccb7f5" },
     { "username": "test3", "hash": "3cwgh47589546687df3cd0af81aab8a2852316206ccb7f6" },
     { "username": "test4", "hash": "4cwgh47589546687df3cd0af81aab8a2852316206ccb7f7" },
         .....
    ]

There are 50,000 such lines in my file.

In my case, each “hash” value can be used only once, and by only one of the VUs.

How can I make each VU take only unused lines from the file?

I tried it like this, but it doesn’t work:

    import http from 'k6/http';
    import { check } from 'k6';

    var splits = 1;
    //const data = JSON.parse(open("./dataA.json"));

    if (__VU == 0) {
      open('./dataA.json');
    } else {
      var data = (function () {
        var all_data = JSON.parse(open('./dataA.json'));
        var part_size = all_data.length / splits;
        var index = part_size * (__VU % splits);
        return all_data.slice(index, index + part_size);
      })();
    }

    export default function () {
      //let user = data[__VU - 1];
      var url = 'http://127.0.0.1:4554/issue';
      var payload = JSON.stringify({
        userId: data[__VU].userId,
        districtId: 52,
        version: '1123',
      });
      var params = {
        headers: {
          'Content-Type': 'application/json',
        },
      };
      let res = http.post(url, payload, params);
      check(res, {
        'is status 200': (r) => r.status === 200,
      });
    }

Hi @allnull, welcome to the forum

This is explained in the post “When parameterizing data, how do I not use the same data more than once in a test?”. There is also some more discussion of a more complicated scenario in “Unique test data per VU without reserving data upfront”, which currently isn’t as well supported.
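
For illustration, here is roughly what that per-VU partitioning looks like against your example file and endpoint. This is only a minimal sketch: the VU count, the scenario name, and the username/hash payload fields are assumptions you would adapt to your real test. Each VU keeps its own contiguous slice of the data and steps through it with `__ITER`, so no hash is ever used twice or by two VUs:

    import http from 'k6/http';
    import { check } from 'k6';

    // Hypothetical VU count for this sketch; use whatever your test needs.
    const VUS = 10;

    // open() only works in the init context, so the whole file is parsed here.
    // Each VU then keeps only its own contiguous slice. (__VU is the 1-based
    // VU number; it is 0 during the initial init pass, where the resulting
    // empty slice is never used.)
    const allData = JSON.parse(open('./dataA.json'));
    const partSize = Math.floor(allData.length / VUS);
    const myData = allData.slice((__VU - 1) * partSize, __VU * partSize);

    export const options = {
      scenarios: {
        issue: {
          executor: 'per-vu-iterations',
          vus: VUS,
          iterations: partSize, // each VU walks through its slice exactly once
        },
      },
    };

    export default function () {
      // __ITER counts this VU's iterations from 0, so every record in the
      // slice is used exactly once and never by another VU.
      const user = myData[__ITER];

      const payload = JSON.stringify({
        // The sample file only has username/hash, so those are sent here
        // instead of the userId field from your original payload.
        username: user.username,
        hash: user.hash,
        districtId: 52,
        version: '1123',
      });
      const params = { headers: { 'Content-Type': 'application/json' } };

      const res = http.post('http://127.0.0.1:4554/issue', payload, params);
      check(res, { 'is status 200': (r) => r.status === 200 });
    }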

You can also take a look at “handling bigger data files”, which can be used to get the same effect (if you cut the file into VUMAX pieces, one per VU), but which also has the benefit of potentially using less memory, at least until https://github.com/loadimpact/k6/pull/1739 is merged.
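
And a very rough sketch of that file-splitting idea, reusing the same `if (__VU == 0)` trick from your script so every piece is known during the first init pass. The `data_1.json` … `data_10.json` names, the VU count, and the per-file record count are made up for illustration; you would need to split `dataA.json` into those pieces yourself beforehand:

    import http from 'k6/http';
    import { check } from 'k6';

    const VUS = 10; // hypothetical VU count; must match the number of split files

    // Open every piece during the very first init pass (__VU == 0) so k6 knows
    // about all of them; afterwards each VU parses only its own piece, which
    // keeps per-VU memory usage down.
    if (__VU == 0) {
      for (let i = 1; i <= VUS; i++) {
        open(`./data_${i}.json`);
      }
    }
    const myData = __VU > 0 ? JSON.parse(open(`./data_${__VU}.json`)) : [];

    export const options = {
      scenarios: {
        issue: {
          executor: 'per-vu-iterations',
          vus: VUS,
          iterations: 5000, // number of records in each split file
        },
      },
    };

    // The default function is the same idea as in the previous sketch:
    // take myData[__ITER] and POST it, so each record is used exactly once.
    export default function () {
      const user = myData[__ITER];
      const res = http.post(
        'http://127.0.0.1:4554/issue',
        JSON.stringify({ username: user.username, hash: user.hash, districtId: 52, version: '1123' }),
        { headers: { 'Content-Type': 'application/json' } }
      );
      check(res, { 'is status 200': (r) => r.status === 200 });
    }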