Run one VU at a time, but with a varying ramp-up pattern within the duration

I’m playing around with a scenario where I need to post bids to an auction. The endpoint will be called by multiple users.
E.g. POST request body:
{
  "amount":
}

Now I want to mimic this bidding scenario for 10 users, but the amount in the POST request gets outdated during execution because someone has already bid a higher value, and all bids with lower values fail.
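The race described above can be modeled in plain JavaScript (no k6 APIs; the `Auction` class is purely illustrative): the server accepts a bid only if it is strictly higher than the current highest, so two VUs that both read the same starting price will have the second bid rejected.

```javascript
// Plain-JS model of the auction behavior described above (illustrative only):
// a bid succeeds only if it is strictly higher than the current highest bid.
class Auction {
  constructor() {
    this.highest = 0;
  }
  bid(amount) {
    if (amount <= this.highest) {
      return false; // bid was based on an outdated price -> rejected
    }
    this.highest = amount;
    return true;
  }
}

const auction = new Auction();
// Two VUs that both saw highest === 0 and both decided to bid 100:
console.log(auction.bid(100)); // first bid wins
console.log(auction.bid(100)); // the identical second bid is now outdated and fails
```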
Since I can’t control the sequence in which VUs are invoked, how can I make sure this pattern is achieved?
I can’t think of which pattern/executor to use here. My guess is that I need only 1 concurrent user, but I want to run through, say, 100-200 VUs within 1 minute while still executing one at a time.
Any ideas?
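If strict one-at-a-time execution is the hard requirement, one option (a sketch, assuming a recent k6 with the scenarios API; the scenario name and numbers are made up) is a single-VU scenario that runs the desired number of iterations back to back, so no two bids are ever in flight at the same time:

```javascript
// k6 options sketch: 200 iterations executed sequentially by a single VU.
export const options = {
  scenarios: {
    sequential_bidders: {
      executor: 'shared-iterations',
      vus: 1,          // only one concurrent "user" at any moment
      iterations: 200, // total bids to place
      maxDuration: '1m',
    },
  },
};
```

Each iteration can still impersonate a different user (e.g. by picking credentials based on `__ITER`), so "100-200 users within 1 minute" becomes "100-200 sequential iterations by one VU".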

Hmm, how do you handle this in your service normally? I assume actual users of the service have a way to stay apprised of the latest bid price — can you do the same in k6? I don’t know your system, but treating VUs as actual users who can’t synchronize state between them will likely also make for a more realistic load test.
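For example (a sketch only — the endpoints, JSON shape, and bid increment are all made up), each iteration could first ask the service for the current highest bid and then bid above it, just like a real user:

```javascript
import http from 'k6/http';

// Hypothetical base URL and endpoints:
const BASE = 'https://auction.example.com';

export default function () {
  // 1. Read the current highest bid from the service.
  const res = http.get(`${BASE}/auction/current`);
  const highest = res.json('amount'); // assumes a {"amount": ...} response body

  // 2. Bid some increment above it, like a real user would.
  http.post(
    `${BASE}/auction/bid`,
    JSON.stringify({ amount: highest + 10 }),
    { headers: { 'Content-Type': 'application/json' } },
  );
}
```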

That said, if you really can’t get the price and need to synchronize state between VUs, you currently can’t do that natively in k6, sorry. You could use xk6 and an extension like mstoykov/xk6-counter on GitHub (or write your own similar one) to share some state between VUs on the same instance.

For now we don’t have plans to build something like xk6-counter into k6, since it won’t be easy to make it work in cloud/distributed k6 execution, where there are multiple independent k6 instances. A possible workaround in the future would be asynchronous HTTP requests, once we have event loops (grafana/k6 issue #882: "Global JS event loops in every VU"). Then a single VU could make multiple concurrent requests in a way that’s much more flexible than http.batch().
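In later k6 releases this did become possible via `http.asyncRequest` (added in k6 v0.44). A sketch of several concurrent, slightly staggered POSTs from a single VU — assuming a k6 version with `http.asyncRequest` and globally available `setTimeout`, and with the endpoint and amounts made up:

```javascript
import http from 'k6/http';

// Hypothetical endpoint:
const URL = 'https://auction.example.com/auction/bid';
const params = { headers: { 'Content-Type': 'application/json' } };

// Resolve after `ms` milliseconds without blocking the VU's event loop.
function delay(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

export default async function () {
  // Start three POSTs a few milliseconds apart; all of them are in
  // flight concurrently within this single VU iteration.
  const bids = [100, 110, 120].map(async (amount, i) => {
    await delay(i * 5); // stagger: 0 ms, 5 ms, 10 ms
    return http.asyncRequest('POST', URL, JSON.stringify({ amount }), params);
  });
  const responses = await Promise.all(bids);
  // responses[n].status etc. can be checked here as usual.
}
```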

Hello, are asynchronous HTTP requests already possible in k6? If so, do you maybe have an example of how to send multiple concurrent http.post() requests from a single VU? I would like to be able to send one request a few milliseconds after another.