This answer is deprecated. Saved for posterity.
@Alexander I spoke to one of my colleagues, who has a stronger math background than I do. He came up with another solution that might be less confusing when dealing with an even VU distribution:
```javascript
let VUsTotal = 1000; // the script's total number of VUs
let VUsPerInstance = 250; // VUs per instance in the cloud execution (assumes even distribution)
let InstancesTotalUpperEstimate = Math.ceil(VUsTotal / VUsPerInstance);
// __ENV values are strings, so parse the instance ID to avoid string concatenation
let uniqNum = (__ITER * VUsTotal + (__VU - 1)) * InstancesTotalUpperEstimate + parseInt(__ENV["LI_INSTANCE_ID"], 10);
```
Note that `VUsPerInstance` requires some thinking on your part; the number above is specific to this example. With 1000 total VUs and a maximum of 300 VUs per instance, 1000 / 300 = 3.33 instances are required. Since we can't run 0.33 of an instance, we round up to 4 instances, and 1000 VUs spread evenly across 4 instances is 250 VUs per instance. This also assumes an even distribution! If the distribution is uneven, it gets a bit more complex.
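To see why the formula yields collision-free numbers, here is a quick sanity check as plain Node.js (not a k6 script). It simulates the formula over a small hypothetical configuration, assuming 0-based instance IDs and 1-based VU numbers, and confirms that every combination of instance, VU, and iteration produces a distinct value:

```javascript
// Hypothetical simulation of the uniqNum formula, outside k6.
// Assumed small config: 8 total VUs, 2 VUs per instance, 3 iterations per VU.
const VUsTotal = 8;
const VUsPerInstance = 2;
const instances = Math.ceil(VUsTotal / VUsPerInstance); // upper estimate: 4

const seen = new Set();
for (let instanceId = 0; instanceId < instances; instanceId++) {
  for (let vu = 1; vu <= VUsTotal; vu++) {   // __VU is 1-based
    for (let iter = 0; iter < 3; iter++) {   // __ITER is 0-based
      const uniqNum = (iter * VUsTotal + (vu - 1)) * instances + instanceId;
      if (seen.has(uniqNum)) throw new Error(`collision at ${uniqNum}`);
      seen.add(uniqNum);
    }
  }
}
console.log(`generated ${seen.size} unique numbers`); // 4 * 8 * 3 = 96
```

The key property is that `__ITER * VUsTotal + (__VU - 1)` is already unique per (iteration, VU) pair, and multiplying by the instance count before adding an instance ID smaller than that count keeps the instances from ever overlapping.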
As you can see, there are multiple ways to go about this. I hope this clears things up a bit, though!