Hi community,
I have a small question about `http.batch`.
In my test suites, I have to prepare unique data in the DB (setup) before the actual run, and at the end delete this data (teardown) from the DB.
I'm using `http.batch` in both setup and teardown, because I found it very quick and efficient (a `for` loop was taking too much time).
But now I'm concerned about its efficiency: when I run with 1000/10000/… unique data items, will `http.batch` "kill" the DB API?
I tested 100 requests with `http.batch` and saw (in the back-end API) that it doesn't send all 100 simultaneously - which is good!
- Am I doing this right, using `http.batch` for setup/teardown?
- Does `http.batch` have some kind of threshold for simultaneous requests (if I put a million requests in it, will it send a million at once)?
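In case there is no built-in cap, one workaround I'm considering is to split the requests into fixed-size chunks and call `http.batch` once per chunk, so at most N requests are in flight per call (the chunk size of 50 is just a guess, not a recommendation). The chunking itself is plain JavaScript:

```javascript
// Split an array into fixed-size chunks; the last chunk may be shorter.
function chunk(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// In the k6 setup() this would become:
//   for (const part of chunk(allRequests, 50)) {
//     http.batch(part); // at most 50 requests per batch call
//   }
console.log(chunk([1, 2, 3, 4, 5], 2)); // [ [ 1, 2 ], [ 3, 4 ], [ 5 ] ]
```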