I’m considering using k6 to test the performance of a WebSocket app. The scenario is as follows:
The client sends a sequence of messages, say A, B, C, in a loop. After each message, it waits for the server's response before sending the next one.
I’d like a report of how long each interaction - A, B and C - takes, ideally with the basic stats (average, median, percentiles) already calculated.
By default, k6 only reports overall WS statistics. So it seems to me I have to manually keep track of which message was sent and how much time has passed since it was sent.
Am I correct in my assumption? Is there a better way to measure per-message performance without manual timers?
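For context, the manual approach I have in mind pairs one custom Trend metric with each interaction and timestamps every send. This is only a sketch: the URL is a placeholder, and it assumes the server sends exactly one response per message, so a response can be matched to the last message sent (adjust the matching to your actual protocol):

```javascript
import ws from 'k6/ws';
import { Trend } from 'k6/metrics';

// One Trend per interaction; the second argument marks it as a time metric,
// so k6 reports avg/min/med/max/p(90)/p(95) for it like built-in durations.
const durations = {
  A: new Trend('interaction_a_duration', true),
  B: new Trend('interaction_b_duration', true),
  C: new Trend('interaction_c_duration', true),
};

const sequence = ['A', 'B', 'C'];

export default function () {
  // Placeholder URL - substitute the app under test.
  ws.connect('wss://echo.example.com', null, function (socket) {
    let i = 0;       // index of the message currently in flight
    let sentAt = 0;  // timestamp of the last send

    function sendNext() {
      sentAt = Date.now();
      socket.send(sequence[i]);
    }

    socket.on('open', sendNext);

    socket.on('message', function () {
      // Assumes this response answers the last message sent.
      durations[sequence[i]].add(Date.now() - sentAt);
      i = (i + 1) % sequence.length;
      sendNext();
    });

    // End the iteration after 10 s.
    socket.setTimeout(function () {
      socket.close();
    }, 10000);
  });
}
```

With this, the end-of-test summary would show separate `interaction_a_duration`, `interaction_b_duration` and `interaction_c_duration` lines alongside the built-in WS metrics - which works, but it's exactly the hand-rolled bookkeeping I was hoping to avoid.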