How can I get microsecond timestamps in CSV output?

When I use --out csv=results.csv the timestamp has very low resolution (1 s), so I can't plot time series.
I need to convert the data into arrays, but some requests have far fewer entries in some scenarios, so they are hard to compare.

metric_name,timestamp,metric_value,check,error,error_code,expected_response,group,method,name,proto,scenario,service,status,subproto,tls_version,url,extra_tags,metadata
http_reqs,1672766933,1.000000,,,,true,,POST,https://XXXXXXXXX/auth/realms/buypass/protocol/openid-connect/token,HTTP/1.1,default,,200,,tls1.2,https://XXXXXXXXX/auth/realms/buypass/protocol/openid-connect/token,,
:
:
:

The first 1000+ entries have the same timestamp.
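For context, a minimal sketch of loading that CSV with the standard library (assumes the results.csv layout from the header above; with 1 s resolution, thousands of rows collapse onto the same datetime):

```python
import csv
import datetime

# Read the k6 CSV output and convert the epoch-second timestamps to datetimes.
# With 1 s resolution, thousands of http_reqs rows map to the same datetime.
times, values = [], []
with open('results.csv', newline='') as f:
    for row in csv.DictReader(f):
        if row['metric_name'] != 'http_reqs':
            continue
        times.append(datetime.datetime.fromtimestamp(int(row['timestamp'])))
        values.append(float(row['metric_value']))

print(len(times), 'rows,', len(set(times)), 'distinct timestamps')
```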

>>> datetime.datetime.fromtimestamp(1672766933).strftime('%Y%m%d %H:%M:%S.%f')
'20230103 18:28:53.000000'

I use matplotlib and it has no problems whatsoever with microsecond timestamps:

>>> datetime.datetime.fromtimestamp(1672766933.123456).strftime('%Y%m%d %H:%M:%S.%f')
'20230103 18:28:53.123456'
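For illustration, a small plotting sketch with synthetic microsecond-spaced timestamps (the data here is made up, not k6 output):

```python
import datetime
import matplotlib.pyplot as plt

# Synthetic data: 100 points spaced 100 microseconds apart.
base = datetime.datetime.fromtimestamp(1672766933.123456)
xs = [base + datetime.timedelta(microseconds=100 * i) for i in range(100)]
ys = list(range(100))

# Matplotlib handles microsecond-resolution datetimes on the x-axis out of the box.
plt.plot(xs, ys)
plt.gcf().autofmt_xdate()
plt.show()
```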

I can use --out json=results.json, which does have microsecond timestamps:

:
{"metric":"http_req_blocked","type":"Point","data":{"time":"2023-01-05T09:45:41.088631+01:00",....
:

It is not good that the timestamp resolution differs between output formats. I like CSV because I can easily manipulate and minimize gigabytes of results; the JSON output is far more bloated.

Hi @mortenb123

Welcome to the community forums :wave:

Many thanks for bringing this to our attention. I can see that someone already opened an issue for this yesterday, and there are some comments on it: use same timestamp in csv results as in json · Issue #2839 · grafana/k6 · GitHub. Not sure if it was you or someone from your team. Thanks!

Cheers


Yes, it was me. Hopefully it is a very easy fix, or at least an option; 1-second resolution is far too coarse when you have 10K responses.
