S3ServiceError using S3Client

Following the basic example in the docs, calling S3Client.putObject(bucketName, objectKey, data) produces the following error:

S3ServiceError: There were headers present in the request which were not signed

I’m not sure what I’m doing wrong here. I just replaced the bucket name with one that I own and added my keys in the ENV.
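For reference, the failing call came from a script along these lines — a minimal sketch, where the bucket name, object key, and the positional AWSConfig constructor follow the docs example of that release, and the environment-variable names are placeholders:

```javascript
import { AWSConfig, S3Client } from 'https://jslib.k6.io/aws/0.7.0/s3.js';

// Credentials are read from the environment, as in the docs example.
const awsConfig = new AWSConfig(
  __ENV.AWS_REGION,
  __ENV.AWS_ACCESS_KEY_ID,
  __ENV.AWS_SECRET_ACCESS_KEY
);

const s3 = new S3Client(awsConfig);

export default function () {
  // With version 0.7.0 this throws:
  // S3ServiceError: There were headers present in the request which were not signed
  s3.putObject('my-bucket', 'my-key.txt', 'hello from k6');
}
```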

Hi @shane.ontraccr

Welcome to the community forums :wave:

Thanks for reporting this issue. From my investigation so far, it seems to be a regression introduced in version 0.7.0. While we keep digging, the example should work fine with version 0.6.0. I changed the import and the example worked for me:

import { AWSConfig, S3Client } from 'https://jslib.k6.io/aws/0.6.0/s3.js';

If you are looking for more examples, you can also visit the repository: k6-jslib-aws/s3.js at main · grafana/k6-jslib-aws · GitHub

I’ll get back here once we figure out what changed. Again, many thanks for spotting this.


Thanks! It seems to have worked. Is there a way to customize the headers (such as Content-Type or Content-Disposition)?

Hi @shane.ontraccr

Thanks for your patience. I opened the issue Error some headers are not signed with version 0.7.0 · Issue #28 · grafana/k6-jslib-aws · GitHub and proposed a PR to fix it. I’ll discuss it with the maintainers, as the PR is just my proposal. You can follow the issue to see how it’s finally resolved.

Looking at the code, I don’t think it’s currently possible to add headers such as Content-Type or Content-Disposition: the client always passes empty headers and only generates the authentication/signing headers itself. It looks like a good feature to add, so I went ahead and opened: Support for headers like Content-Disposition or Content-Type with putObject · Issue #30 · grafana/k6-jslib-aws · GitHub.
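For illustration, one shape such a feature could take is an optional parameters argument on putObject. This is purely hypothetical — neither the signature nor the parameter names exist in the library today:

```javascript
// Hypothetical API sketch, NOT available in 0.7.x:
// an extra options object carrying the desired headers.
s3.putObject('my-bucket', 'report.pdf', pdfBytes, {
  contentType: 'application/pdf',
  contentDisposition: 'attachment; filename="report.pdf"',
});
```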

Many thanks for your contributions :+1:


Hi @shane.ontraccr

We published the fix in version 0.7.1. The following import should work now:

import { AWSConfig, S3Client } from 'https://jslib.k6.io/aws/0.7.1/s3.js';
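A quick way to verify the fix is a put/get round trip. This is a sketch with placeholder bucket and key names, assuming the positional AWSConfig constructor and that getObject returns the object body in a data field — adjust to your own setup:

```javascript
import { check } from 'k6';
import { AWSConfig, S3Client } from 'https://jslib.k6.io/aws/0.7.1/s3.js';

const awsConfig = new AWSConfig(
  __ENV.AWS_REGION,
  __ENV.AWS_ACCESS_KEY_ID,
  __ENV.AWS_SECRET_ACCESS_KEY
);

const s3 = new S3Client(awsConfig);

export default function () {
  // putObject should no longer raise the unsigned-headers error.
  s3.putObject('my-bucket', 'fix-check.txt', 'signed correctly');

  // Read the object back to confirm the upload went through.
  const obj = s3.getObject('my-bucket', 'fix-check.txt');
  check(obj, { 'round-trip succeeded': (o) => o.data === 'signed correctly' });
}
```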

Let us know if you are still hitting issues. And many thanks for reporting this.