Allow opening folders with the open function

#1

I’ve been writing file-upload tests for the past two days, and the restriction that open() only works in init code really hurts. I have a big set of files I want to upload, but I cannot do it dynamically; I have to define each file in the code. I’m testing a document management system, and it is all about files.

What I really need is to iterate over the files in a directory, upload each one, and run checks after each upload. If you enabled open() to open a directory and return the list of files, that would solve the issue. On the other hand, you could deploy the whole directory to the cluster (the stated reason for the open() restriction: https://github.com/loadimpact/k6/issues/557).
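For context, the pattern I’m forced into looks roughly like this, a minimal sketch where the `./uploads/` directory, the filenames, and the endpoint URL are all made-up assumptions (it runs under `k6 run`, not plain Node):

```javascript
import http from 'k6/http';
import { check } from 'k6';

// Every filename has to be written out up front -- open() cannot
// list a directory, so the set of files is fixed in the code.
const fileNames = ['doc-001.pdf', 'doc-002.pdf', 'doc-003.pdf'];

// open() is only allowed here, in the init context.
const files = fileNames.map((name) => ({
  name: name,
  data: open(`./uploads/${name}`, 'b'), // 'b' = read as binary
}));

export default function () {
  for (const file of files) {
    // Multipart upload of one file, followed by a check.
    const res = http.post('https://example.com/upload', {
      file: http.file(file.data, file.name),
    });
    check(res, { 'upload succeeded': (r) => r.status === 200 });
  }
}
```

Adding a new file to the test means editing the script, which is exactly what I’d like to avoid.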

Right now I execute k6 once per file and iterate over the files in a bash script, but I don’t feel this is the right solution: I cannot run in parallel and upload multiple files at the same time, and open() is essential for uploading files. Alternatively, I’m thinking about uploading those files somewhere, fetching each one with http.get(), and then uploading the content. That’s more infrastructure work, but it should work.

What do you think?

#2

Unfortunately, the restriction that files can only be opened in the init context is likely here to stay. As @robin explained in that issue (which I see you’ve also found… :wink: ), we need to know which files the script will need during its execution, so we can package them, together with all of its imported files, in a single bundle for cloud/clustered execution.

I’ve created a new issue for adding support for file listing and file-system navigation in the init context, though. That would improve the UX of working with files without breaking cloud execution. It also contains a few workarounds that might be useful to you, as well as links to related issues that could help with your use case once implemented.

#3

Are the workarounds currently listed in the issue recommended? If so, they might help us for now, e.g. to build a list of files that we upload at random.
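For the random-upload case, the picking logic itself is simple once the list exists in init code; a sketch (the filenames are made up):

```javascript
// Workaround sketch: the file list still has to be written out
// statically in the init context, but which file gets uploaded on
// each iteration can be chosen at random.
const fileNames = ['contract.pdf', 'invoice.pdf', 'report.pdf'];

// Return a uniformly random element of the list.
function pickRandom(list) {
  return list[Math.floor(Math.random() * list.length)];
}
```

Each VU iteration would then call pickRandom(fileNames) and upload the corresponding pre-opened file.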

Btw, great tool! It’s really easy to learn, and once you get the idea behind it, it’s easy to use! :slight_smile: Good job!

For cloud customers: do you offer onboarding? Or do you provide something like test reviews, just to check whether we’re using the tool properly? Most of your examples are really simple. A few tutorials on testing a real API endpoint would be great, to show how to define those kinds of tests.

#4

They are, insofar as there isn’t a better alternative at the moment and we don’t plan to make any of the mentioned APIs obsolete. To stay informed about upcoming improvements to the file APIs, you can follow issues 1005, 532, and 592, as well as the release notes of new k6 versions.

> For cloud customers: do you offer onboarding? Or do you provide something like test reviews, just to see if we use the tool properly? Most of your examples are really simple. API endpoint with a few tutorials, how to test it, would be great to see how to define such kind of tests

AFAIK, we don’t offer onboarding at the moment, but we do have a separate support channel for paying customers. And I’m not sure if you’ve stumbled upon it yet, but there is extensive documentation for the LoadImpact service located here, which includes a number of guides and examples, as well as information on how to interpret the results.