Benchmark parameters #229
Here are the properties I'd like to see in this setup:
I can see two ways to implement this, with competing trade-offs:
I'm going to proceed with implementing this using a ParameterValue model, given that the upsides to the other approach are minimal.
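As a minimal sketch of what a ParameterValue-based schema might look like, assuming Django-style models in the spirit of Codespeed's existing Benchmark and Result tables (the models, fields, and names below are illustrative placeholders, not the actual Codespeed schema):

```python
from django.db import models


class Benchmark(models.Model):
    # Simplified stand-in for Codespeed's Benchmark model.
    name = models.CharField(max_length=300)


class Parameter(models.Model):
    # A named axis along which a benchmark can vary, e.g. "threads" or "input size".
    benchmark = models.ForeignKey(Benchmark, related_name="parameters",
                                  on_delete=models.CASCADE)
    name = models.CharField(max_length=100)

    class Meta:
        unique_together = ("benchmark", "name")


class ParameterValue(models.Model):
    # A concrete value of a parameter, shared by all results measured with it.
    parameter = models.ForeignKey(Parameter, on_delete=models.CASCADE)
    value = models.CharField(max_length=100)


class Result(models.Model):
    # Simplified stand-in for Codespeed's Result model.
    benchmark = models.ForeignKey(Benchmark, on_delete=models.CASCADE)
    value = models.FloatField()
    parameter_values = models.ManyToManyField(ParameterValue, blank=True)
```

With something along these lines in place, results for one configuration could be selected with a query such as `Result.objects.filter(parameter_values__parameter__name="threads", parameter_values__value="4")` (again, purely illustrative names).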
Going for the second option probably makes sense. However, this warrants some discussion; you've brought up quite an important topic. Some heavy users of Codespeed were hitting its limitations when they wanted to store and compare results with many different "attributes" (your parameters) attached, and discussed possible implementations within their own companies. Have a read:

So I see this as the start of Codespeed 2.0; this is good thinking. Would you like to start a discussion thread there? We might want to outline a roadmap where Codespeed 1.0 is released, and then we restructure the project around parameters/attributes. Of course, an incremental approach is also an option, only before adding more and more features a bit of restructuring would do well.
The current codespeed model supports two kinds of groupings for benchmarks:
In my own use of codespeed, I'm finding that I want to write a particular benchmark, and then run it under various conditions within the same environment. Examples:
Currently I've been implementing these as separate benchmarks, but this uses up a lot of UI space, and doesn't convey the relationship between them. Parents are not sufficient either, as there isn't a parent benchmark as such, just a group of siblings.
I think a better model for representing these would be for a Result to specify a set of Parameters for the Benchmark it is using. Moving from this:
to something along these lines:
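The schema snippets from the original post are not reproduced here, but as a rough, hypothetical illustration of the move being described (the benchmark and parameter names are invented):

```python
# Before: every combination of conditions is a separate Benchmark, with the
# condition encoded in the benchmark name.
results_before = [
    {"benchmark": "sort-1-thread",  "value": 12.3},
    {"benchmark": "sort-4-threads", "value": 4.1},
]

# After: a single Benchmark, with the varying conditions attached to each
# Result as a set of parameter values.
results_after = [
    {"benchmark": "sort", "parameters": {"threads": 1}, "value": 12.3},
    {"benchmark": "sort", "parameters": {"threads": 4}, "value": 4.1},
]
```

The relationship between the two runs is then explicit in the data model rather than implied by a naming convention, which is what would allow the UI to group them and compare across parameter values.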