
Add confidence intervals for benchmarks #158

Open
Razican opened this issue Mar 20, 2023 · 9 comments
Labels: blocked (Needs an external code change), documentation (Improvements or additions to documentation)

Comments

@Razican
Member

Razican commented Mar 20, 2023

Currently, our benchmarks are a bit messy. They show some scattered points, and there is a huge amount of noise. Criterion gives us nice confidence intervals that we could use. Look into how to implement them; you might find inspiration here: chartjs/Chart.js#6899
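
For reference, Criterion stores its statistics, including confidence intervals, in per-benchmark `estimates.json` files. A minimal TypeScript sketch of pulling the interval out, assuming Criterion's usual on-disk layout (`target/criterion/<benchmark>/new/estimates.json`; the benchmark name here is hypothetical):

```ts
import { readFileSync } from "node:fs";

// Shape of the parts of Criterion's estimates.json we care about.
// Layout assumed from Criterion's on-disk output; verify against the
// Criterion version actually in use.
interface Estimate {
  point_estimate: number;
  confidence_interval: {
    confidence_level: number;
    lower_bound: number;
    upper_bound: number;
  };
}

interface Estimates {
  mean: Estimate;
  median: Estimate;
}

// Hypothetical benchmark name for illustration.
const raw = readFileSync(
  "target/criterion/richards/new/estimates.json",
  "utf8",
);
const estimates: Estimates = JSON.parse(raw);

const { lower_bound, upper_bound, confidence_level } =
  estimates.mean.confidence_interval;
console.log(`mean: ${estimates.mean.point_estimate} ns`);
console.log(`${confidence_level * 100}% CI: [${lower_bound}, ${upper_bound}] ns`);
```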

In the process, we might want to clean up that page a bit: stop using big "dots" in the graphs, and maybe show up to two graphs per row so that the page isn't so long. We could also add some explanations of what each benchmark checks.
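
As a rough illustration of the styling part (a sketch, not the page's actual code; the element ID and data are made up), Chart.js exposes the point size through the element options, while the two-per-row layout would be plain CSS around the canvases:

```ts
import { Chart } from "chart.js/auto";

// Hypothetical chart setup: shrink the big point markers so the lines
// read more clearly. Radius 0 would hide the points entirely.
new Chart(document.getElementById("bench-chart") as HTMLCanvasElement, {
  type: "line",
  data: {
    labels: ["run 1", "run 2", "run 3"],
    datasets: [{ label: "richards (ms)", data: [12.1, 11.8, 12.4] }],
  },
  options: {
    elements: {
      point: { radius: 1, hoverRadius: 4 },
    },
  },
});
```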

Razican added the good first issue (Good for newcomers) and documentation (Improvements or additions to documentation) labels on Mar 20, 2023
@Zeesky-code

Hello, I would love to work on this issue. Could I be assigned, please? :)

@HalidOdat
Member

@Zeesky-code Sure, go ahead! :)

@Zeesky-code

I'm sorry for the delay on this issue; I haven't had time to work on it yet. To avoid holding things up, I'll unassign myself and give someone else the opportunity to take over. Sorry for any inconvenience caused :(

Zeesky-code removed their assignment on Mar 27, 2023
@gavincrawford

Can I take a shot at it?

@HalidOdat
Member

HalidOdat commented May 28, 2024

@gavincrawford Sure, go ahead! :)

EDIT: The code for the benchmark visualization page can be found in this repo: https://github.com/boa-dev/boa-dev.github.io/

@jedel1043
Member

I'll transfer the issue to the other repo, since it doesn't make sense to have it here anymore.

jedel1043 transferred this issue from boa-dev/boa on May 28, 2024
@gavincrawford

Looking for some clarification on the "confidence intervals". Unless I'm missing something, they aren't present in the data hosted on the data repository, which means displaying them isn't possible without revising the code that updates those JSON files.
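
To make the blocker concrete, here is a hypothetical sketch of what each data point would need to carry for a confidence band to be drawable (field names are invented; today the files only carry point estimates):

```ts
// Hypothetical extended schema for one benchmark sample. The current
// updater only writes the point estimate, so lowerBound/upperBound
// would require changes to the code that produces these JSON files.
interface BenchSample {
  commit: string;      // commit the benchmark ran against
  value: number;       // point estimate (e.g. mean time)
  lowerBound?: number; // lower edge of the confidence interval
  upperBound?: number; // upper edge of the confidence interval
}
```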

Other than that, I've gotten the filler plugin up and running, using the absolute difference from the average as the fill parameter, just for demonstration purposes. Thoughts?
[Screenshot: fill demo]
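
For anyone picking this up later, the fill-between approach sketched above looks roughly like this in Chart.js v3+ (a sketch with made-up data; the filler plugin ships with Chart.js and fills one dataset toward another via the `fill` option):

```ts
import { Chart } from "chart.js/auto";

// Minimal confidence-band sketch: the upper- and lower-bound datasets
// are drawn without visible lines or points, and the lower one fills
// toward the previous dataset ("-1", i.e. the upper bound).
const labels = ["a1b2c3", "d4e5f6", "g7h8i9"]; // commit hashes (made up)

new Chart(document.getElementById("fill-demo") as HTMLCanvasElement, {
  type: "line",
  data: {
    labels,
    datasets: [
      {
        label: "upper bound",
        data: [12.4, 12.9, 12.1],
        pointRadius: 0,
        borderWidth: 0,
      },
      {
        label: "lower bound",
        data: [11.6, 12.1, 11.5],
        pointRadius: 0,
        borderWidth: 0,
        fill: "-1", // fill toward the "upper bound" dataset
        backgroundColor: "rgba(54, 162, 235, 0.2)",
      },
      {
        label: "mean",
        data: [12.0, 12.5, 11.8],
        borderColor: "rgb(54, 162, 235)",
      },
    ],
  },
});
```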

@gavincrawford

Stale. I don't think this issue can be completed unless someone clarifies where this data should be pulled from, since it isn't present in the source we're using right now. I'm unassigning myself until further clarification.

gavincrawford removed their assignment on Jun 12, 2024
@jedel1043
Member

I'll mark it as blocked: I realized we previously had confidence intervals, but after moving to another benchmarking suite we no longer have that data.

jedel1043 added the blocked (Needs an external code change) label and removed the good first issue (Good for newcomers) label on Jun 13, 2024