A repo for collaboratively building small benchmarks to compare languages. If you have a suggestion for improvement: PR! If you want to add a language: PR!
To run one of the benchmarks:

- `cd` into the desired benchmark directory (e.g. `cd loops`)
- Compile by running `../compile.sh`
- Run via `../run.sh`. You should see output something like:
```
$ ../run.sh
Benchmarking Zig
Benchmark 1: ./zig/code 40
  Time (mean ± σ):     513.9 ms ±   2.9 ms    [User: 504.5 ms, System: 2.6 ms]
  Range (min … max):   510.6 ms … 516.2 ms    3 runs

Benchmarking C
Benchmark 1: ./c/code 40
  Time (mean ± σ):     514.0 ms ±   1.1 ms    [User: 505.6 ms, System: 2.8 ms]
  Range (min … max):   513.2 ms … 515.2 ms    3 runs

Benchmarking Rust
Benchmark 1: ./rust/target/release/code 40
  Time (mean ± σ):     514.1 ms ±   2.0 ms    [User: 504.6 ms, System: 3.1 ms]
  Range (min … max):   512.4 ms … 516.3 ms    3 runs

...
```
- For good measure, execute `../clean.sh` when finished.
Hyperfine is used to warm, execute, and time the runs of the programs.
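The exact invocation lives in each benchmark's `run.sh`; a minimal sketch of the kind of command it runs is shown below. The warmup count and the argument `40` are assumptions inferred from the sample output above, not the script's literal contents.

```sh
# Hypothetical run.sh entry: warm the C implementation once (warmup count is a guess),
# then take 3 timed runs, matching the "3 runs" shown in the sample output.
echo "Benchmarking C"
hyperfine --warmup 1 --runs 3 './c/code 40'
```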
To add a language:
- Select the benchmark directory you want to add to (e.g. `cd loops`)
- Create a new subdirectory for the language (e.g. `mkdir rust`)
- Implement the code in the appropriately named file (e.g. `code.rs`)
- If the language is compiled, add the appropriate commands to `../compile.sh` and `../clean.sh`
- Add the appropriate line to `../run.sh` (see the sketch after this list)
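For a compiled language like Rust, the script additions might look roughly like the sketch below. The cargo layout and the binary name `code` are assumptions based on the `./rust/target/release/code` path in the sample output; mirror whatever pattern the existing entries in the scripts use.

```sh
# ../compile.sh — hypothetical addition: build the Rust entry in release mode
(cd rust && cargo build --release)

# ../clean.sh — hypothetical addition: remove the Rust build artifacts
rm -rf rust/target

# ../run.sh — hypothetical addition: benchmark the resulting binary with hyperfine
echo "Benchmarking Rust"
hyperfine --runs 3 './rust/target/release/code 40'
```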
You are also welcome to add new top-level benchmark directories.
Each benchmark exists in a subdirectory with the corresponding name:
Several visuals have been published based on the work here. More will likely be added in the future, as this repository improves: