defmodule Benchmark do
  def main do
    module_function = Foo.main
    _ = module_function
  end

  def run(params) do
    number_of_trials = Enum.at(params, 0)
    total_warm_up_cycles = Enum.at(params, 1)
    IO.puts("Warming up . . . initiating first trial of #{number_of_trials}.")
    start_time = System.monotonic_time(:millisecond)
    main = main()
    _ = main
    end_time = System.monotonic_time(:millisecond)
    elapsed_time = (end_time - start_time) / 1000
    results = %{time: [elapsed_time], acc: 1}
    run(results, number_of_trials, totaL_warm_up_cycles)
  end

  def run(results, number_of_trials, total_warm_up_cycles) do
    start_time = System.monotonic_time(:millisecond)
    main = main()
    _ = main
    end_time = System.monotonic_time(:millisecond)
    new_time = (end_time - start_time) / 1000
    updated_acc = results.acc + 1

    average_time =
      cond do
        results.acc == total_warm_up_cycles -> new_time
        results.acc > total_warm_up_cycles -> Enum.sum(results.time) / (updated_acc - (total_warm_up_cycles + 1))
        results.acc < total_warm_up_cycles -> []
      end

    results =
      if results.acc >= total_warm_up_cycles do
        Enum.map(results, fn _map ->
          %{
            time: List.insert_at(results.time, 0, new_time),
            acc: updated_acc,
            average_time: average_time
          }
        end)
        |> List.first
      else
        Enum.map(results, fn _map ->
          %{
            time: [],
            acc: updated_acc,
            average_time: 0
          }
        end)
        |> List.first
      end

    cond do
      results.acc < (total_warm_up_cycles + 1) ->
        IO.puts("Warming up . . . Trial number #{results.acc} of #{number_of_trials}.")
        run(module_function, results, number_of_trials, total_warm_up_cycles)

      results.acc == (total_warm_up_cycles + 1) ->
        IO.puts("Average execution time after #{results.acc} trials of #{number_of_trials}: #{Float.floor(results.average_time, 3)} seconds.")
        run(module_function, results, number_of_trials, total_warm_up_cycles)

      results.acc < number_of_trials ->
        IO.puts("Average execution time after #{results.acc} trials of #{number_of_trials}: #{Float.floor(results.average_time, 3)} seconds.")
        run(module_function, results, number_of_trials, total_warm_up_cycles)

      results.acc == number_of_trials ->
        IO.puts("Final trial - Average execution time: #{Float.floor(results.average_time, 3)} seconds.")
    end
  end
end
This is designed to accept the following command:
iex(1)> Benchmark.run([20, 5])
Which would then output:
Warming up . . . initiating first trial of 20.
Warming up . . . Trial number 2 of 20.
Warming up . . . Trial number 3 of 20.
Warming up . . . Trial number 4 of 20.
Warming up . . . Trial number 5 of 20.
Average execution time after 6 trials of 20: 0.754 seconds.
Average execution time after 7 trials of 20: 0.754 seconds.
Average execution time after 8 trials of 20: 0.549 seconds.
Average execution time after 9 trials of 20: 0.48 seconds.
Average execution time after 10 trials of 20: 0.448 seconds.
Average execution time after 11 trials of 20: 0.496 seconds.
Average execution time after 12 trials of 20: 0.471 seconds.
Average execution time after 13 trials of 20: 0.499 seconds.
Average execution time after 14 trials of 20: 0.485 seconds.
Average execution time after 15 trials of 20: 0.47 seconds.
Average execution time after 16 trials of 20: 0.457 seconds.
Average execution time after 17 trials of 20: 0.447 seconds.
Average execution time after 18 trials of 20: 0.438 seconds.
Average execution time after 19 trials of 20: 0.431 seconds.
Final trial - Average execution time: 0.427 seconds.
:ok
When attempting to compile this module, however, it triggers the following error:
== Compilation error in file lib/Benchmark.ex ==
** (CompileError) lib/Benchmark.ex:24: undefined function totaL_warm_up_cycles/0
It looks like the assignment total_warm_up_cycles = Enum.at(params, 1)
is being ignored. If so, why does that happen, and how can it be resolved?
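In case it helps narrow things down, I can reproduce what looks like the same error with a stripped-down module; a one-character case difference in the name I reference (totaL vs total) is enough to trigger it. (Compiling from a string here only so the iex session survives — the names Repro/go are just for this repro:)

```elixir
source = """
defmodule Repro do
  def go do
    total = 1
    totaL + 1
  end
end
"""

try do
  # Fails the same way my real module does: undefined function totaL/0
  Code.compile_string(source)
  IO.puts("compiled (not expected)")
rescue
  CompileError ->
    IO.puts("CompileError raised, same as in Benchmark")
end
```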
Ideally I’d love to get this to run with something like iex(2)> Benchmark.run([MODULE.FUNCTION, 20, 5]),
which would make this more dynamic, since the particular module function to be benchmarked wouldn’t have to be hardcoded. After spending dozens of hours searching far and wide, though, I can’t find any resources that cover exactly how this is done. Is it even possible?
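For context, here is a rough sketch of the kind of thing I'm picturing, based on my (possibly wrong) understanding of the &Module.function/arity capture syntax. The Timer module and Foo.main are hypothetical names, not part of my actual code:

```elixir
defmodule Timer do
  # Times a single call of any zero-arity function passed in as a value,
  # returning the elapsed time in seconds.
  def time(fun) when is_function(fun, 0) do
    start_time = System.monotonic_time(:millisecond)
    fun.()
    end_time = System.monotonic_time(:millisecond)
    (end_time - start_time) / 1000
  end
end

# Intended usage with a captured named function:
#   Timer.time(&Foo.main/0)
# or with an anonymous function:
elapsed = Timer.time(fn -> Enum.sum(1..100_000) end)
IO.puts("Elapsed: #{elapsed} seconds")
```

Is passing the function around as a value like this the idiomatic way to avoid hardcoding Foo.main, or is there a better mechanism?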
Thanks for all your help and opinions!