TechEmpower benchmarks

This will get moved to the TechEmpower mega thread, but if you look at the code you’ll see that it’s holding the connection for all of the queries.

Elixir/Ecto doesn’t really NEED to solve this one, because Ecto hands the connection back after every query in case the BEAM moves on to another process that needs it.

It could be done with Repo.transaction or Ecto.Multi, IIRC.
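For context, a rough sketch of what holding one connection across several queries looks like with `Repo.transaction` (everything here is hypothetical: `MyApp.Repo` stands in for any Ecto repo, `MyApp.Post` for any schema):

```elixir
defmodule CheckoutSketch do
  # Hypothetical sketch, not the benchmark code: assumes an Ecto repo
  # `MyApp.Repo` and a `MyApp.Post` schema with a :comments assoc.
  # Inside the transaction fun, every query runs on the one checked-out
  # connection; Ecto only returns it to the pool at commit/rollback.
  def post_with_comments(post_id) do
    MyApp.Repo.transaction(fn ->
      post = MyApp.Repo.get!(MyApp.Post, post_id)
      MyApp.Repo.preload(post, :comments)
    end)
  end
end
```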

I spent some time looking at the source for the Ruby, Elixir and Go examples on there. They all have some very specific query tuning going on.


Yeah, see the other threads…

The benchmark is low concurrency (256), and the top frameworks don’t use a DB pool: e.g. 256 DB connections and 256 keepalive clients…

So these frameworks do 256 DB connection checkouts for the entire benchmark (the code is weird, and some of them run 50-60 queries per request), while Ecto/Phoenix does an individual checkout per query, so Phoenix does thousands and thousands of DB connection checkouts…

The benchmark setup is not real-world production, judging from anything I know about or have ever heard about running in production.

Also, these tests are usually heavy on JSON serialization. Does Phoenix use the newer, faster Jason, or even the native jiffy? (The Ruby JSON serializer is written in C, and most likely the .NET one is too.)

Ecto 3.0 should be getting the possibility of explicit checkout of the DB connection, which will be beneficial both for these benchmarks and for query-heavy real-world endpoints (real world: I do 8-10 DB queries on channel join in my app, so holding the connection might be beneficial).
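A sketch of what that explicit checkout could look like for the channel-join case, based on the `Repo.checkout/2` API Ecto 3 was proposing (`MyApp.Repo`, the schemas and the queries are all made up):

```elixir
defmodule ChannelJoinSketch do
  # Hypothetical sketch: assumes an Ecto 3 repo `MyApp.Repo`.
  # Repo.checkout/2 keeps a single pool connection checked out for
  # the whole function, so a burst of queries on channel join does
  # not pay one pool checkout per query.
  def load_join_data(user_id) do
    MyApp.Repo.checkout(fn ->
      user = MyApp.Repo.get!(MyApp.User, user_id)
      rooms = MyApp.Repo.all(MyApp.Room)
      {user, rooms}
    end)
  end
end
```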

Using the Jason JSON serializer should also roughly 2x the JSON part of the benchmark…
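For reference, pointing Phoenix at a different JSON encoder is a one-line config change, along these lines (a sketch assuming Phoenix 1.3-era `:format_encoders` and `jason` in the project’s deps):

```elixir
# config/config.exs
import Config

# Render JSON responses with Jason instead of the default Poison
# (assumes {:jason, "~> 1.0"} is listed in mix.exs deps).
config :phoenix, :format_encoders, json: Jason
```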

Honestly, you should not use or give any value to the TechEmpower benchmarks…


This benchmark again. :expressionless:

Just because it is fun, here are the results of a recent test I did on a big server, across a variety of frameworks, testing just and only pure throughput. As always, take all benchmarks (especially TechEmpower’s, and this one too) with even less than a grain of salt, but if anyone likes looking at fun benchmarks, here’s another:

╰─➤ …/stats.exs -w 1 -d 3 _
Total Cores: 16
Concurrent Connections: 1000

Processing servers:

  • bin/server_cpp_evhtp
  • bin/server_crystal_kemal
  • bin/server_crystal_lucky
  • bin/server_crystal_raze
  • bin/server_crystal_router_cr
  • bin/server_csharp_aspnetcore
  • bin/server_elixir_phoenix
  • bin/server_elixir_plug
  • bin/server_go_echo
  • bin/server_go_fasthttprouter
  • bin/server_go_gin
  • bin/server_go_gorilla_mux
  • bin/server_go_iris
  • bin/server_nim_jester
  • bin/server_nim_mofuw
  • bin/server_node_clusterexpress
  • bin/server_node_clusterpolka
  • bin/server_node_express
  • bin/server_node_polka
  • bin/server_python_django
  • bin/server_python_flask
  • bin/server_python_flask.py
  • bin/server_python_japronto
  • bin/server_python_sanic
  • bin/server_ruby_rack-routing
  • bin/server_ruby_rails
  • bin/server_ruby_roda
  • bin/server_ruby_sinatra
  • bin/server_rust_iron
  • bin/server_rust_nickel
  • bin/server_rust_rocket


Path URL Errors Total Requests Count Total Requests/s Total Requests Throughput Total Throughput/s Req/s Avg Req/s Stdev Req/s Max Req/s +/- Latency Avg Latency Stdev Latency Max Latency +/- 50% 75% 90% 99%
bin/server_cpp_evhtp http://127.0.0.1:3000/ 0 2156132 705937.67 131.60MB 43.09MB 90.19k 27.93k 127.42k 65.83% 2.27ms 3.58ms 205.00ms 88.00% 737.00us 2.23ms 6.91ms 15.64ms
bin/server_cpp_evhtp http://127.0.0.1:3000/user 0 2045693 672015.9 124.86MB 41.02MB 85.52k 23.76k 127.92k 57.50% 2.12ms 3.21ms 216.46ms 88.70% 751.00us 2.40ms 5.81ms 13.83ms
bin/server_cpp_evhtp http://127.0.0.1:3000/user/0 0 2029523 663386.33 181.94MB 59.47MB 84.87k 22.85k 121.17k 57.50% 2.10ms 3.09ms 204.78ms 88.46% 749.00us 2.29ms 5.80ms 14.08ms
bin/server_crystal_kemal http://127.0.0.1:3000/ 0 180744 59355.88 18.62MB 6.11MB 7.57k 743.38 8.70k 80.42% 16.65ms 4.70ms 59.28ms 88.20% 13.89ms 20.80ms 21.63ms 29.21ms
bin/server_crystal_kemal http://127.0.0.1:3000/user 0 141240 46747.22 14.55MB 4.81MB 5.91k 0.88k 8.71k 77.08% 20.20ms 8.01ms 227.16ms 94.81% 20.71ms 24.05ms 24.57ms 31.00ms
bin/server_crystal_kemal http://127.0.0.1:3000/user/0 0 121486 39714.25 15.99MB 5.23MB 5.08k 631.49 6.71k 73.75% 23.51ms 6.13ms 230.42ms 75.21% 25.36ms 26.08ms 27.92ms 33.63ms
bin/server_crystal_lucky http://127.0.0.1:3000/ 0 176189 57726.06 13.11MB 4.29MB 7.37k 578.14 8.81k 74.17% 16.46ms 3.95ms 30.37ms 65.87% 15.09ms 20.01ms 20.84ms 26.10ms
bin/server_crystal_lucky http://127.0.0.1:3000/user 0 149844 49541.12 11.15MB 3.69MB 6.27k 362.60 7.59k 76.67% 19.56ms 3.95ms 31.88ms 61.46% 21.22ms 22.41ms 22.94ms 28.06ms
bin/server_crystal_lucky http://127.0.0.1:3000/user/0 0 153565 50761.87 17.28MB 5.71MB 6.43k 781.17 8.32k 66.25% 18.63ms 4.06ms 29.49ms 61.24% 19.96ms 21.87ms 22.26ms 26.06ms
bin/server_crystal_raze http://127.0.0.1:3000/ 0 233135 76288.13 13.78MB 4.51MB 9.76k 786.15 12.50k 72.92% 12.29ms 3.89ms 220.79ms 82.95% 11.20ms 12.19ms 18.16ms 18.82ms
bin/server_crystal_raze http://127.0.0.1:3000/user 0 185399 61392.67 10.96MB 3.63MB 7.76k 653.98 9.99k 72.08% 15.50ms 3.76ms 216.45ms 55.24% 13.41ms 19.38ms 20.00ms 22.56ms
bin/server_crystal_raze http://127.0.0.1:3000/user/0 0 175584 57626.32 15.41MB 5.06MB 7.35k 694.44 8.84k 76.67% 16.46ms 3.36ms 26.91ms 53.15% 18.13ms 19.92ms 20.28ms 21.61ms
bin/server_crystal_router_cr http://127.0.0.1:3000/ 0 218501 71695.57 12.92MB 4.24MB 9.15k 683.06 10.11k 80.42% 13.71ms 3.52ms 37.88ms 84.16% 11.56ms 16.90ms 17.55ms 23.78ms
bin/server_crystal_router_cr http://127.0.0.1:3000/user 0 198660 65673.96 11.75MB 3.88MB 8.31k 1.48k 12.28k 76.67% 14.58ms 11.59ms 226.51ms 99.00% 12.43ms 18.04ms 19.46ms 26.10ms
bin/server_crystal_router_cr http://127.0.0.1:3000/user/0 0 182777 60266.05 16.04MB 5.29MB 7.65k 838.48 8.79k 76.67% 16.02ms 4.21ms 50.41ms 87.15% 13.38ms 19.25ms 20.32ms 26.49ms
bin/server_csharp_aspnetcore http://127.0.0.1:3000/ 0 378524 124730.41 38.26MB 12.61MB 15.83k 3.70k 25.83k 68.75% 9.63ms 25.53ms 457.99ms 98.18% 6.95ms 10.26ms 14.09ms 133.66ms
bin/server_csharp_aspnetcore http://127.0.0.1:3000/user 0 333324 109888.1 33.70MB 11.11MB 13.95k 4.26k 27.28k 72.50% 8.38ms 12.69ms 236.92ms 96.51% 7.77ms 10.60ms 15.26ms 31.31ms
bin/server_csharp_aspnetcore http://127.0.0.1:3000/user/0 0 306721 100775.99 41.24MB 13.55MB 12.83k 3.70k 27.89k 72.08% 9.41ms 14.55ms 248.14ms 96.81% 8.12ms 11.05ms 17.44ms 43.06ms
bin/server_elixir_phoenix http://127.0.0.1:3000/ 0 392165 127368.81 53.48MB 17.37MB 16.36k 2.79k 24.13k 69.17% 7.96ms 13.60ms 258.86ms 96.98% 7.08ms 11.53ms 16.03ms 28.81ms
bin/server_elixir_phoenix http://127.0.0.1:3000/user 0 342806 112356.8 46.75MB 15.32MB 14.31k 2.97k 22.12k 67.50% 9.11ms 13.55ms 248.43ms 95.16% 8.39ms 13.53ms 18.73ms 33.42ms
bin/server_elixir_phoenix http://127.0.0.1:3000/user/0 0 361732 117508.71 59.68MB 19.39MB 15.08k 3.51k 30.93k 67.50% 9.23ms 15.68ms 247.18ms 95.77% 7.76ms 13.64ms 19.63ms 41.16ms
bin/server_elixir_plug http://127.0.0.1:3000/ 0 405226 131613.59 55.26MB 17.95MB 16.93k 2.30k 24.00k 70.00% 8.06ms 13.98ms 228.96ms 97.61% 7.09ms 10.97ms 15.15ms 30.30ms
bin/server_elixir_plug http://127.0.0.1:3000/user 0 392856 129104.6 53.58MB 17.61MB 16.38k 2.71k 22.03k 68.75% 8.71ms 14.30ms 231.07ms 96.33% 7.91ms 12.49ms 17.58ms 34.64ms
bin/server_elixir_plug http://127.0.0.1:3000/user/0 0 374981 122544.93 61.51MB 20.10MB 15.66k 2.20k 21.16k 67.50% 8.32ms 13.38ms 225.04ms 97.03% 7.33ms 11.59ms 16.01ms 30.10ms
bin/server_go_echo http://127.0.0.1:3000/ 0 819571 264847.72 90.67MB 29.30MB 34.04k 4.84k 52.41k 70.42% 3.58ms 7.19ms 215.66ms 97.69% 2.50ms 4.45ms 6.57ms 14.80ms
bin/server_go_echo http://127.0.0.1:3000/user 0 807222 265386.98 89.30MB 29.36MB 33.64k 6.63k 53.95k 69.17% 4.34ms 11.17ms 221.52ms 97.64% 2.41ms 4.65ms 7.67ms 23.99ms
bin/server_go_echo http://127.0.0.1:3000/user/0 0 785742 255415.71 109.40MB 35.56MB 32.68k 4.90k 51.22k 72.92% 3.48ms 4.93ms 204.84ms 95.48% 2.61ms 4.66ms 6.75ms 11.88ms
bin/server_go_fasthttprouter http://127.0.0.1:3000/ 0 1621931 527439.74 143.85MB 46.78MB 67.60k 14.16k 115.33k 69.17% 2.04ms 6.78ms 207.12ms 98.96% 1.16ms 1.97ms 3.59ms 8.91ms
bin/server_go_fasthttprouter http://127.0.0.1:3000/user 0 1526712 502096.08 135.41MB 44.53MB 63.65k 13.35k 103.73k 65.83% 1.92ms 4.29ms 209.18ms 97.39% 1.23ms 2.14ms 3.68ms 8.49ms
bin/server_go_fasthttprouter http://127.0.0.1:3000/user/0 0 1444057 465506.19 225.85MB 72.81MB 61.11k 14.88k 117.29k 70.34% 1.85ms 2.34ms 205.80ms 91.16% 1.22ms 2.20ms 3.91ms 8.65ms
bin/server_go_gin http://127.0.0.1:3000/ 0 612884 199074.99 67.80MB 22.02MB 25.51k 4.58k 46.50k 69.17% 4.97ms 8.14ms 208.75ms 95.82% 3.67ms 6.01ms 9.10ms 23.62ms
bin/server_go_gin http://127.0.0.1:3000/user 0 591529 193788.08 65.44MB 21.44MB 24.62k 5.02k 39.03k 64.58% 6.66ms 14.53ms 411.65ms 95.60% 3.67ms 6.91ms 13.00ms 46.83ms
bin/server_go_gin http://127.0.0.1:3000/user/0 0 607260 197618.21 84.55MB 27.52MB 25.27k 4.29k 37.62k 70.42% 5.66ms 11.72ms 222.35ms 96.43% 3.56ms 6.26ms 10.21ms 35.96ms
bin/server_go_gorilla_mux http://127.0.0.1:3000/ 0 775874 253425.31 85.83MB 28.04MB 32.35k 4.69k 52.35k 68.75% 3.98ms 9.85ms 215.92ms 98.89% 2.68ms 4.52ms 6.39ms 14.63ms
bin/server_go_gorilla_mux http://127.0.0.1:3000/user 0 761568 249912.71 84.25MB 27.65MB 31.72k 6.29k 52.76k 73.75% 4.17ms 8.70ms 212.63ms 96.56% 2.59ms 4.84ms 7.58ms 22.54ms
bin/server_go_gorilla_mux http://127.0.0.1:3000/user/0 0 678855 219975.73 93.23MB 30.21MB 28.27k 3.50k 43.58k 68.75% 4.52ms 8.56ms 209.87ms 97.74% 3.27ms 5.41ms 7.87ms 18.69ms
bin/server_go_iris http://127.0.0.1:3000/ 0 925624 301753.59 102.40MB 33.38MB 38.49k 7.09k 67.67k 72.08% 3.84ms 11.59ms 211.21ms 98.76% 2.21ms 3.86ms 6.11ms 19.89ms
bin/server_go_iris http://127.0.0.1:3000/user 0 908462 296528.82 100.50MB 32.80MB 37.82k 6.18k 55.85k 68.33% 3.17ms 4.30ms 206.58ms 93.73% 2.29ms 4.05ms 6.31ms 13.02ms
bin/server_go_iris http://127.0.0.1:3000/user/0 0 885129 287740.15 123.24MB 40.06MB 36.80k 6.14k 71.85k 72.92% 4.11ms 12.50ms 212.94ms 98.70% 2.35ms 4.00ms 6.32ms 22.05ms
bin/server_nim_jester http://127.0.0.1:3000/ 0 104348 34268.04 7.66MB 2.52MB 4.37k 1.20k 8.59k 72.50% 67.11ms 202.64ms 1.98s 93.88% 18.81ms 22.42ms 25.69ms 1.16s
bin/server_nim_jester http://127.0.0.1:3000/user 0 86728 28680.98 6.37MB 2.11MB 3.63k 1.29k 6.85k 63.33% 67.98ms 198.61ms 1.97s 94.08% 21.83ms 25.40ms 28.05ms 1.14s
bin/server_nim_jester http://127.0.0.1:3000/user/0 0 98151 32163.86 10.02MB 3.28MB 4.11k 1.08k 6.54k 61.67% 68.95ms 203.43ms 1.97s 93.77% 19.97ms 23.47ms 26.09ms 1.16s
bin/server_nim_mofuw http://127.0.0.1:3000/ 0 1671295 539182.73 218.36MB 70.45MB 70.07k 22.02k 105.93k 59.41% 2.53ms 3.23ms 43.80ms 86.57% 0.88ms 3.43ms 6.93ms 14.61ms
bin/server_nim_mofuw http://127.0.0.1:3000/user 0 1633107 532965.14 213.37MB 69.63MB 68.21k 23.16k 103.35k 55.00% 2.60ms 3.31ms 41.45ms 86.72% 0.91ms 3.53ms 7.04ms 14.93ms
bin/server_nim_mofuw http://127.0.0.1:3000/user/0 0 1501165 484228.34 239.08MB 77.12MB 63.61k 23.52k 120.07k 56.78% 2.58ms 3.10ms 37.81ms 86.16% 1.00ms 3.53ms 6.83ms 13.92ms
bin/server_node_clusterexpress http://127.0.0.1:3000/ 0 318062 102691.44 49.14MB 15.87MB 13.15k 1.16k 17.59k 78.24% 8.74ms 5.69ms 222.79ms 85.92% 7.53ms 9.38ms 15.07ms 21.74ms
bin/server_node_clusterexpress http://127.0.0.1:3000/user 0 317782 103023.11 49.10MB 15.92MB 13.10k 0.91k 16.68k 77.92% 8.99ms 5.80ms 138.44ms 85.88% 7.59ms 10.93ms 15.19ms 20.04ms
bin/server_node_clusterexpress http://127.0.0.1:3000/user/0 0 299576 96774.53 54.85MB 17.72MB 12.41k 0.95k 18.04k 80.83% 9.83ms 7.34ms 161.71ms 95.52% 7.76ms 11.66ms 15.51ms 23.78ms
bin/server_node_clusterpolka http://127.0.0.1:3000/ 0 645788 209503.08 60.97MB 19.78MB 26.90k 9.70k 60.35k 56.67% 4.48ms 6.63ms 226.78ms 94.30% 3.20ms 5.58ms 7.77ms 19.59ms
bin/server_node_clusterpolka http://127.0.0.1:3000/user 0 593863 191684.9 56.07MB 18.10MB 24.64k 8.42k 56.39k 70.00% 4.70ms 3.70ms 68.53ms 83.68% 3.63ms 7.06ms 7.72ms 18.28ms
bin/server_node_clusterpolka http://127.0.0.1:3000/user/0 0 570068 183965.9 70.13MB 22.63MB 23.65k 7.04k 50.45k 70.42% 6.09ms 9.73ms 175.25ms 96.31% 4.00ms 7.13ms 9.27ms 46.69ms
bin/server_node_express http://127.0.0.1:3000/ 0 34993 11511.97 5.41MB 1.78MB 1.46k 280.53 2.04k 68.33% 60.17ms 33.93ms 570.70ms 97.65% 66.44ms 70.94ms 71.24ms 200.86ms
bin/server_node_express http://127.0.0.1:3000/user 0 33819 11198.21 5.22MB 1.73MB 1.42k 214.86 2.02k 72.92% 84.16ms 99.75ms 1.08s 95.66% 82.90ms 87.92ms 88.56ms 611.26ms
bin/server_node_express http://127.0.0.1:3000/user/0 0 32196 10591.63 5.90MB 1.94MB 1.35k 184.29 1.96k 71.67% 75.17ms 17.14ms 399.10ms 94.00% 78.91ms 80.32ms 80.56ms 105.86ms
bin/server_node_polka http://127.0.0.1:3000/ 0 76015 24987.94 7.18MB 2.36MB 3.18k 343.50 4.29k 75.42% 37.98ms 37.48ms 512.75ms 97.79% 37.73ms 38.35ms 40.62ms 250.85ms
bin/server_node_polka http://127.0.0.1:3000/user 0 70009 23143.53 6.61MB 2.19MB 2.93k 0.90k 7.92k 78.33% 40.43ms 47.99ms 548.47ms 96.88% 40.91ms 42.31ms 45.05ms 304.40ms
bin/server_node_polka http://127.0.0.1:3000/user/0 0 65674 21601.55 8.08MB 2.66MB 2.75k 340.76 3.97k 70.42% 43.86ms 35.44ms 548.26ms 97.57% 43.59ms 44.37ms 46.98ms 236.18ms
bin/server_python_django http://127.0.0.1:3000/ 2009 2009 662.34 478.71KB 157.82KB 131.60 93.46 390.00 70.00% 189.21ms 126.89ms 1.67s 91.19% 161.77ms 162.11ms 168.71ms 648.35ms
bin/server_python_django http://127.0.0.1:3000/user 1390 1390 457.19 331.21KB 108.94KB 78.47 55.33 232.00 65.24% 251.68ms 214.92ms 1.82s 80.79% 169.18ms 191.07ms 653.37ms 990.28ms
bin/server_python_django http://127.0.0.1:3000/user/0 1739 1739 574.2 414.37KB 136.82KB 84.44 78.37 505.00 86.87% 193.83ms 154.23ms 1.67s 90.97% 167.15ms 168.41ms 323.15ms 661.44ms
bin/server_python_flask http://127.0.0.1:3000/ 2400 2400 792.29 571.88KB 188.79KB 103.10 77.09 310.00 74.89% 137.11ms 37.08ms 982.96ms 74.38% 150.92ms 151.64ms 152.36ms 203.48ms
bin/server_python_flask http://127.0.0.1:3000/user 1917 1917 634.62 456.79KB 151.22KB 100.57 98.21 424.00 82.80% 179.99ms 143.72ms 1.82s 92.96% 160.01ms 161.39ms 173.67ms 650.35ms
bin/server_python_flask http://127.0.0.1:3000/user/0 2239 2239 739.73 533.51KB 176.26KB 98.72 44.05 171.00 66.37% 115.38ms 58.76ms 937.90ms 95.49% 106.06ms 106.54ms 107.33ms 330.52ms
bin/server_python_flask.py http://127.0.0.1:3000/ 0 6570 2157.47 0.96MB 322.36KB 420.35 211.76 0.87k 66.89% 54.86ms 84.03ms 1.71s 96.47% 45.96ms 46.33ms 46.55ms 530.84ms
bin/server_python_flask.py http://127.0.0.1:3000/user 0 3863 1247.86 577.19KB 186.45KB 241.25 135.24 555.00 67.11% 95.27ms 160.02ms 1.66s 93.09% 49.03ms 56.90ms 69.50ms 683.14ms
bin/server_python_flask.py http://127.0.0.1:3000/user/0 0 5994 1945.56 1.05MB 347.70KB 277.92 152.44 613.00 61.06% 54.43ms 57.86ms 882.16ms 94.98% 44.32ms 49.48ms 54.78ms 301.67ms
bin/server_python_japronto http://127.0.0.1:3000/ 0 351606 113929.6 26.49MB 8.58MB 15.77k 4.32k 60.80k 95.98% 7.94ms 1.42ms 24.67ms 91.49% 8.30ms 8.38ms 8.51ms 9.04ms
bin/server_python_japronto http://127.0.0.1:3000/user 0 341686 110321.4 25.74MB 8.31MB 14.80k 3.30k 36.56k 96.55% 8.47ms 1.50ms 18.28ms 91.15% 8.72ms 8.95ms 9.34ms 10.04ms
bin/server_python_japronto http://127.0.0.1:3000/user/0 0 312898 100973.34 32.53MB 10.50MB 14.04k 6.19k 71.09k 95.98% 8.93ms 2.38ms 19.28ms 86.98% 9.55ms 10.01ms 10.24ms 12.86ms
bin/server_python_sanic http://127.0.0.1:3000/ 0 580624 187388.49 65.89MB 21.27MB 27.42k 11.10k 72.61k 66.98% 5.66ms 6.19ms 62.27ms 85.36% 3.56ms 8.12ms 14.45ms 26.23ms
bin/server_python_sanic http://127.0.0.1:3000/user 0 554967 179550.77 62.98MB 20.38MB 25.93k 10.25k 100.85k 78.50% 5.34ms 5.09ms 59.48ms 87.21% 3.70ms 7.43ms 11.47ms 24.14ms
bin/server_python_sanic http://127.0.0.1:3000/user/0 0 567336 183433.45 80.62MB 26.07MB 26.80k 10.78k 84.82k 67.92% 5.28ms 5.13ms 44.22ms 85.32% 3.68ms 7.58ms 12.10ms 23.16ms
bin/server_ruby_rack-routing http://127.0.0.1:3000/ 0 15179 4951.28 563.28KB 183.74KB 635.38 339.65 1.91k 74.58% 3.15ms 5.25ms 89.85ms 87.96% 244.00us 4.18ms 9.57ms 23.86ms
bin/server_ruby_rack-routing http://127.0.0.1:3000/user 0 13006 4272.23 482.64KB 158.54KB 622.16 332.39 1.99k 70.48% 3.69ms 5.45ms 63.49ms 87.51% 1.35ms 5.17ms 10.45ms 24.99ms
bin/server_ruby_rack-routing http://127.0.0.1:3000/user/0 0 12338 4011.15 819.32KB 266.37KB 516.25 187.29 1.32k 69.17% 3.88ms 5.03ms 51.09ms 86.64% 1.99ms 5.73ms 10.35ms 22.49ms
bin/server_ruby_rails http://127.0.0.1:3000/ 0 1875 613.3 320.43KB 104.81KB 125.37 9.19 141.00 84.00% 7.99ms 1.32ms 29.69ms 90.56% 7.79ms 8.14ms 8.87ms 10.92ms
bin/server_ruby_rails http://127.0.0.1:3000/user 0 1904 628.44 325.39KB 107.40KB 127.39 10.26 151.00 70.67% 7.87ms 1.01ms 22.48ms 85.24% 7.79ms 8.25ms 8.79ms 10.04ms
bin/server_ruby_rails http://127.0.0.1:3000/user/0 0 1657 542.23 480.59KB 157.27KB 110.92 9.06 131.00 50.67% 9.04ms 1.34ms 31.06ms 91.67% 8.85ms 9.33ms 10.12ms 11.67ms
bin/server_ruby_roda http://127.0.0.1:3000/ 0 18740 6098.28 1.13MB 375.19KB 784.14 305.62 2.28k 73.33% 2.56ms 3.68ms 34.64ms 85.97% 0.91ms 3.70ms 7.45ms 16.09ms
bin/server_ruby_roda http://127.0.0.1:3000/user 0 17809 5864.87 1.07MB 360.83KB 744.92 309.04 2.21k 72.92% 2.69ms 3.64ms 47.11ms 86.84% 1.10ms 3.95ms 7.39ms 15.84ms
bin/server_ruby_roda http://127.0.0.1:3000/user/0 0 15445 5022.63 1.36MB 451.25KB 646.34 116.34 1.02k 74.17% 3.10ms 3.51ms 34.21ms 86.05% 2.00ms 4.55ms 7.71ms 15.43ms
bin/server_ruby_sinatra http://127.0.0.1:3000/ 0 6785 2222.48 1.11MB 373.31KB 283.81 129.05 830.00 71.25% 7.05ms 10.10ms 94.58ms 86.75% 2.67ms 10.21ms 20.15ms 45.19ms
bin/server_ruby_sinatra http://127.0.0.1:3000/user 0 6105 2003.0 1.00MB 336.44KB 255.30 97.98 660.00 73.75% 7.83ms 10.76ms 160.95ms 87.26% 3.31ms 11.58ms 21.03ms 46.81ms
bin/server_ruby_sinatra http://127.0.0.1:3000/user/0 0 5760 1879.4 1.11MB 370.74KB 240.86 96.47 510.00 70.42% 8.31ms 10.74ms 98.79ms 85.99% 4.15ms 12.54ms 22.77ms 47.71ms
bin/server_rust_iron http://127.0.0.1:3000/ 0 1188777 387991.67 85.03MB 27.75MB 56.88k 38.75k 116.29k 44.29% 261.99us 287.77us 20.67ms 96.25% 221.00us 321.00us 411.00us 813.00us
bin/server_rust_iron http://127.0.0.1:3000/user 0 1134025 372690.66 81.11MB 26.66MB 63.33k 38.46k 118.83k 49.44% 259.09us 1.03ms 210.33ms 99.79% 210.00us 316.00us 400.00us 808.00us
bin/server_rust_iron http://127.0.0.1:3000/user/0 0 1057872 344931.0 132.16MB 43.09MB 44.31k 28.36k 105.51k 55.83% 326.64us 273.32us 27.81ms 91.85% 301.00us 419.00us 542.00us 1.05ms
bin/server_rust_nickel http://127.0.0.1:3000/ 0 1099128 360547.86 136.27MB 44.70MB 122.94k 16.13k 148.69k 54.44% 42.98us 64.37us 12.15ms 99.90% 39.00us 49.00us 70.00us 82.00us
bin/server_rust_nickel http://127.0.0.1:3000/user 0 920631 304764.17 114.14MB 37.78MB 154.26k 3.23k 159.85k 78.33% 47.97us 19.87us 6.96ms 84.82% 55.00us 60.00us 63.00us 73.00us
bin/server_rust_nickel http://127.0.0.1:3000/user/0 0 1067947 348500.72 168.05MB 54.84MB 119.31k 5.50k 129.25k 74.44% 48.61us 92.77us 12.04ms 99.91% 47.00us 55.00us 62.00us 77.00us
bin/server_rust_rocket http://127.0.0.1:3000/ 0 844189 277320.55 73.26MB 24.07MB 94.31k 45.87k 134.78k 66.67% 74.61us 104.76us 10.88ms 99.76% 65.00us 92.00us 123.00us 156.00us
bin/server_rust_rocket http://127.0.0.1:3000/user 0 1083600 358731.21 94.04MB 31.13MB 121.17k 8.30k 138.57k 61.11% 77.38us 265.66us 12.24ms 99.33% 56.00us 68.00us 80.00us 146.00us
bin/server_rust_rocket http://127.0.0.1:3000/user/0 0 754582 247564.57 116.58MB 38.25MB 126.60k 12.62k 148.16k 68.33% 80.26us 89.57us 12.12ms 89.14% 72.00us 100.00us 177.00us 219.00us

Rankings

Ranking by Average Requests per second:

  1. 680446 req/sec : bin/server_cpp_evhtp
  2. 518792 req/sec : bin/server_nim_mofuw
  3. 498347 req/sec : bin/server_go_fasthttprouter
  4. 368537 req/sec : bin/server_rust_iron
  5. 337937 req/sec : bin/server_rust_nickel
  6. 295340 req/sec : bin/server_go_iris
  7. 294538 req/sec : bin/server_rust_rocket
  8. 261883 req/sec : bin/server_go_echo
  9. 241104 req/sec : bin/server_go_gorilla_mux
  10. 196827 req/sec : bin/server_go_gin
  11. 195051 req/sec : bin/server_node_clusterpolka
  12. 183457 req/sec : bin/server_python_sanic
  13. 127754 req/sec : bin/server_elixir_plug
  14. 119078 req/sec : bin/server_elixir_phoenix
  15. 111798 req/sec : bin/server_csharp_aspnetcore
  16. 108408 req/sec : bin/server_python_japronto
  17. 100829 req/sec : bin/server_node_clusterexpress
  18. 65878 req/sec : bin/server_crystal_router_cr
  19. 65102 req/sec : bin/server_crystal_raze
  20. 52676 req/sec : bin/server_crystal_lucky
  21. 48605 req/sec : bin/server_crystal_kemal
  22. 31704 req/sec : bin/server_nim_jester
  23. 23244 req/sec : bin/server_node_polka
  24. 11100 req/sec : bin/server_node_express
  25. 5661 req/sec : bin/server_ruby_roda
  26. 4411 req/sec : bin/server_ruby_rack-routing
  27. 2034 req/sec : bin/server_ruby_sinatra
  28. 1783 req/sec : bin/server_python_flask.py
  29. 722 req/sec : bin/server_python_flask
  30. 594 req/sec : bin/server_ruby_rails
  31. 564 req/sec : bin/server_python_django

That lines up with what I’ve seen anecdotally.

Haha, you’d be better off using Elixir/Phoenix to benchmark wrk than using wrk to benchmark Elixir/Phoenix :wink:

Well actually I’m using wrk controlled via Elixir. ^.^
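Something like this, for the curious: shell out to wrk with `System.cmd/2` and scrape the summary line (a hypothetical sketch, not the actual stats.exs):

```elixir
defmodule WrkRunner do
  # Hypothetical sketch of driving wrk from Elixir: run the binary,
  # then pull the "Requests/sec:" figure out of its stdout.
  def run(url, opts \\ []) do
    duration = opts[:duration] || 3
    args = ["-t", "4", "-c", "1000", "-d", "#{duration}s", url]
    {output, 0} = System.cmd("wrk", args)
    requests_per_sec(output)
  end

  # Parse a line like "Requests/sec:  705937.67" into a float.
  def requests_per_sec(output) do
    [_, rps] = Regex.run(~r/Requests\/sec:\s+([\d.]+)/, output)
    String.to_float(rps)
  end
end
```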

Just a heads up, the new TechEmpower benchmarks are out. The Elixir code doesn’t look like it’s changed much at all. It’s still using Poison instead of Jason. No HiPE.

Should it be using HiPE? I don’t think it would help at all in a web server benchmark…

On the JSON benchmark it will if Jason is being used, just based on the numbers we’ve seen in here before.

A few of the benchmarks use JSON, so getting that optimized will have a cascading effect.

What about the performance of the web server itself?

No idea there. I haven’t seen any numbers specific to it.

This page ranks the web servers, so I’m curious about one thing…

How can Node.js achieve 997,327 req/s and rank 87, while Phoenix achieves 186,774 and ranks 199?

Something really strange…

They are running mix phx.server instead of building a release and running it as a binary.


And there are many more issues; there are a lot of threads in this forum that talk about the downsides of the TechEmpower benchmark, or of any other.

As I’m on mobile I have a hard time searching and linking, but I’m pretty sure someone else will do so soon.


You have to understand what you’re measuring. Here’s a screenshot of the same test, Fortunes, but only with JS and Elixir implementations. You’ll see Phoenix far behind a bunch of JS versions.

Now look at the very right of the picture. This column indicates the type of implementation. You can see the JS ones are “Micro” or “Platform”, while Phoenix is “Full”. You’re comparing a realistic full-framework implementation with a bunch of stripped-down, optimized versions.

No real optimization work has been done on that Phoenix implementation for years either, so there’s no doubt it could be made faster. Although with any change you’d have to think about whether you’re making the implementation less realistic, and then what’s the point? Better to do a stripped-down version with only Plug or something instead.

I’m not sure where the idea that releases are faster than running with mix comes from; AFAIK there are no additional optimization steps for releases, other than preloading modules (which doesn’t matter here, since there’s a warmup). I’d be happy to learn otherwise, though!

The documentation for mix releases has a good summary of the benefits of releases

https://hexdocs.pm/mix/Mix.Tasks.Release.html#module-why-releases


At least one optimization is described in the linked article, if I’m not mistaken:

Code preloading. […] There’s a downside. When you start a new server in production, it may need to load many other modules, causing the first requests to have an unusual spike in response time. Releases run in embedded mode, which loads all available modules upfront, guaranteeing your system is ready to handle requests after booting.
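That lazy loading is easy to observe from a plain `elixir` session. A small illustration using OTP’s `:zlib` (assuming nothing in the VM has touched it yet):

```elixir
# Interactive mode (plain `elixir` / `mix run`) loads modules on
# first use; a release booted in :embedded mode preloads them all,
# avoiding the first-request spike the docs describe.
loaded_before = :erlang.module_loaded(:zlib)

_ = :zlib.compress("hello")  # first use triggers loading :zlib

loaded_after = :erlang.module_loaded(:zlib)
IO.inspect({loaded_before, loaded_after})
```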

Also: V8 is highly optimized for execution performance and fast startup times. It’s actually not surprising that it can outperform the BEAM in certain benchmarks and under specific conditions.


Also OT: @NobbZ, your German autocorrect is leaking :smiley:

of the Rechenpower benchmark or about any other.


And this is even though I have English set as the primary language in Gboard…


I did mention pre-loading in my post: “other than preloading modules”, and also why it’s not relevant for the benchmark: “which doesn’t matter here since there’s a warmup”.

Yeah, absolutely. V8 will totally win a silly benchmark like Fortunes because it’s just IO and V8 is heavily optimized for that, while the BEAM focuses on other things (like p99 latency).