Concurrency in modern programming languages: Rust vs Go vs Java vs Node.js vs Deno

This popped up in my Google news feed last night…

Concurrency in modern programming languages: Rust vs Go vs Java vs Node.js vs Deno

The first thing that struck me about the article – given the title – is that Erlang and/or Elixir (I would think Elixir should be considered a “modern programming language”) weren’t even mentioned, much less considered for a benchmark test.

Nevertheless, after reading through the articles (7 in all), I decided to tinker just for fun. Even though the benchmarks seem meaningless (as pointed out in one or more of the comments), I wondered how well Elixir would perform, so I created a simple Elixir app using only Cowboy w/Plug to serve an HTML file like the other examples in the benchmark tests.

mix new plug_test --sup

# lib/plug_test.ex
defmodule PlugTest do
  import Plug.Conn

  def init(options), do: options

  def call(conn, _opts) do
    # simulate slow work on every request (see the note below the listings)
    Process.sleep(200)

    conn
    |> put_resp_header("connection", "keep-alive")
    |> put_resp_content_type("text/html; charset=utf-8")
    |> send_file(200, "hello.html")
  end
end

# hello.html
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8">
    <title>Hello!</title>
  </head>
  <body>
    <h1>Hello!</h1>
    <p>Hi from Elixir</p>
  </body>
</html>
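
(Not shown above is the bit that actually starts Cowboy. It’s roughly the following – a minimal sketch, with the plug_cowboy version simply being whatever 2.x release was current at the time.)

# mix.exs – the only dependency needed
defp deps do
  [
    {:plug_cowboy, "~> 2.0"}
  ]
end

# lib/plug_test/application.ex (generated by --sup, with one child added)
defmodule PlugTest.Application do
  use Application

  @impl true
  def start(_type, _args) do
    children = [
      # hand every request on port 4000 to the PlugTest plug above
      {Plug.Cowboy, scheme: :http, plug: PlugTest, options: [port: 4000]}
    ]

    Supervisor.start_link(children, strategy: :one_for_one, name: PlugTest.Supervisor)
  end
end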

Notice the Process.sleep(200) in call/2 above. In the code the author wrote for Rust, Go, etc., he introduces a 2-second sleep on every tenth request. I didn’t see a clean way to do that in a bare Plug off the top of my head (it would need some shared request counter – see the rough sketch below), so I just decided to sleep each request process for 200 ms instead.
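
(For the curious, something along these lines would probably approximate the article’s every-tenth-request delay – purely a hypothetical sketch using a shared counter, not what I benchmarked below, and PlugTest.EveryTenth is a made-up name.)

# hypothetical: lib/plug_test/every_tenth.ex
defmodule PlugTest.EveryTenth do
  import Plug.Conn

  def init(_opts) do
    # one shared counter, created once when Plug.Cowboy initialises the plug
    counter = :counters.new(1, [])
    :persistent_term.put(__MODULE__, counter)
    []
  end

  def call(conn, _opts) do
    counter = :persistent_term.get(__MODULE__)
    :counters.add(counter, 1, 1)

    # every tenth request sleeps for 2 seconds, like the article's examples
    if rem(:counters.get(counter, 1), 10) == 0 do
      Process.sleep(2_000)
    end

    conn
    |> put_resp_content_type("text/html; charset=utf-8")
    |> send_file(200, "hello.html")
  end
end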

Here’s what I got after 2 back-to-back runs of ab -c 100 -n 10000 http://localhost:4000/

Server Software:        Cowboy
Server Hostname:        localhost
Server Port:            4000

Document Path:          /
Document Length:        178 bytes

Concurrency Level:      100
Time taken for tests:   20.483 seconds
Complete requests:      10000
Failed requests:        0
Total transferred:      4090000 bytes
HTML transferred:       1780000 bytes
Requests per second:    488.20 [#/sec] (mean)
Time per request:       204.833 [ms] (mean)
Time per request:       2.048 [ms] (mean, across all concurrent requests)
Transfer rate:          194.99 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.5      0       5
Processing:   200  202   1.1    202     211
Waiting:      200  201   1.0    201     210
Total:        201  202   1.5    202     215

Percentage of the requests served within a certain time (ms)
  50%    202
  66%    202
  75%    203
  80%    203
  90%    204
  95%    205
  98%    206
  99%    209
 100%    215 (longest request)

Server Software:        Cowboy
Server Hostname:        localhost
Server Port:            4000

Document Path:          /
Document Length:        178 bytes

Concurrency Level:      100
Time taken for tests:   20.490 seconds
Complete requests:      10000
Failed requests:        0
Total transferred:      4090000 bytes
HTML transferred:       1780000 bytes
Requests per second:    488.04 [#/sec] (mean)
Time per request:       204.901 [ms] (mean)
Time per request:       2.049 [ms] (mean, across all concurrent requests)
Transfer rate:          194.93 [Kbytes/sec] received

Connection Times (ms)
              min  mean[+/-sd] median   max
Connect:        0    0   0.4      0       4
Processing:   200  202   1.1    202     211
Waiting:      200  201   0.9    201     211
Total:        201  202   1.3    202     212

Percentage of the requests served within a certain time (ms)
  50%    202
  66%    202
  75%    203
  80%    203
  90%    204
  95%    204
  98%    206
  99%    208
 100%    212 (longest request)
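
(As a rough sanity check: 10,000 requests at a concurrency of 100, each sleeping for 200 ms, can’t complete in less than (10,000 / 100) × 0.2 s = 20 s, i.e. a ceiling of 500 requests/second. So ~20.5 s and ~488 req/s means the BEAM is adding only a couple of milliseconds of overhead per request on top of the sleep.)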

I ran the above benchmarks on my workstation PC with the following specs:

Ubuntu Linux 20.04.1
Intel(R) Core™ i7-4770K CPU @ 3.50GHz (4 cores / 8 threads) & 16GB of memory

Erlang: 24.2.1
Elixir: 1.13.2-otp-24

I know the article is almost pointless, but what are your thoughts?

Stuff like this seems to pop up all the time, as does the flurry of articles with titles like The Top 10 Backend Programming Languages You Should Know In 2022 that show up near the beginning of the year, almost all of which hardly ever mention Elixir. :frowning:

2 Likes

If someone is trying to serve a static hello.html and benchmark it, they should also include nginx and Apache in it :smile:.

The real fun is in the comments section - don’t miss it.

All you demonstrated is that thread sleep works in all languages, and that the overhead is significantly smaller than 200 ms. You did not really benchmark the various language/server combos.

When you add a 2-second delay every 10 requests, you make the comparison totally meaningless. You are measuring delays, not the code.

Reading the file in every loop measures reading from disk, not actual program performance.

The author’s response: comment1, comment2.


Most performance and benchmark articles are pointless. I skim through these kinds of articles and enjoy the comments section.

5 Likes

Yeah. That’s exactly what I do 99% of the time as well. It’s just that this one was about concurrency, and when I saw how it was being benchmarked, I was mostly curious to see how well the BEAM’s Process.sleep/1 stacked up against all the others! :laughing:


I don’t have an account there, or I would have left a comment about Erlang/Elixir with a link to the video of Saša’s awesome talk, The Soul of Erlang and Elixir.

3 Likes

Check out the code for Process.sleep/1! It’s pretty instructive.
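
(For anyone who hasn’t looked – paraphrasing from memory rather than quoting the source verbatim – Process.sleep/1 boils down to a receive with no clauses and only an after timeout, so the “sleeping” process is just parked by the scheduler instead of blocking an OS thread:)

# roughly what Process.sleep/1 looks like (paraphrased, not the verbatim source)
def sleep(timeout)
    when is_integer(timeout) and timeout >= 0
    when timeout == :infinity do
  receive after: (timeout -> :ok)
end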

2 Likes

The author updated the article - he has overwritten the original images, so some of the context for the comments is lost. He should have created an 8th article in the series with the new changes.

I see someone named Patrice Gauthier finally mentioned Elixir and the BEAM in this comment.

If this person is a member of the Elixir Forum - Thank you! :clap:

1 Like

HAHAHA :003: That really made me laugh for a solid half a minute.

I absolutely loved how defensive the author became in the comments: “I wanted to write something like this for a long time and it started dragging on so I just put something out the door”, which is a super weird statement! It’s not like he was a journalist on a deadline, right? I guess a lot of people are addicted to getting attention on the net, I don’t know.

And he says that about a topic where a ton of people would love more benchmarks, were curious, and were of course bound to rip it apart. Not sure what he was expecting. :man_facepalming:

1 Like