Which is the fastest web framework? (Link/Repo and results in this Topic)

The graphic on the GitHub page makes Elixir look bad?

When are they going to fix it?

That is because they tested in dev mode, and yeah, they need to fix it.


Maybe PR a cowboy server too if they want something really low-level.

If anyone actually wants to PR a cowboy version, I think it would probably look something like this:

mix.exs

defmodule MyCowboy.Mixfile do
  use Mix.Project

  def project do
    [app: :my_cowboy,
     version: "0.1.0",
     elixir: "~> 1.4",
     build_embedded: Mix.env == :prod,
     start_permanent: Mix.env == :prod,
     deps: deps()]
  end

  def application do
    [mod: {MyCowboy.Application, []}]
  end

  defp deps do
    [{:cowboy, github: "ninenines/cowboy", tag: "2.0.0-pre.9"}]
  end
end

lib/my_cowboy/application.ex

defmodule MyCowboy.Application do
  @moduledoc false

  use Application

  def start(_type, _args) do
    # Route every host and every path to a single handler module.
    dispatch = :cowboy_router.compile([{:_, [{:_, MyCowboy.Handler, []}]}])
    # Plain HTTP listener with 100 acceptors on port 3000 (cowboy 2.0 pre-release API).
    {:ok, _} = :cowboy.start_clear(:http, 100, [port: 3000], %{env: %{dispatch: dispatch}})
  end
end

lib/my_cowboy/handler.ex

defmodule MyCowboy.Handler do
  # Bare cowboy 2.x handler: init/2 gets the request map and must
  # return {:ok, req, state}.

  def init(%{method: method, path: path} = req, opts) do
    handle(method, split_path(path), req, opts)
  end

  # GET /
  defp handle("GET", [], req, opts) do
    {:ok, :cowboy_req.reply(200, %{}, "", req), opts}
  end
  # GET /user/:id - echo the id back in the body
  defp handle("GET", ["user", id], req, opts) do
    {:ok, :cowboy_req.reply(200, %{}, id, req), opts}
  end
  # POST /user
  defp handle("POST", ["user"], req, opts) do
    {:ok, :cowboy_req.reply(200, %{}, "", req), opts}
  end

  # Split "/user/42" into ["user", "42"], dropping empty segments.
  defp split_path(path) do
    segments = :binary.split(path, "/", [:global])
    for segment <- segments, segment != "", do: segment
  end
end

Couldn’t benchmark it with his client.cr though.

1 Like

And here is the elli version:

mix.exs

defmodule MyElli.Mixfile do
  use Mix.Project

  def project do
    [app: :my_elli,
     version: "0.1.0",
     elixir: "~> 1.4",
     build_embedded: Mix.env == :prod,
     start_permanent: Mix.env == :prod,
     deps: deps()]
  end

  def application do
    [mod: {MyElli.Application, []}]
  end

  defp deps do
    [{:elli, github: "elli-lib/elli", tag: "2.0.1"}]
  end
end

lib/my_elli/application.ex

defmodule MyElli.Application do
  @moduledoc false

  use Application

  def start(_type, _args) do
    # Single elli listener on port 3000; requests are handled by MyElli.Callback.
    {:ok, _} = :elli.start_link(callback: MyElli.Callback, port: 3000)
  end
end

lib/my_elli/callback.ex

defmodule MyElli.Callback do
  @behaviour :elli_handler

  # Dispatch on the request method and the already-split path segments.
  def handle(req, _args) do
    do_handle(:elli_request.method(req), :elli_request.path(req))
  end

  # {:ok, body} is elli shorthand for a 200 response.
  defp do_handle(:GET, []), do: {:ok, ""}
  defp do_handle(:GET, ["user", id]), do: {:ok, id}
  defp do_handle(:POST, ["user"]), do: {:ok, ""}

  # Required callback; we ignore elli's runtime events here.
  def handle_event(_event, _data, _args), do: :ok
end
1 Like

Testing with wrk, elli handles about 40% more GET "/", GET "/user/:id", and POST "/user" requests than cowboy.
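For comparison, a bare Plug.Router covering the same three routes would look roughly like this (just a sketch for reference; the benchmark repo's actual Plug app may differ, and MyPlug.Router is a made-up name):

defmodule MyPlug.Router do
  use Plug.Router

  # :match finds a route, :dispatch runs its body.
  plug :match
  plug :dispatch

  get "/" do
    send_resp(conn, 200, "")
  end

  get "/user/:id" do
    send_resp(conn, 200, id)
  end

  post "/user" do
    send_resp(conn, 200, "")
  end

  match _ do
    send_resp(conn, 404, "")
  end
end

It would be started from an application callback with something like Plug.Adapters.Cowboy.http(MyPlug.Router, [], port: 3000) (with plug and cowboy 1.x as deps).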

FWIW, you can configure PHP to skip the filesystem check and use a deploy-style approach.

For all BEAM-based solutions, I would recommend playing with VM arguments. In particular, setting +K true to enable kernel poll and increasing the number of async threads (e.g. +A 100) might produce a significant difference.

Also, the multi-pollset support that will hopefully be merged into OTP 20 (not available in RC1 yet) should speed things up a bit as well.

7 Likes

yeah, in this context one would do that tuning here:

makefile:

plug:
	cd elixir/plug; mix deps.get --force; MIX_ENV=prod mix release --erl="+K false +A 10" --no-tar
	ln -s -f ../elixir/plug/bin/server_elixir_plug bin/.

and then ‘make plug’ ("+K false +A 10" are the defaults)

Tested it yesterday, but couldn’t get any consistent, significant improvement compared to the defaults; YMMV.

Multi-pollset does look exciting; we should revisit these benchmarks when OTP 20 RC2 is out.

1 Like

I actually did that in a test after my post, but it gained less than 1% in speed, so I figured it was not worth posting. ^.^

1 Like

Haven’t checked it in years, but I’d bet they put a TTL on the checks at some point to avoid the overhead.

1 Like

Your approach of passing arguments to mix release only affects the VM spawned to build the release - not the actual production release. You need to use the vm.args file when working with distillery, as described here.

For example:

# in rel/config.exs
environment :prod do
  # ...
  set vm_args: "rel/vm.args"
end
# in rel/vm.args (we can use eex)
-name <%= release_name %>@127.0.0.1
+A 100
+K true
4 Likes

Thanks, but I’m not sure that is correct - it does affect the release VM.

doing a MIX_ENV=prod mix release --erl="+K false +A 1000" --no-tar

makes the erl_opts show up in the "$RELEASES_DIR/$REL_VSN/$REL_NAME.sh" script, which is used to start the release VM.

# Options passed to erl
ERL_OPTS="${ERL_OPTS:-+K false +A 1000}"
# Environment variables for run_erl
RUN_ERL_ENV="${RUN_ERL_ENV:-+K false +A 1000}"

Also, this +A 1000 release consistently yields much worse benchmark results - so it very much seems the erl_opts are in effect.
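If in doubt, you can also confirm from a console attached to the running release; :erlang.system_info/1 exposes both settings:

# kernel poll corresponds to +K, async thread pool size to +A
:erlang.system_info(:kernel_poll)      # => true | false
:erlang.system_info(:thread_pool_size) # => e.g. 10 or 1000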

https://hexdocs.pm/distillery/Mix.Tasks.Release.html#module-examples

# Pass args to erlexec when running the release
mix release --erl="-env TZ UTC"
4 Likes

Oh, I had no idea distillery does that - you’re right, I’m sorry. It doesn’t seem to be documented anywhere, though.

2 Likes

I love this community. Not only did y’all improve the reporting of benchmarks for Phoenix and Plug, but y’all also improved and added benchmarks for other frameworks that aren’t even Elixir-related.

12 Likes

You can’t know how well Elixir performs if the other frameworks aren’t performing up to par as well. ^.^

4 Likes

Looking at the project repos, why is the Plug version using Elixir 1.4 and the Phoenix version using Elixir 1.2? Or is that going to be updated when Phoenix 1.3 comes out?

2 Likes

They use the same Elixir version.

elixir: "~> 1.2" and elixir: "~> 1.4" are “approximately greater than” (pessimistic) version requirements, and in this case running Elixir 1.4, 1.5, 1.6, 1.7, 1.8, or 1.9 would satisfy both of them.
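If you want to verify that, Version.match?/2 in Elixir evaluates these requirements directly; for example, in iex (version numbers here are just examples):

Version.match?("1.4.5", "~> 1.2") # => true  (>= 1.2.0 and < 2.0.0)
Version.match?("1.4.5", "~> 1.4") # => true  (>= 1.4.0 and < 2.0.0)
Version.match?("1.2.0", "~> 1.4") # => false (below the required minor)
Version.match?("2.0.0", "~> 1.4") # => false (a major bump is excluded)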

Just didn’t know what environment the suite was running in or see where it was defined.

2 Likes

At https://github.com/tbrand/which_is_the_fastest there’s a comparison of web frameworks.

It can give you a high-level idea. And that idea is that Phoenix and Plug are slow compared to, for instance, Ruby and its framework Roda: Ruby/Roda is faster, or at least roughly on par with Phoenix and Plug.

I thought Phoenix and Plug would handily beat Ruby and Python by several times, but that’s not the case.

Why are Phoenix and Plug so slow?

1 Like

This benchmark does not benchmark anything useful. There is no business logic involved, and if you need three simple routes that do virtually nothing, I see no point in using any framework at all.

6 Likes