HTTP client libraries and wrappers



Here is a list of HTTP client libraries/wrappers, along with some thoughts on HTTP clients in general. I’d like to hear from others how they work with HTTP…

HTTP client libraries

HTTP Client Wrappers

Related links

Thoughts & Questions

  • Although the HTTP spec says header names are case-insensitive, HTTP client libraries should not automatically downcase them (especially for outgoing requests), since some applications require specific casing :frowning:
  • Headers, form data, and query strings should be lists, not maps, to preserve order (both the order of keys and the order of values)
  • For performance, what about using a NIF to parse headers? For example, puma (an app server in Ruby) uses C for this; such a parser could be shared across client and server.
  • How should HTTP libraries handle HTTP version upgrades?
  • Should HTTP libraries be purely functional (zero side effects), or lean more on global state (connection pools, persistent connections, etc.)?
    • For example, Tesla allows creating a new client on the fly, so it “builds” a client without any “global” configuration; however, the underlying HTTP client library (application) it uses may still maintain state internally.

HTTP client libraries - need better one in standard library?
Your thoughts on advancing the Elixir ecosystem
Importance of Documentation

In the meta-sense of the question, yeah, I’m not loving the fragmentation in HTTP client libraries; however, it’s not a huge detriment to my day-to-day experience. I’d need to see some specific use cases to get behind your arguments about case and ordering, but it’s not a big deal, because none of the things you’re advocating would work against any use case I’ve dealt with.

In the end, are you expecting someone else to write this ultra http client library, or are you trying to determine whether to write it yourself?


So far I’m happy with Tesla (with httpc and hackney) for small traffic. However, going forward it would be nice to be able to use one library for HTTP/2 and beyond, so that’s my part of the research.

For ordering - some services require a signature over the headers, so their order must be preserved.
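A small sketch of why a list beats a map here. The header names and values below are made up, and the canonicalization scheme is only illustrative of signature schemes (in the AWS-SigV4 style) that hash headers in a defined order:

```elixir
# A list of {name, value} tuples preserves insertion order and duplicate
# names, both of which a map would lose or leave undefined.
headers = [
  {"X-Date", "20180101T000000Z"},
  {"X-Signed-Headers", "x-date;host"},
  {"Set-Cookie", "a=1"},
  {"Set-Cookie", "b=2"}
]

# A signing scheme typically hashes a canonical string built in header order:
canonical =
  headers
  |> Enum.map(fn {name, value} -> String.downcase(name) <> ":" <> value end)
  |> Enum.join("\n")
```

With a map (`%{"Set-Cookie" => ...}`), the second `Set-Cookie` would overwrite the first, and iteration order would no longer match insertion order, breaking the signature.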


I was wondering if there was a library that leveraged libcurl, and there is! It’s actively maintained too. I’m going to give it a try later to see if it works.


Why not do some code generation and make use of Elixir’s pattern matching, like so?

  def header_pair("Content-Encoding" <> rest), do: {"Content-Encoding", header_value(rest)}
  def header_pair("If-None-Match" <> rest),    do: {"If-None-Match",    header_value(rest)}
  # ^ these can be generated with a macro

  def header_value(val) do
    val
    |> String.split(~r/\s*:\s*/, trim: true)
    |> hd()
  end
I am sure C code will be faster, however crossing the VM <-> native barrier has an overhead as well. I have not measured the native approach but I would turn the question back to you: have you established with certainty that Elixir HTTP header parsing is a bottleneck in your workflow?

(EDIT: on second thought, this is not practical if we want to accept all possible case combinations for the headers.)

BTW, awesome job compiling the list! :023: :041:


Good news and bad news for Katipo.

Bad news: It depends on a metrics library that is, as far as I can tell, out of date. It uses merl to dynamically build a metrics module and it’s not getting through erl_lint on OTP 21.

Good news: The metrics module can easily be faked so that there are no problems making requests.

Here’s the fake module:

defmodule :metrics_mod do
  def new(_name, _type, _config), do: :ok
  def update(_name, _probe, _config), do: :ok
  def update_or_create(_, _, _), do: :ok
  def update_or_create(_name, _probe, _type, _config), do: :ok
  def delete(_name, _config), do: :ok
end


I haven’t worked with NIFs, and I’m not arguing it’s a bottleneck.

However, since header parsing may be called so many times, even a small improvement could be significant (as with JSON) in some cases. I guess it depends on the boundary at which the NIF is used; for example, calling a NIF to parse each HTTP header individually may not be worth it. However, we could leverage a NIF that takes a stream (or string) of an HTTP response and returns the whole parsed result - maybe less memory copying?
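Before reaching for a custom NIF, note that the VM already ships a natively implemented HTTP packet parser, :erlang.decode_packet/3, which parses a whole status line or header per call without any user-written native code. A minimal sketch (the response bytes are made up):

```elixir
response = "HTTP/1.1 200 OK\r\nContent-Type: text/plain\r\n\r\nhello"

# Parse the status line; the parser itself runs in native code inside the BEAM.
{:ok, {:http_response, {1, 1}, 200, "OK"}, rest} =
  :erlang.decode_packet(:http_bin, response, [])

# Parse one header; well-known header names come back normalized as atoms
# like :"Content-Type" (which ties into the case-preservation concern above).
{:ok, {:http_header, _, :"Content-Type", _, "text/plain"}, rest} =
  :erlang.decode_packet(:httph_bin, rest, [])

# An empty line terminates the header block; the remainder is the body.
{:ok, :http_eoh, body} = :erlang.decode_packet(:httph_bin, rest, [])
```

This is also how several Erlang servers avoid copying: the binary stays on the VM side the whole time.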


From ElixirConf 2018 keynote - see


Question here: in your opinion, should HTTP client libs optionally deal with response caching, i.e. handle the cache-control, pragma, expires, etc. headers and cache responses automatically? Or is that better done by the application?
There’s, for example, that HTTPoison issue from 2 years ago. Not sure if it’d be a sane approach.


All libs should have “minimal” default behavior. I don’t think HTTP caching or auto-retry for idempotent requests should be turned on by default.

A good library is extensible (like Elixir), not full-featured. That’s why Tesla is my current choice of HTTP wrapper.
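To illustrate the extensibility point: with Tesla, each behavior is a middleware the caller opts into explicitly, and nothing is enabled by default. A minimal sketch, assuming Tesla 1.x middleware names; the module name and base URL are made up:

```elixir
defmodule MyApp.HTTP do
  use Tesla

  # Every behavior below is opt-in; leave a line out and it simply isn't there.
  plug Tesla.Middleware.BaseUrl, "https://api.example.com"
  plug Tesla.Middleware.JSON
  plug Tesla.Middleware.Retry, delay: 200, max_retries: 3
end
```

A caching middleware would slot in the same way, keeping the default client minimal.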

Note that httpc does not verify certificates by default.


Neither do ibrowse, HTTPotion, gun, Tesla, and SimpleHttp. And neither do hackney and HTTPoison if you pass any custom SSL options (e.g. to select a custom CA, or to suppress log messages with log_alert: false) without also passing verify: :verify_peer along with the right verify_fun.
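For reference, a hedged sketch of opting into peer verification with httpc. The CA bundle path is an assumption for a typical Debian/Ubuntu system; adjust it for your platform, or use a package such as certifi to locate a bundle portably:

```elixir
:inets.start()
:ssl.start()

ssl_opts = [
  verify: :verify_peer,
  # Assumption: Debian/Ubuntu-style CA bundle location.
  cacertfile: '/etc/ssl/certs/ca-certificates.crt',
  depth: 3
]

# With these options, httpc rejects untrusted or mismatched certificates
# instead of silently accepting them.
{:ok, {{_, _status, _}, _headers, _body}} =
  :httpc.request(:get, {'https://example.com/', []}, [ssl: ssl_opts], [])
```

For hackney-based clients on older OTP releases, hostname verification additionally needs the right verify_fun (e.g. from ssl_verify_fun), as noted above.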