Centralized logging with console

My end goal is to have all logs be valid JSON and sent to stdout, but I seem to have run into a wall.

Initially I wanted to re-use the functionality that’s built into :console and just use my own formatter; this way I could ensure everything coming from Logger would be valid JSON, and :console would handle message buffering and sending to stdio.

I ended up coming up with this:


config :logger, :console,
  format: {Logger.Formatters.Json, :format}

defmodule Logger.Formatters.Json do
  alias Logger.{Utils}

  @doc "log formatter replacement. Injected through the :format key"
  def format(level, msg, timestamp, metadata) do
    %{
      level: level,
      time: format_time(timestamp),
      msg: (msg |> IO.iodata_to_binary)
    }
    |> Map.merge(Enum.into(metadata, %{}))
    |> Poison.encode!
    |> Kernel.<>("\n")
  end

  defp format_time({date, time}) do
    [Utils.format_date(date), Utils.format_time(time)]
    |> Enum.join(" ")
  end
end
> Logger.debug('test')
:ok
{"time":"2017-02-13 14:32:11.106","msg":"test","level":"debug"}

This was a good start, but now I want to log more information like status, action, controller, etc., so I set off to create a new Plug. The plug would gather this information from conn, but when it came to actually sending it to the Logger I ran into my first problem: logging more than just a string.

Currently the Logger handles some of the utility responsibilities before calling the formatter, which is really nice, but it prevents me from passing in anything besides a string for the message argument. Encoding the data structure myself could work, but it would make the output less readable and would require the log to be parsed twice.
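For example (a hypothetical sketch against the formatter above), pre-encoding the payload as the message string would leave escaped JSON nested inside msg, which a consumer would then have to decode a second time:

require Logger

# hypothetical: pre-encode the payload and pass it as the message string
Logger.info(Poison.encode!(%{status: 200, path: "/"}))

# the formatter above then encodes the whole map again, so "msg" carries escaped JSON:
# {"time":"2017-02-13 14:32:11.106","msg":"{\"status\":200,\"path\":\"/\"}","level":"info"}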

Instead of doing this I would prefer a relatively flat JSON structure, similar to how metadata is currently handled.
This actually led me to try putting status, method, etc. into the metadata, which worked, but it felt wrong and required two before_send hooks:

def call(conn, opts) do
  start_time = current_time()

  Conn.register_before_send(conn, fn conn ->
    stop_time = current_time()
    duration = System.convert_time_unit(stop_time - start_time, :native, :micro_seconds)

    stats = []
    |> Keyword.put(:method, conn.method)
    |> Keyword.put(:path, conn.request_path)
    |> Keyword.merge(formatted_phoenix_info(conn))
    |> Keyword.put(:status, conn.status)
    |> Keyword.put(:duration, (formatted_diff(duration) |> IO.iodata_to_binary))

    Logger.info(fn -> {"", stats} end)

    conn
  end)
end
{"time":"2017-02-13 16:28:47.568","status":200,"request_id":"u3iaoaiid9u3ddi877573bee7v2hdl5f","path":"/","msg":"META","method":"GET","level":"info","format":"html","duration":"26ms","controller":"Sample.PageController","action":"index"}

Ok, so now metadata is gathered and calling Logger directly works, but what about Ecto’s logger?
I can write my own Ecto logger as well, but I’m not really sure I’m going about this the right way.

Is there an easier way to do all of this?
Should I be using something like bleacherreport’s ecto_logger_json and plug_logger_json? Even then, I don’t know how direct Logger calls would be handled.

I’d also want to disable the default Phoenix Logger since I’d be gathering most of that information through the custom plug.
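As a rough sketch (assuming a standard generated endpoint like the one in the stack trace below), the stock request logging could be dropped by removing Plug.Logger from the endpoint and letting the custom plug take over:

# lib/sample/endpoint.ex (hypothetical edit)
# plug Plug.Logger   # removed: the custom stats plug logs the request instead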

Any input or feedback is greatly appreciated.


@josevalim and/or @fishcakez Do you have any feedback on how to accomplish the goal of always logging valid JSON?


I’ve gotten a little further: I ended up taking advantage of being able to pass metadata directly to the Logger using {msg, metadata}. I’m currently able to get valid JSON calling the Logger directly, through the plug, and from Ecto. The only weird thing is that, because metadata is filtered, my config is quite large.
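For reference, a sketch of that config with the :console backend, whitelisting just the metadata keys that appear in this thread (the :metadata option is why the list gets long):

config :logger, :console,
  format: {Logger.Formatters.Json, :format},
  metadata: [:request_id, :method, :path, :status, :duration, :format,
             :controller, :action, :query, :query_time, :decode_time,
             :queue_time, :db_duration]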

defmodule Logger.Formatters.Json do
  alias Logger.{Utils}

  @doc "log formatter replacement. Injected through the :format key"
  def format(level, msg, timestamp, metadata) do
    %{
      level: level,
      time: format_time(timestamp),
      msg: (msg |> IO.iodata_to_binary)
    }
    |> Map.merge(Enum.into(metadata, %{}))
    |> Poison.encode!
    |> Kernel.<>("\n")
  end

  defp format_time({date, time}) do
    [Utils.format_date(date), Utils.format_time(time)]
    |> Enum.join(" ")
  end
end

defmodule Plug.PhoenixStats do
  @moduledoc """
  A plug for adding phoenix metadata to Logger.metadata
  """

  require Logger
  alias Plug.Conn
  @behaviour Plug

  def init(opts) do
    Keyword.get(opts, :log, :info)
  end

  def call(conn, opts) do
    start_time = current_time()

    Conn.register_before_send(conn, fn conn ->
      stop_time = current_time()
      duration = System.convert_time_unit(stop_time - start_time, :native, :micro_seconds)

      stats = []
      |> Keyword.put(:method, conn.method)
      |> Keyword.put(:path, conn.request_path)
      |> Keyword.merge(formatted_phoenix_info(conn))
      |> Keyword.put(:status, conn.status)
      |> Keyword.put(:duration, format_time(duration))

      Logger.info(fn -> {"", stats} end)

      conn
    end)
  end

  def format_time(time), do: (time / 1000)

  defp formatted_phoenix_info(%{private: %{phoenix_format: format, phoenix_controller: controller, phoenix_action: action}}) do
    [
      {:format, format},
      {:controller, controller |> inspect},
      {:action, action |> Atom.to_string}
    ]
  end
  defp formatted_phoenix_info(_), do: []

  defp current_time, do: System.monotonic_time()
end
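
Presumably the plug then gets mounted in the endpoint, after Plug.RequestId so the request_id metadata is already set; a minimal sketch:

# lib/sample/endpoint.ex
plug Plug.RequestId
plug Plug.PhoenixStats, log: :info
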
defmodule Ecto.Logger.Json do
  require Logger

  def log(entry) do
    %{query_time: query_time, decode_time: decode_time, queue_time: queue_time, query: query} = entry

    [query_time, decode_time, queue_time] = [query_time, decode_time, queue_time]
                                            |> Enum.map(&format_time/1)

    stats = []
    |> Keyword.put(:decode_time, decode_time)
    |> Keyword.put(:db_duration, Float.round(query_time + decode_time + queue_time, 3))
    |> Keyword.put(:query, query)
    |> Keyword.put(:query_time, query_time)
    |> Keyword.put(:queue_time, queue_time)

    Logger.debug(fn -> {"", stats} end)

    entry
  end

  defp format_time(nil), do: 0.0
  defp format_time(time) do
    System.convert_time_unit(time, :native, :micro_seconds) / 1000
  end
end
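
For the Ecto side, this is presumably wired in through the repo’s :loggers option (Ecto 2.x), replacing the default Ecto.LogEntry logger; a sketch assuming the repo is Sample.Repo:

config :sample, Sample.Repo,
  loggers: [{Ecto.Logger.Json, :log, []}]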

This is still pretty rough, but here are some example logs:

{"time":"2017-02-16 18:04:57.325","request_id":"4nmik3se6tenbpg0kr5dkrop0d870hap","msg":"Processing by Sample.PageController.index/2\n  Parameters: %{}\n  Pipelines: [:browser]","level":"debug"}
{"time":"2017-02-16 18:04:57.325","request_id":"4nmik3se6tenbpg0kr5dkrop0d870hap","msg":"debug message","level":"debug"}
{"time":"2017-02-16 18:04:57.342","request_id":"4nmik3se6tenbpg0kr5dkrop0d870hap","queue_time":0.037,"query_time":1.917,"query":"SELECT n0.`id`, n0.`title`, n0.`text`, n0.`created_at`, n0.`inserted_at`, n0.`updated_at` FROM `notes` AS n0","msg":"","level":"debug","decode_time":0.033,"db_duration":1.987}
{"time":"2017-02-16 18:04:57.367","status":200,"request_id":"4nmik3se6tenbpg0kr5dkrop0d870hap","path":"/","msg":"","method":"GET","level":"info","format":"html","duration":58.839,"controller":"Sample.PageController","action":"index"}
{"time":"2017-02-16 18:05:32.059","msg":"#PID<0.369.0> running Sample.Endpoint terminated\nServer: localhost:4000 (http)\nRequest: GET /\n** (exit) an exception was raised:\n    ** (ArithmeticError) bad argument in arithmetic expression\n        (sample) web/controllers/page_controller.ex:7: Sample.PageController.index/2\n        (sample) web/controllers/page_controller.ex:1: Sample.PageController.action/2\n        (sample) web/controllers/page_controller.ex:1: Sample.PageController.phoenix_controller_pipeline/2\n        (sample) lib/sample/endpoint.ex:1: Sample.Endpoint.instrument/4\n        (sample) lib/phoenix/router.ex:261: Sample.Router.dispatch/2\n        (sample) web/router.ex:1: Sample.Router.do_call/2\n        (sample) lib/sample/endpoint.ex:1: Sample.Endpoint.phoenix_pipeline/1\n        (sample) lib/plug/debugger.ex:123: Sample.Endpoint.\"call (overridable 3)\"/2\n        (sample) lib/sample/endpoint.ex:1: Sample.Endpoint.call/2\n        (plug) lib/plug/adapters/cowboy/handler.ex:15: Plug.Adapters.Cowboy.Handler.upgrade/4\n        (cowboy) /Users/etoulson/source/sample/deps/cowboy/src/cowboy_protocol.erl:442: :cowboy_protocol.execute/4","level":"error"}

Sorry for bumping this post.
If anyone else is looking to do something similar, check out the logger I ended up creating.
We’ve been using it in production for a few months now, and while it’s not perfect, it gets the job done.