What are people using for equivalent of access logs when using Cowboy 2 and Phoenix 1.4 on their own?

Howdy,

In the past I’ve used nginx in front of Phoenix and taken advantage of nginx's relatively standard access-log setup. What are people using for access logs when you are just running Cowboy and Phoenix directly?

Hey, I’m currently using https://timber.io for processing logs from Phoenix. It has a lot of cool features, is quite cheap, and works quite well.

Some of the things I like about it:

  • user context tracking (you can tail a user)
  • searching works really well
  • their UI is really nice / simple

It’s saved me countless hours of meddling with log files because I can just log in somewhere and go back / search through my logs when problems happen.

1 Like

I was searching for a solution to this the other day and found https://github.com/mneudert/plug_accesslog which looks good at first glance. That said, I haven’t used it yet so not sure how well it works.
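If it helps anyone else looking: wiring it in looks roughly like this (the plug module name is from the repo; the options here are my reading of the README, so double-check them before relying on this):

```elixir
defmodule MyAppWeb.Endpoint do
  use Phoenix.Endpoint, otp_app: :my_app

  # Log each request in Apache "combined" format to a file.
  # :file and :format are option names as I understand them from
  # the plug_accesslog README — verify against the docs.
  plug Plug.AccessLog,
    file: "/var/log/my_app/access.log",
    format: :combined

  plug MyAppWeb.Router
end
```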

2 Likes

I send all information to the console, which since it’s hosted by systemd then goes to the syslog, in addition to some specific information to some specific log files in specific places for specific auditing purposes.

1 Like

Yeah, I’m using distillery's foreground command and systemd. Just getting the hang of journalctl options. Feels pretty low level which is why I was curious what others do.
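In case it shortens the journalctl learning curve for anyone, these are the invocations I reach for most (assuming your unit is called my_app.service):

```shell
# Tail the app's logs live, like `tail -f` on an access log
journalctl -u my_app.service -f

# Only warnings and above, since the last boot
journalctl -u my_app.service -p warning -b

# Everything from the last hour
journalctl -u my_app.service --since "1 hour ago"
```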

1 Like

Hi, I have been trying out timber.io for the last few days and I have noticed some logs just don’t appear! Have you ever noticed this happening?

Wanted to throw logflare.app on this thread :slight_smile: Built in Elixir … about to launch v1. Our logger backend is: https://github.com/Logflare/logflare_logger_backend

@slouchpie should be a drop-in replacement if you want to try it.

Not only do we have alerts, tailing, an easy query language, etc. in our UI but I use it to power this dashboard for our Elixir backend (which I use daily for monitoring): https://datastudio.google.com/s/gIAFsUMbDS4
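For anyone wanting to try it, setup is roughly this (the config key names are from my reading of the logflare_logger_backend README; the api key and source id here are placeholders you'd replace with your own):

```elixir
# config/prod.exs
import Config

config :logger,
  backends: [LogflareLogger.HttpBackend]

# Placeholder credentials — use your own from the Logflare dashboard.
config :logflare_logger_backend,
  api_key: "YOUR_LOGFLARE_API_KEY",
  source_id: "YOUR_SOURCE_UUID"
```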

3 Likes

Very nice! I signed up now and am looking at it. Any chance of a simple configuration to use as a Gigalixir sink? Something like timber.io’s URL would be cool.
EDIT: ignore that. I see the logger_backend lib allows for direct logging of tuples and such, which is very, very good. If anything I’m almost less inclined to use it, since I’ll be undoing all the to_string conversions sprinkled through my logging XD. But I’m joking; I think this is the way to go for me.
Thanks again!

1 Like

The Gigalixir sink would be useful for other stdout stuff probably. I’ll get something up around that.

Okay yeah you should just be able to do:

gigalixir drains:add "https://api.logflare.app/logs/json?api_key=YOUR_KEY&source=YOUR_SOURCE"

Assuming they’re JSON request bodies… this will work. (The URL is quoted so the shell doesn’t treat the & as a command separator.)

2 Likes

Thanks very much! To confirm: would I have to log with, e.g., Logger.info(Jason.encode!(something))? I’ll try it later today.
EDIT: I see this is the “generic workbook” option.
EDIT: The problem with only accepting JSON encoding is that some libs log things that aren’t JSON-encoded. Those just don’t appear in Logflare.

If you are brave enough you can check out 1.11 structured logging with Erlang’s logger instead of encoding JSON message directly into log message.
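Roughly, on 1.11+ a map logs as a structured report instead of a plain string (a sketch assuming Elixir 1.11; the map keys here are made up for illustration):

```elixir
require Logger

# Elixir 1.11+ passes a map through to Erlang's :logger as a
# structured report, so backends can index the fields directly
# instead of parsing JSON out of the message string.
Logger.info(%{event: "something_happened", detail: "user signed in"})

# The same report via Erlang's :logger directly:
:logger.info(%{event: "something_happened", detail: "user signed in"})
```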

Nope. Just log it normally and the backend will handle the rest. We actually use binary encoded Erlang terms over the wire :slight_smile:

Are you actually seeing things that don’t appear in Logflare? Can you give me an example?

Edit: this should not be the case

Yeah we’re looking at redoing this to be based on Erlang’s logger now that things have converged. But you can do all that currently in Elixir like:

bq_stats = :hackney_pool.get_stats(Client.BigQuery)

Logger.info("Hackney BigQuery stats!", hackney_stats: bq_stats)

Which gets you this event shape: [screenshot of the structured event omitted]

And lets you do things like: [screenshot of charting those stats omitted]

Nothing was showing up for me when I added the gigalixir sink as described. I will try again later and report back.
EDIT: Yup I tried again and I have 2 gigalixir sinks - a logflare one and a papertrail one. The papertrail app shows the logs, the logflare one doesn’t. Not sure what I’m doing wrong tbh. I’ll keep poking around.
EDIT: I’m logging things like this:

Logger.info(fn ->
  event = %{something_happened: %{detail: detail}}
  message = "Something happened with details #{detail}"
  {message, event: event}
end)

Is there some reason that wouldn’t work?

Okay I’ll try and set one up and see. Probably not you!

Do it like:

event = %{something_happened: %{detail: detail}}

Logger.info("Something happened with details #{detail}", event)

Hmm, it doesn’t like your way. If you want to try something in iex use LogflareLogger.info like:

iex(3)> detail = "blah"
"blah"
iex(4)> (LogflareLogger.info(fn ->
...(4)>       event = %{something_happened: %{detail: detail}}
...(4)>       message = "Something happened with details #{detail}"
...(4)>       {message, event: event}
...(4)>     end))
%{
  count: 1,
  events: [
    %{
      "message" => "LogflareLogger formatter error: %Protocol.UndefinedError{description: \"\", protocol: String.Chars, value: #Function<45.97283095/0 in :erl_eval.expr/5>}",
      "metadata" => %{"level" => "error"},
      "timestamp" => "2020-08-06T22:29:06.953844+00:00"
    }
  ]
}

You should still see those log messages though.

I was not seeing any formatter errors. Thanks for all this great info. I really want to use logflare.app because of this kind of community engagement. I will endeavour fully now to make it work.

You are super cool for helping me with this and giving me knowledge. Many thanks.

So you are one of the 3 people who use Logger functions with metadata and made my life miserable when I was rewriting the logger… /s

3 Likes

I only do it to keep you miserable, I promise.

1 Like

I have changed my logging to use the Logger.info(message, event) syntax.

I have tried using https://github.com/Logflare/logflare_logger_backend in :dev env and the logs appear as I expect.

So I now just accept that gigalixir sinks don’t play nice with logflare at the moment.

I still want my logs to be in gigalixir logs and logflare so I just have to set up 2 backends.

Since I am also using FlexLogger to filter logs (because I like to be as cheap as possible), I have settled on this config:

config :logger,
  backends: [{FlexLogger, :console_logger}, {FlexLogger, :logflare_logger}]

log_cfg = [
  [module: MyApp.Schema.Middleware.Logging, level: :info],
  [module: MyApp.SessionController, level: :info],
  # ...etc
]

config :logger, :console_logger,
  logger: :console,
  default_level: :warn,
  level_config: log_cfg

config :logger, :logflare_logger,
  logger: LogflareLogger.HttpBackend,
  default_level: :warn,
  level_config: log_cfg

So my Phoenix app is writing logs to the Gigalixir console and to the Logflare HTTP backend. I am not using a Gigalixir sink at all now. I am happy with this setup.

Thanks again for all your help with this.

EDIT: It works super nicely! The structured logging is [chef-making-ok-symbol-with-hand-and-kissing-the-air emoji].

EDIT: The only downside is that I don’t see gigalixir errors in logflare with this particular config so I guess it still doesn’t fully satisfy my needs.

2 Likes