Best ways to process Phoenix web server logs in production

Hi, can someone guide me on the best ways to process and view the logs produced by a Phoenix server in production?

I build mix releases and use edeliver to deploy them to our production AWS instances.

Things that I have seen so far…

  • By default, when running a release, the logs are generated and kept in a folder somewhere inside the release directory (I don’t remember the path).
    I think they are some sort of logs generated by Erlang itself.
    I saw about five log files, and output is written to one of them at a time (there seems to be some sort of log rotation).

  • Logs are also written to syslog.

  • If we put Nginx in front, we also get Nginx logs; however, they are not as detailed as the logs produced directly by the Phoenix server.

Now, I recently read about the ELK (Elasticsearch, Logstash, Kibana) stack, which is very popular for analyzing logs and does many other cool things, and I am very interested in setting it up for Phoenix.

However, I think that for this to work, Logstash needs to parse the logs, which it does using grok patterns, I believe.
Can anyone point me to how to configure Logstash to parse Phoenix server logs?
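
As far as I understand, the lines Logstash would have to match come from Logger’s console format string. I believe the default is `$time $metadata[$level] $message`, and Phoenix-generated apps add `:request_id` to the metadata, so pinning the format down explicitly in config should at least make the pattern predictable (this is just my reading of the docs):

```elixir
# config/prod.exs — pin the console log format so the lines are
# predictable to parse (this mirrors what I understand the defaults to be)
import Config

config :logger, :console,
  format: "$time $metadata[$level] $message\n",
  metadata: [:request_id]
```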

At first, my idea was to use Logstash to parse all the logs produced by Erlang and kept in the folder I talked about earlier. I would use a Logstash configuration with some wildcard like /logs/* to load all the log files in that folder, so no matter which log file the logs are written to, Logstash would catch them and send them to Elasticsearch. A rough sketch of what I have in mind is below.
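
Something like the following pipeline is what I am imagining — completely untested, the path and index name are made up, and the grok pattern is only my guess at matching the `$time $metadata[$level] $message` format mentioned above:

```conf
# logstash.conf — untested sketch; /home/deploy/app/log/* is a made-up path
input {
  file {
    path           => "/home/deploy/app/log/*"   # wildcard over the log folder
    start_position => "beginning"
  }
}

filter {
  grok {
    # Guessing at lines like: 14:23:05.123 request_id=abc123 [info] GET /
    match => { "message" => "%{TIME:timestamp} %{DATA:metadata}\[%{LOGLEVEL:level}\] %{GREEDYDATA:msg}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "phoenix-logs-%{+YYYY.MM.dd}"
  }
}
```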

Later, I also found this, which looks very interesting.

I don’t have much experience with loggers; what exactly are custom logger backends?
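
From skimming the Logger docs, my impression is that a backend is just a :gen_event handler that Logger hands every log event to, so in theory one could ship events straight to Elasticsearch without any files in between. Here is my (untested) attempt at the shape of one, with a made-up module name; please correct me if I have this wrong:

```elixir
defmodule MyApp.LoggerBackend do
  # Made-up module name. A Logger backend is a :gen_event handler that
  # receives every log event once it is registered via
  #   config :logger, backends: [:console, MyApp.LoggerBackend]
  @behaviour :gen_event

  @impl true
  def init(_args), do: {:ok, %{}}

  @impl true
  def handle_event({level, _group_leader, {Logger, message, _timestamp, _metadata}}, state) do
    # A real backend would ship the event to Logstash/Elasticsearch here;
    # printing to stdout is just for illustration.
    IO.puts("[#{level}] #{message}")
    {:ok, state}
  end

  def handle_event(:flush, state), do: {:ok, state}

  @impl true
  def handle_call({:configure, _opts}, state), do: {:ok, :ok, state}
end
```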

This is all I have found so far. I am a noob in this area, so could anyone guide me, give any suggestions, or point me to some good blogs, links, etc.? What are my best options here?