Logging for users

I am in the process of creating a Software-as-a-Service offering that lets people easily integrate a chat service into their existing application, using their app’s existing user accounts.

(More details on the project in general will follow soon)

One thing we’d like to do is log all incoming Phoenix Channel requests, to show each developer whether their integration is working properly. This means we’d like a specialized logger that captures all data related to a given integration.

I am not entirely sure what the best approach would be to create this, so I’d love to hear your ideas!


Not a complete solution, but take a look at AWS CloudWatch Logs: you can create a separate stream for each channel (with a unique name based on some kind of UUID), log all events for that channel to that stream, and show it to the user.
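If you go that route from Elixir, something along these lines might work with ex_aws. This is an untested sketch: the module name, the log group, and the hand-rolled ExAws.Operation.JSON wiring are my assumptions, since I don’t know of a dedicated ex_aws package for CloudWatch Logs (the actions and headers come from the AWS HTTP API):

```elixir
defmodule ChannelLogStream do
  # CloudWatch Logs speaks amz-json-1.1 with this target prefix.
  @target_prefix "Logs_20140328"

  defp op(action, data) do
    %ExAws.Operation.JSON{
      http_method: :post,
      service: :logs,
      headers: [
        {"x-amz-target", "#{@target_prefix}.#{action}"},
        {"content-type", "application/x-amz-json-1.1"}
      ],
      data: data
    }
  end

  # One stream per channel, named with the channel's UUID.
  def create_stream(group, channel_uuid) do
    op("CreateLogStream", %{
      "logGroupName" => group,
      "logStreamName" => channel_uuid
    })
    |> ExAws.request()
  end

  # Note: subsequent calls may need the sequenceToken returned by the
  # previous PutLogEvents (or a DescribeLogStreams call).
  def put_events(group, channel_uuid, messages) do
    events =
      for msg <- messages do
        %{"timestamp" => System.system_time(:millisecond), "message" => msg}
      end

    op("PutLogEvents", %{
      "logGroupName" => group,
      "logStreamName" => channel_uuid,
      "logEvents" => events
    })
    |> ExAws.request()
  end
end
```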


In Java you have logging frameworks like log4j; in .NET, log4net. I once used their specs to build a logging framework in another language. I searched Google for “logging framework” erlang. Could you use a logging framework like https://github.com/erlang-lager/lager?

So, one way of doing this is to use metadata to associate a log entry with a particular Phoenix channel. You could add a :channel_id to the Logger metadata and push each entry as JSON to your logger, which then persists it in some kind of database (definitely not a file). A Redis-backed logger just popped up on my Twitter feed: https://github.com/archydragon/elixir-redislog. I would build something of that sort, though perhaps with Postgres instead of Redis.
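A minimal sketch of the metadata approach, using the classic (pre-Elixir-1.15) Logger backend API — the module names and the LogStore helper are placeholders:

```elixir
defmodule MyAppWeb.RoomChannel do
  use Phoenix.Channel

  def join("room:" <> room_id, _params, socket) do
    # Every Logger call made in this channel process now carries the id.
    Logger.metadata(channel_id: room_id)
    {:ok, socket}
  end
end

# Logger backends are :gen_event handlers; this one only persists
# entries that carry a :channel_id.
defmodule MyApp.ChannelLoggerBackend do
  @behaviour :gen_event

  def init(_opts), do: {:ok, %{}}

  def handle_event({level, _gl, {Logger, msg, timestamp, metadata}}, state) do
    if channel_id = metadata[:channel_id] do
      # MyApp.LogStore is a hypothetical persistence module (Ecto, Redis, ...).
      MyApp.LogStore.insert(%{
        channel_id: channel_id,
        level: level,
        message: IO.chardata_to_string(msg),
        logged_at: timestamp
      })
    end

    {:ok, state}
  end

  def handle_event(_event, state), do: {:ok, state}
  def handle_call(_request, state), do: {:ok, :ok, state}
end
```

You’d register it alongside the console backend with `config :logger, backends: [:console, MyApp.ChannelLoggerBackend]`.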

If you are on AWS, you have a few options.

Kinesis -> S3
Then make your S3 keys something like integration/YYYY/MM/DD/HH/MM.txt, and to give devs their logs, just do a list request and start pulling down the files. You’d be surprised how well this works in practice.
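To make the key scheme and the list request concrete, a rough sketch with ex_aws_s3 — the bucket name and module are placeholders:

```elixir
defmodule IntegrationLogs do
  @bucket "my-app-channel-logs"

  # Write-side key, e.g. "acme/2019/07/04/13/37.txt"
  def key(integration, %DateTime{} = dt) do
    Calendar.strftime(dt, "#{integration}/%Y/%m/%d/%H/%M.txt")
  end

  # "Give devs their logs": list one hour's worth of keys, pull each file.
  def fetch_hour(integration, %DateTime{} = dt) do
    prefix = Calendar.strftime(dt, "#{integration}/%Y/%m/%d/%H/")

    @bucket
    |> ExAws.S3.list_objects(prefix: prefix)
    |> ExAws.request!()
    |> get_in([:body, :contents])
    |> Enum.map(fn %{key: key} ->
      @bucket |> ExAws.S3.get_object(key) |> ExAws.request!() |> Map.fetch!(:body)
    end)
  end
end
```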

DynamoDB -> keep 30 days of data -> archive to S3 if needed. I’ve done this several times; it mostly just works and is good enough. It can be costly, but that’s a drop in the bucket compared to your development time.
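The 30-day window falls out of DynamoDB’s native TTL feature. A sketch with ex_aws_dynamo, assuming a table with TTL enabled on the ttl attribute (table and attribute names are placeholders):

```elixir
defmodule DynamoLog do
  @table "channel_logs"
  @thirty_days 30 * 24 * 60 * 60

  def write(integration_id, message) do
    now = System.system_time(:second)

    ExAws.Dynamo.put_item(@table, %{
      integration_id: integration_id,  # partition key
      logged_at: now,                  # sort key
      message: message,
      # With TTL enabled on this attribute, DynamoDB expires the item
      # itself roughly 30 days from now.
      ttl: now + @thirty_days
    })
    |> ExAws.request()
  end
end
```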

Then… drumroll… just shove it into Postgres. It will probably be fine for quite a while if you buffer, batch insert, and archive old data :slight_smile:
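A sketch of the buffer-and-batch-insert part with a GenServer and Ecto — MyApp.Repo, the table name, and the thresholds are placeholders:

```elixir
defmodule MyApp.LogBuffer do
  use GenServer

  @flush_interval :timer.seconds(5)
  @max_buffer 500

  def start_link(_opts), do: GenServer.start_link(__MODULE__, [], name: __MODULE__)

  # Fire-and-forget from the hot path (e.g. a channel handler).
  def log(entry), do: GenServer.cast(__MODULE__, {:log, entry})

  @impl true
  def init(_) do
    schedule_flush()
    {:ok, []}
  end

  @impl true
  def handle_cast({:log, entry}, buffer) do
    buffer = [Map.put(entry, :inserted_at, NaiveDateTime.utc_now()) | buffer]

    if length(buffer) >= @max_buffer do
      {:noreply, flush(buffer)}
    else
      {:noreply, buffer}
    end
  end

  @impl true
  def handle_info(:flush, buffer) do
    schedule_flush()
    {:noreply, flush(buffer)}
  end

  defp flush([]), do: []

  defp flush(buffer) do
    # One multi-row INSERT instead of a round trip per log line.
    MyApp.Repo.insert_all("channel_logs", Enum.reverse(buffer))
    []
  end

  defp schedule_flush, do: Process.send_after(self(), :flush, @flush_interval)
end
```

Archiving old rows can then be a periodic DELETE, or a partition drop if you partition the table by month.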

Cheers,
Ben