I am in the process of adding analytics to my SaaS app, and I’d love to hear other people’s experiences doing this.
Currently I see two different approaches:
- Do most of the data handling at the DB level, building and aggregating data into materialized views for a performance boost. This way the data stays normalized.
- Spawn various processes that run at different intervals (10 min, 1 hour, etc.), query the database, and insert the aggregate results into a new table. In this case, the metrics/analytics are denormalized.
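To make the second approach concrete, here is a minimal sketch of what such a periodic rollup job might do. SQLite stands in for PostgreSQL, and the table and column names (`events`, `hourly_stats`, `amount`) are made up for illustration; the real job would run on a schedule and only aggregate rows newer than the last run.

```python
import sqlite3

# Hypothetical schema: raw per-user events rolled up into an hourly
# aggregate (denormalized) table. Names are invented for this sketch.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id INTEGER, amount REAL, created_at TEXT);
    CREATE TABLE hourly_stats (bucket TEXT, user_id INTEGER,
                               event_count INTEGER, total_amount REAL);
""")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [(1, 10.0, "2024-01-01 10:05"),
     (1, 5.0,  "2024-01-01 10:40"),
     (2, 7.5,  "2024-01-01 10:50")],
)

# The periodic job: group raw rows into hourly buckets per user
# and insert the aggregates into the rollup table.
conn.execute("""
    INSERT INTO hourly_stats
    SELECT strftime('%Y-%m-%d %H:00', created_at) AS bucket,
           user_id, COUNT(*), SUM(amount)
    FROM events
    GROUP BY bucket, user_id
""")
rows = conn.execute("SELECT * FROM hourly_stats ORDER BY user_id").fetchall()
print(rows)
# [('2024-01-01 10:00', 1, 2, 15.0), ('2024-01-01 10:00', 2, 1, 7.5)]
```

The first approach would express the same `GROUP BY` as a PostgreSQL materialized view refreshed periodically, so the query lives in the database instead of in application code.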
Which approach makes the most sense, or is there something completely different I should consider?
Did you have a look at the New Relic thread here in the forum?
Not sure what your requirements are, but it might fit them. Of course, the free lite tier of New Relic is not really feature-rich and the paid versions are priced on the higher side.
Oh, I don’t mean performance metrics for the app.
I mean displaying aggregated and grouped customer/end-user data in charts.
Have you considered https://influxdata.com/time-series-platform/influxdb/ ? It is a time-series database with aggregation built in at the database level.
You can use https://influxdata.com/time-series-platform/telegraf/ to collect metrics; it has an embedded implementation of the statsd protocol, so it’ll play nicely with E(li)xometer.
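For a sense of how simple that wire format is, here is a sketch of emitting a statsd counter over UDP, which Telegraf’s statsd input can consume (it listens on UDP port 8125 by default). The metric name `signups` and the host/port are assumptions for illustration.

```python
import socket

def send_counter(name: str, value: int,
                 host: str = "localhost", port: int = 8125) -> bytes:
    # statsd counter format: "<name>:<value>|c"
    payload = f"{name}:{value}|c".encode()
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.sendto(payload, (host, port))  # fire-and-forget UDP datagram
    sock.close()
    return payload

print(send_counter("signups", 1))  # b'signups:1|c'
```

Since it is plain UDP, there is no handshake or error reporting; the sender never blocks even if no collector is listening.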
I would however prefer to stay with PostgreSQL if possible.
I am not handling TBs of data.
Yay, another PostgreSQL user. You may find this interesting for your use case then, or at least worth a read: http://grisha.org/blog/2015/09/23/storing-time-series-in-postgresql-efficiently/