Choosing the Best Tech Stack for an Analytics Dashboard: LiveView + D3 or React?

Hi all,

I’m planning to develop a website that displays analytics reports, similar to Cloudflare Radar 2.0 or Google Analytics. I’m currently in the process of selecting a tech stack.

I looked into Plausible because it closely aligns with my project, and I noticed that it uses React for the frontend. Now that LiveView has reached version 1.0, I’d like to hear your thoughts on the best tech stack for such a website.

Would a combination of LiveView and JavaScript graph libraries like D3 be sufficient, or should I consider using React for the frontend?

Thanks a lot for your suggestions!

2 Likes

Integrating D3 with LiveView is pretty straightforward.

I’m sure you can do it in any tech stack. Follow your heart, i.e. use LiveView.

Not sure you noticed, but in the end it’s d3.js after all

So just for the graph features, using React seems to me like an additional layer, and most React chart libs are just abstractions over d3.

There is one benefit though: these abstractions give you 80% of the result for 20% of the effort. Unless you want something the abstraction does not support; then you have to rewrite straight in d3 (my previous employer had to do so).

Advice 1
I would skip all the extra work, layers and extra deployment concerns and just go with straight d3 in LiveView. Integration is easy and you are not limited by whatever abstraction.
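As a minimal sketch of what that integration can look like (assuming d3 is already bundled in your app.js; the `BarChart` hook name, the `data-points` attribute, and the `"points-updated"` event are all made-up names for illustration, not from any project in this thread):

```javascript
// Illustrative LiveView JS hook that renders a bar chart with d3.
// `d3` is assumed to be in scope via your bundler; the attribute and
// event names below are hypothetical.
const BarChart = {
  mounted() {
    this.render(JSON.parse(this.el.dataset.points));
    // Server side would use push_event(socket, "points-updated", %{points: ...})
    this.handleEvent("points-updated", ({ points }) => this.render(points));
  },
  render(points) {
    const width = 400, height = 200;
    const x = d3.scaleBand()
      .domain(points.map((_, i) => i))
      .range([0, width])
      .padding(0.1);
    const y = d3.scaleLinear()
      .domain([0, Math.max(...points)])
      .range([height, 0]);
    // Create the <svg> once, then join rects to the data on every render.
    const svg = d3.select(this.el)
      .selectAll("svg").data([null]).join("svg")
      .attr("width", width).attr("height", height);
    svg.selectAll("rect").data(points).join("rect")
      .attr("x", (_, i) => x(i))
      .attr("y", (d) => y(d))
      .attr("width", x.bandwidth())
      .attr("height", (d) => height - y(d));
  },
};
// Registered in app.js with: new LiveSocket("/live", Socket, { hooks: { BarChart } })
```

On the LiveView side the element just needs `phx-hook="BarChart"` and `phx-update="ignore"` so LiveView patches don’t clobber the SVG that d3 manages.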

——

About React vs LiveView for a dashboard: not sure, as I haven’t done anything remotely close to this use case with LiveView. (edit) But the Phoenix Analytics project chose React; maybe they can enlighten you on their decision:

Phoenix.LiveDashboard with Phoenix.LiveDashboard.PageBuilder seems to be a LiveView solution.

Advice 2
Please do notice there is a project for “Analytics à la Google” already. It might be better to work together than to start an alternative on your own.

1 Like

We recently made some prototypes for a reporting dashboard and were comparing different charting options:

I personally liked the dev experience of VegaLite because the Elixir package gives you a nice way to build the JSON spec, and since each chart is configured entirely via that spec, a single simple hook works for every chart. But it’s missing some interactions/animations, which is why we settled on ECharts in the end for a nicer user experience. You can push the serialized data as JSON to the hook and have it render the chart.
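To make that concrete, the client side of such an ECharts hook can stay tiny. A hedged sketch (assumes echarts is bundled; the `"chart-data"` event name and the option payload shape are my own invention, not from the poster’s project):

```javascript
// Illustrative LiveView hook: the server pushes a serialized ECharts
// option (the JSON spec) and the hook only hands it to the chart.
const EChart = {
  mounted() {
    this.chart = echarts.init(this.el);
    // Server side: push_event(socket, "chart-data", %{option: spec})
    this.handleEvent("chart-data", ({ option }) => this.chart.setOption(option));
  },
  destroyed() {
    // Free the canvas and listeners when LiveView removes the element.
    this.chart.dispose();
  },
};
```

Because all chart configuration lives in the pushed option, the same hook serves every chart on the page.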

Let me know if you’re interested in more!

4 Likes

@BartOtten isn’t the Phoenix Analytics project a React frontend?

@kevinschweikert I’ve used echarts for some dashboard and animation work but have migrated the complex stuff to d3 to get more control. echarts was great until I hit that wall. Having to define your formatters on the JS side when the rest was done on the Elixir side was a bit unfortunate too.

D3 is rather confusing to learn at first, though D3 in Action and observablehq are good resources. It is rather verbose on account of being so low level, but you can create exactly what you want.

I can’t imagine a response that would let me dodge that bullet. Ouch. Fixed my post :slight_smile:

Thank you all for your valuable suggestions and useful resources; they’ve been incredibly helpful!

Once you’ve made the decision, please let us know the outcome and the key factors that led you to it in return. :slight_smile:

1 Like

Chart.js is very simple and pragmatic, and it’s used by my favourite website monitoring tool, uptime.com. I don’t know if there’s any pre-existing integration with LiveView, but I agree it should be straightforward to integrate.

An option I’m currently evaluating, because it seems reasonable and someone (@staxx) kindly wrote a LiveView plugin, is ApexCharts:

https://hexdocs.pm/live_charts/LiveCharts.Adapter.ApexCharts.html

Seems pretty well implemented; I don’t know the author, but kudos.

1 Like

As an alternative, I’d like to point out that this type of analytics app is very amenable to simple bar charts rendered entirely as a series of <div style="height: XX%;" />.

Obviously if you want more complex charts then you can go with D3 or similar, but for a prototype rectangles really are hard to beat.
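For illustration, a hypothetical helper that does exactly that scaling (plain JS here, though the same markup falls out of a HEEx `for` comprehension just as easily):

```javascript
// Turn raw values into percentage-height <div> bars; the tallest value
// becomes 100% and everything else is scaled relative to it.
function barChart(values) {
  const max = Math.max(...values);
  return values
    .map((v) => `<div class="bar" style="height: ${Math.round((v / max) * 100)}%;"></div>`)
    .join("\n");
}

// barChart([1, 2, 4]) yields bars of height 25%, 50% and 100%.
```

Style the container with a fixed height and `display: flex; align-items: flex-end;` and the divs line up as a bar chart with zero dependencies.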

1 Like

I was trying out different libraries for charts too. My main concern was the final performance and latency.

Out of uPlot, Chart.js and ECharts, I chose ECharts.

uPlot is by far the fastest and has the smallest memory footprint, but it’s also quite limited in the types of charts you can use; it’s mostly time series. Chart.js is second in terms of performance, but I could not find brush or linked-chart functionality, so it’s not an option for me.

All three of these libraries use canvas. ECharts can switch between canvas and SVG.

If you go with pure D3, be ready for it to be slower than the rest because it can only use SVG.

Now I’m trying to figure out whether to use SQLite with indexed tables or DuckDB for the charts.
It’s a big mind-bend for me to understand whether to connect to the database through a direct connection or whether I need to use GenServers. AI seems to suggest that GenServers are part of best practices, but since I anticipate no more than 10,000 users for my app, I really don’t want to add any overhead that isn’t needed.
I’m pretty new to Phoenix and it’s quite a lot of headache and hassle. But I hope that’s only at the beginning, until my mind wraps around it.

D3 can render to canvas and/or svg.

What is your reasoning for putting a GenServer in front of the db for the chart data? Is the data expensive to calculate and/or rarely updated?

1 Like

Oh, that’s great. I was not aware of it. Thanks

Re GenServer: I don’t know why. My AI tool suggested it and said it’s best practice. Probably because I fed it the data from the PhoenixAnalytics library.

I’m very new to Phoenix and Elixir and don’t even know what GenServers are for, but when I questioned the AI about it, it said they could be used as a common state-management system as well as a caching solution. Also, it could be a central point where connections to the DB are maintained.

I was thinking about the same thing, and then I created this repo to consolidate my learnings.

There’s nothing there yet. I’m only gathering information and organizing the ideas in the readme file. The purpose is to try out different graphics libs.

Moreover, since the Oban Web OSS release, I’m thinking about opening it up to understand what graphics lib they utilize to present the analytics, but I couldn’t manage to install it yet.

2 Likes

Libraries like Ecto will manage the DB connection pool for you, so you don’t need to worry about that. If you’re new, I wouldn’t bother with GenServers for now; you probably won’t need them for this project.

If you want to implement caching, Cachex is a great library. But avoid caching until you’re sure you really need it.

For a DB, stick with SQLite or Postgres until you run into performance problems. DuckDB and Clickhouse are much faster for analytical queries, but also more annoying to use, and if you don’t have much data it won’t matter.

1 Like

Thanks @garrison. This is exactly my plan going forward. I’ve done some high-level tests, and since I’m mostly using SELECT and WHERE, DuckDB does not seem to be much faster at all. I do have a lot of data, but for the purposes of displaying it, SQLite seems to be enough for now.

Also, I found that DuckDB does not seem to have any tool like ‘sqlite3_rsync’ or Litestream to update the production DB instance in a running state.

For now, I’m sticking with SQLite.

1 Like

I don’t know exactly what your data looks like, but I guarantee that if you try again with some analytical queries (like sum or count with a simple predicate) and a billion rows, you will find DuckDB is a couple of thousand times faster.

Like I said, if you don’t have much data (you probably don’t!) it’s not worth it.

I meant DuckDB is not much faster if I only need to display data for a chart, not run any analytical SQL transforms with it. I do that in DuckDB within a Python data pipeline and DuckDB flies… I wish Ecto added support for it…

For analytics data you usually do what are called “analytical queries”, stuff like

select day, count(*) from pageviews where page = '/home' group by day;

Then you would dump those rows into a chart.

Column stores like DuckDB or Clickhouse can perform this particular type of query much faster than row stores like SQLite or Postgres, but at the cost of a lot of other functionality.

If you have even millions of rows you are unlikely to notice much difference. If you have billions it will matter.

1 Like