How can I reduce the number of queries when using Repo?

I need some help.

I currently have a news site.
My homepage makes a lot of Repo.one and Repo.all calls.
For example: in each <div>, there is a function that calls the Repo.

But I see that the page takes about 5 seconds to reload.
What do you advise to reduce this number of queries?

Make sure you’ve properly indexed your data. 5 seconds seems like an awfully long time… How many queries are we talking about?
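For reference, indexes are added in an Ecto migration. A minimal sketch, assuming a hypothetical `articles` table with `category_id` and `published_at` columns (adapt the names to your schema):

```elixir
defmodule MyApp.Repo.Migrations.AddNewsIndexes do
  use Ecto.Migration

  def change do
    # Index the columns your homepage filters and sorts on,
    # e.g. fetching the latest published articles per category.
    create index(:articles, [:category_id])
    create index(:articles, [:published_at])
  end
end
```

Run it with `mix ecto.migrate` and check the query plans again.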

The other option would be to build your backend as a REST service and have each element load its data asynchronously. HTTP/2 could be helpful if you’re going to do a lot of requests over the same connection.

We are talking about 30 simultaneous queries.
I will index the database and see if it improves.

If it doesn’t help much, I will do as you told me and use a REST service.
Thanks

I believe Ecto/Postgrex defaults to 10 connections in the pool, you could consider increasing that as well.
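The pool size is set in the Repo config. A sketch, assuming an app named `my_app` — the value shown is an example, not a recommendation:

```elixir
# config/config.exs (or config/runtime.exs)
config :my_app, MyApp.Repo,
  # Ecto/Postgrex defaults to 10 connections; raise this only if your
  # database can comfortably handle the extra connections.
  pool_size: 20
```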

Why is there a call to Repo in every <div>?

Usually you fetch all data in the controller and pass it to the view, then the view renders that static data.

In the controller you can optimise fetching of data to be in as few queries as possible.
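A minimal sketch of that pattern, with hypothetical module and schema names (`MyAppWeb.PageController`, `MyApp.Article`):

```elixir
defmodule MyAppWeb.PageController do
  use MyAppWeb, :controller

  import Ecto.Query
  alias MyApp.Repo

  def index(conn, _params) do
    # One query for the whole homepage instead of one query per <div>.
    articles =
      from(a in MyApp.Article,
        order_by: [desc: a.published_at],
        limit: 20,
        preload: [:category]
      )
      |> Repo.all()

    render(conn, "index.html", articles: articles)
  end
end
```

The template then only iterates over `@articles`; it never touches the Repo itself.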

We really can’t help much if you don’t give us some details.

Can you show us code from your controller / view / template?

Unless you’re doing something specifically to make those queries simultaneously, they will be made one after another. Also you may want to make sure you’re familiar with the concept of “N+1 query”, since that is something that you should try to avoid.
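To illustrate (with hypothetical `Article`/`User` schemas), here is what an N+1 pattern looks like and how `Repo.preload/2` avoids it:

```elixir
# N+1: one query for the articles, then one extra query per article
# to fetch its author — 1 + N queries in total.
articles = Repo.all(MyApp.Article)
authors = Enum.map(articles, fn a -> Repo.get!(MyApp.User, a.author_id) end)

# Better: a single additional query loads all the authors at once.
articles = MyApp.Article |> Repo.all() |> Repo.preload(:author)
```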

Something seems wrong with your setup. But you can:

  1. Profile your queries. Make sure you use indexes and don’t load too much data.
  2. Split UI elements into different LiveView components. Make sure data is loaded when the component is connected, so you query after the initial rendering.
  3. Maybe some of the queries can be merged into one. 30 queries seems like a lot.
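As a sketch of point 3: several per-category counts, for instance, can often collapse into one grouped query (the schema name is an assumption):

```elixir
import Ecto.Query

# Instead of calling Repo.aggregate/3 once per category,
# a single grouped query returns every count at once.
counts =
  from(a in MyApp.Article,
    group_by: a.category_id,
    select: {a.category_id, count(a.id)}
  )
  |> Repo.all()
  |> Map.new()
```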

I actually just wrote something about it:

I’m curious about this 30-queries thing as well. I access an old, exceptionally large database, generate reports, and so on, and the largest set of queries for a single request (a report, of course) only hits about 16 queries (give or take, based on some chosen options).