Hi,
I’m trying to cache multiple pieces of data. I have a GenServer that calls a Wikipedia server and generates random movies. Here’s how I’m doing it: this module calls the movie GenServer and puts the result in the database.
def get_movies(limit \\ 100, offset \\ 0) do
  MovieGen.get_movie(limit, offset)
end
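For reference, the data comes back as a list of maps. The exact keys below are an assumption on my part (made up for illustration), but the shape is roughly this:

```elixir
# Hypothetical sample of what MovieGen.get_movie/2 returns.
# The :name and :year keys are assumptions, not the real API.
movies = [
  %{name: "Movie A", year: 1999},
  %{name: "Movie B", year: 2004}
]
```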
There is another GenServer I created just for ETS, which implements the usual cache operations: get, put, and delete.
def put(key, data) do
  GenServer.cast(MovieCache, {:put, key, data})
end

def put_movie_list(_key, data) do
  movie_list =
    data
    |> Enum.map(fn %{name: name} = movie -> {name, movie} end)

  GenServer.cast(MovieCache, {:put, movie_list})
end
def handle_cast({:put, key, data}, state) do
  :ets.insert(:movie_cache, {key, data})
  {:noreply, state}
end

def handle_cast({:put, movie_list}, state) do
  :ets.insert(:movie_cache, movie_list)
  {:noreply, state}
end
To test this, I created another schema:
schema "users" do
  field :age, :integer
  field :name, :string

  timestamps()
end
These two functions check whether the data is in the database and, if so, put it in the cache (I’ve seeded the database with dummy data). I’m passing the id as the key, so when I request a particular id, that record gets added to the cache.
def get_user!(id) do
  case Cache.get(id) do
    %User{} = cached -> cached
    _ -> get_user_cache(id)
  end
end

def get_user_cache(id) do
  cached = Repo.get!(User, id)
  Cache.put(id, cached)
  cached
end
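In case it helps to see the pattern in isolation, this is the read-through flow I’m going for, sketched with a plain ETS table and a map standing in for Repo.get!/2 (all names here are placeholders, not my real modules):

```elixir
# Sketch of the cache-aside read path without Ecto.
# The "database" is just a map standing in for Repo.get!/2.
db = %{1 => %{id: 1, name: "alice", age: 30}}
cache = :ets.new(:user_cache_demo, [:set, :public])

get_user = fn id ->
  case :ets.lookup(cache, id) do
    [{^id, user}] ->
      # Cache hit: serve straight from ETS.
      user

    [] ->
      # Cache miss: load from the "database" and cache it.
      user = Map.fetch!(db, id)
      :ets.insert(cache, {id, user})
      user
  end
end

user1 = get_user.(1)  # miss: loads from db and caches
user2 = get_user.(1)  # hit: served from ETS
```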
The plan was to cache one item at a time, and the above was my test of that. But MovieGen.get_movie(limit, offset) returns a lot of data at once, and I’m having trouble caching all of those values in one go. Can someone suggest an approach?
Also, I have a doubt about the whole approach: I’ve read somewhere that caching in a Phoenix/Ecto model is not considered best practice. I’d like to know why that is.