Creating the key for multiple objects in an ETS cache

Hi,

I'm trying to cache multiple records. I have a GenServer that calls a Wikipedia service and generates random movies, and here's how I'm doing it. This module calls the movie GenServer and puts the result in the database.

  def get_movies(limit \\ 100, offset \\ 0) do
    MovieGen.get_movie(limit, offset)
  end

There is another GenServer I created just for ETS; it provides the usual cache operations: get, put, and delete.


  def put(key, data) do
    GenServer.cast(MovieCache, {:put, key, data})
  end

  def put_movie_list(_key, data) do
    # Build a list of {name, movie} tuples so each movie is keyed by its name.
    movie_list = Enum.map(data, fn %{name: name} = movie -> {name, movie} end)

    GenServer.cast(MovieCache, {:put, movie_list})
  end

  def handle_cast({:put, key, data}, state) do
    :ets.insert(:movie_cache, {key, data})
    {:noreply, state}
  end

  def handle_cast({:put, movie_list}, state) do
    # :ets.insert/2 also accepts a list of tuples and inserts them all in one call.
    :ets.insert(:movie_cache, movie_list)
    {:noreply, state}
  end
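For completeness, the cache GenServer creates the ETS table when it starts, roughly like this (a minimal sketch; the exact start_link/init and table options may differ from my real code):

  defmodule MovieCache do
    use GenServer

    def start_link(_opts) do
      GenServer.start_link(__MODULE__, nil, name: __MODULE__)
    end

    def init(_args) do
      # This process owns the table, so the table vanishes if the GenServer dies.
      :ets.new(:movie_cache, [:set, :named_table, :public, read_concurrency: true])
      {:ok, %{}}
    end

    # put/2, put_movie_list/2 and the handle_cast/2 clauses shown above live in this module.
  end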

To test this, I created another schema:

  schema "users" do
    field :age, :integer
    field :name, :string

    timestamps()
  end

These two functions check whether the record is already in the cache and, if not, load it from the database (which I have seeded with dummy data) and put it in the cache. Here I'm passing the id as the key, so when I request a particular id it gets added to the cache.

  def get_user!(id) do
    case Cache.get(id) do
      # Cache hit: return the cached struct.
      %User{} = cached -> cached
      # Cache miss (Cache.get/1 returns nil): load from the DB and cache it.
      _ -> get_user_cache(id)
    end
  end

  def get_user_cache(id) do
    cached = Repo.get!(User, id)
    Cache.put(id, cached)
    cached
  end
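Roughly how I expect it to behave in IEx (illustrative only; I'm assuming these functions sit in an Accounts context, and the printed struct is abbreviated):

  iex> Accounts.get_user!(1)   # cache miss: loads from the DB, then caches under key 1
  %User{id: 1, ...}
  iex> Accounts.get_user!(1)   # cache hit: this time it comes straight from ETS
  %User{id: 1, ...}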

The plan was to cache one item at a time, so I created the above to test that.

But in the case of MovieGen.get_movie(limit, offset) I get a lot of data at once, and I'm having trouble caching all of those values in one go. Can someone suggest an approach?

Also, I have a doubt: I've read somewhere that caching in a Phoenix/Ecto model is not considered best practice, so I'd like to know why that is.

Original reply was incorrect and deleted for the sake of future readers.


For lookup I'm doing this:

  def get(key) do
    GenServer.call(MovieCache, {:get, key})
  end

  def handle_call({:get, key}, _from, state) do
    reply =
      case :ets.lookup(:movie_cache, key) do
        [] -> nil
        [{_key, movie}] -> movie
      end

    {:reply, reply, state}
  end

Another thought on your strategy: since the only purpose of your GenServer is to manage an ETS table, you are creating a couple of issues:

  1. Depending on who the owner of the ETS table is, if your GenServer dies, your ETS table will vanish.

  2. You are serialising all accesses to the ETS table through the GenServer so if you scale up the number of processes getting and storing movies, you are going to single-thread through this GenServer.

Since I don't see any special reason why you would want to serialise here (given that ETS provides atomic inserts with :ets.insert_new/2), I would suggest you don't need the GenServer, other than as a strategy to keep the ETS table alive if the process that owns it crashes.

I would use something like Eternal to manage the ETS table, and then update the cache directly as required: :ets.insert_new/2 for atomic inserts and :ets.lookup/2 wherever you need to read it.
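Roughly, the cache module could then look like this (a sketch, assuming the :movie_cache table has already been created elsewhere, e.g. by Eternal, as a :public :named_table set):

  defmodule MovieCache do
    @table :movie_cache

    # Atomic insert of a single movie; returns false if the key already exists.
    def put(key, movie), do: :ets.insert_new(@table, {key, movie})

    # Insert a whole list in one call, keying each movie by its name.
    # Note: given a list, :ets.insert_new/2 inserts all tuples, or none if any key already exists.
    def put_movie_list(movies) do
      :ets.insert_new(@table, Enum.map(movies, fn %{name: name} = movie -> {name, movie} end))
    end

    # Read straight from ETS; no process round-trip, so reads don't queue behind a GenServer.
    def get(key) do
      case :ets.lookup(@table, key) do
        [{^key, movie}] -> movie
        [] -> nil
      end
    end
  end

Because every caller touches the table directly, getting and storing movies no longer single-threads through one process.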


The way I think of it, movie_list is a list of multiple items, and therefore not one item. I may be misunderstanding what "one item" means to you, though.

Yeah, movie_list is a list of multiple items, but I have created another user module which returns a single item.

This is just to test if I can put a single item into the ETS table.

I have added this to the application's supervision tree. I was assuming that whenever I start the application it will start the ETS GenServer.
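Concretely, this is what I mean by supervising it (a sketch; MyApp and the other children stand in for my actual application module and child list):

  # lib/my_app/application.ex
  def start(_type, _args) do
    children = [
      MyApp.Repo,
      MyAppWeb.Endpoint,
      # Starts MovieCache.start_link/1, which creates the :movie_cache table in init/1.
      MovieCache
    ]

    Supervisor.start_link(children, strategy: :one_for_one, name: MyApp.Supervisor)
  end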

Okay, I got it.