Sync inconsistent/outdated state of multiple channels in socket

Scenario:
A user connects to the socket, which has multiple channels, followed by an authentication check with a token in connect/3. On passing the check, the user and the user's roles are fetched and stored in assigns.

defmodule MyAppWeb.UserSocket do
  use Phoenix.Socket

  channel "chat_room:*", MyAppWeb.ChatRoomChannel
  channel "role:*", MyAppWeb.RoleChannel
  channel "user:*", MyAppWeb.UserChannel
  ...
  def connect(%{"token" => token}, socket, _connect_info) do
    ...
    user = Repo.get_by(User, id: id) |> Repo.preload([:roles])
    {:ok, assign(socket, :current_user, user)}
  end
end

This state (in assigns) is copied from the socket to the channel when the respective join/3 is called. This includes the roles, which are used for authorization in authorized?/2. (Authorization is done by checking the user's roles, e.g. an admin chat room should only be accessible to users with the admin role.)

defmodule MyAppWeb.ChatRoomChannel do
  use MyAppWeb, :channel
  
  def join("chat_room:" <> room_id, _payload, socket) do
    room = Chat.get_room!(room_id)

    if authorized?(socket.assigns.current_user, room) do
      # join channel
      {:ok, socket}
    else
      # return an error to the client
      {:error, %{reason: "unauthorized"}}
    end
  end
end

Problem:
Suppose we have a channel/controller where admins can associate/dissociate users with roles.

In this case, the channel state (assigns) will be outdated for users who were newly associated with or dissociated from a role. The state of every other channel joined through that user's socket is outdated as well. How would you recommend syncing them, so that the state of all channels the user has joined is updated?
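One Phoenix-native way to handle this (a sketch, not from the original post) is to give the socket an id via the id/1 callback and then broadcast the special "disconnect" event on that topic whenever role associations change. Phoenix terminates all channels on that socket, and the client reconnects through connect/3, rebuilding assigns with fresh roles:

```elixir
defmodule MyAppWeb.UserSocket do
  use Phoenix.Socket

  # Identify each socket by its user so it can be addressed later.
  def id(socket), do: "user_socket:#{socket.assigns.current_user.id}"
end

# Wherever roles are changed (controller, channel, or context module),
# force the affected user's socket(s) to disconnect and re-authenticate:
MyAppWeb.Endpoint.broadcast("user_socket:#{user.id}", "disconnect", %{})
```

This trades a full reconnect for simplicity: every channel's assigns are rebuilt at once, at the cost of re-running connect/3 and each join/3.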

(The easy fix is to query the roles table on every join or channel event, but I would like to avoid that: role changes are infrequent, so they do not need to be queried every time. Would using ETS to cache the role and user data from the DB be a good option here?)

I built authorization evaluation backed by Cachex. The result of each check is cached per user, and the evaluation function checks the cache first, falling back to evaluating from the database. My authorization logic is quite complex (potentially overlapping policies/roles, with permit/explicit deny for specific actions), so I moved to this approach quite quickly for performance reasons.

Cachex is ETS-backed but handles a lot of the complexities associated with managing cache size, expiring entries, etc.

You have a couple of options with this approach: use cache expiry to force periodic re-evaluation, and/or invalidate the relevant cache entries when role associations change.
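Invalidation can then sit right next to the code that mutates roles. A minimal sketch, assuming the `{:authz, :ap_co, user_id, company_id, action}` key shape from the example further down; `@known_actions` and `invalidate_authz/2` are hypothetical names for illustration:

```elixir
# Actions we cache decisions for; must mirror whatever the
# authorization module actually evaluates (an assumption here).
@known_actions [:read, :write, :admin]

# Drop every cached decision for this user/company pair so the next
# check falls back to the database. Cachex has no wildcard delete,
# so we enumerate the keys we know we cache.
def invalidate_authz(user, company) do
  for action <- @known_actions do
    Cachex.del(:my_cache, {:authz, :ap_co, user.id, company.id, action})
  end
end
```

Calling this from the role-association code path keeps the cache correct immediately, while the TTL acts as a safety net for anything missed.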

This approach is plenty fast enough for all permission checks to be done “live” (i.e. not stored in socket assigns - I’m using liveview rather than channels, but it is the same problem).

An example of caching a security policy decision for a user looks something like:

  def assess(%User{} = user, %Company{} = company, action, environment) do
    cache_key = {:authz, :ap_co, user.id, company.id, action}

    {result, decision} =
      Cachex.fetch(:my_cache, cache_key, fn _key ->
        user_roles = determine_user_roles(user, company)
        decision = assess(user_roles, action)
        {:commit, decision}
      end)

    # This bit is needed because Cachex doesn't apply the TTL/expiry
    # when fetch inserts a new entry into the cache
    if result == :commit, do: Cachex.expire(:my_cache, cache_key, @cache_expiry)

    {decision, user, company, action, environment}
  end

The authorization code is wrapped up in a module and use of the cache is transparent to the caller.
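On the caller side that might look something like the following; the module name and the :permit/:deny decision atoms are assumptions, since the original post doesn't show them:

```elixir
# Hypothetical caller; assess/4 returns {decision, user, company, action, env}.
{decision, _user, _company, _action, _env} =
  MyApp.Authz.assess(user, company, :delete_message, %{})

case decision do
  :permit -> do_delete(message)
  :deny -> {:error, :unauthorized}
end
```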


I was storing this in assigns since I thought the data would change infrequently, but it brings in a lot of other complications this way. Having a cache expiry and invalidating entries when role/permission data changes makes sense. I'll also take a look at Cachex. Thanks a ton!