Dialyzer and defoverridable: Warning if overridden function does not return all possible values

Hello

I’m struggling to come up with a good solution to this problem. I have modules using a generic module to override some default behaviour, very much like the following boiled-down example:

defmodule Handler do
  defmacro __using__(_opts) do
    quote do
      @behaviour Handler

      def delegate(args) do
        case handle(args) do
          :ok ->
            :ok

          {:error, _reason} = result ->
            do_something(result)
            :ok
        end
      end

      defp do_something(_) do
        # Log stuff or whatever
      end

      defoverridable Handler
    end
  end

  @callback handle(any) :: :ok | {:error, any}
  @optional_callbacks handle: 1
end

defmodule Implementation do
  use Handler

  @impl true
  def handle(_) do
    :ok
  end
end

defmodule OtherImplementation do
  use Handler

  @impl true
  def handle(_) do
    {:error, "something went wrong"}
  end
end

The Implementation and OtherImplementation modules each use the Handler module, which provides some common error handling, for example, by matching on the return value of the handle function that each implementation defines. This works fine. The problem is the dialyzer warnings.

Since the delegate function is essentially copied into each of the implementation modules, dialyzer complains that one of the patterns in the case statement “can never match the type” returned by that module’s handle function:

lib/dialyzer_test.ex:30:pattern_match
The pattern can never match the type.

Pattern:
_ = {:error, _}

Type:
:ok

________________________________________________________________________________
lib/dialyzer_test.ex:30:unused_fun
Function do_something/1 will never be called.
________________________________________________________________________________
lib/dialyzer_test.ex:39:pattern_match
The pattern can never match the type.

Pattern:
:ok

The thing is that a given implementation does not necessarily return all of the possible return values, so to dialyzer it looks like there is unreachable error-handling code in each of those modules.

The code works fine, but the warnings are bugging me. I know that I can suppress them, but is there a better way to implement this kind of behaviour that makes dialyzer less grumpy? Is overriding functions this way too much of an OOP way of thinking? For context, the pattern comes up in a plug_rest application where a number of REST resources use a Default module for common things like authentication and error handling.
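For completeness, the suppression I have in mind is the @dialyzer module attribute inside the injected quote, roughly like this — just a sketch that silences the two injected functions rather than fixing the underlying mismatch:

defmodule Handler do
  defmacro __using__(_opts) do
    quote do
      @behaviour Handler

      # Turn off all dialyzer warnings for the injected functions only;
      # this covers both the pattern_match and the unused_fun warnings.
      @dialyzer {:nowarn_function, delegate: 1}
      @dialyzer {:nowarn_function, do_something: 1}

      def delegate(args) do
        case handle(args) do
          :ok ->
            :ok

          {:error, _reason} = result ->
            do_something(result)
            :ok
        end
      end

      defp do_something(_) do
        # Log stuff or whatever
      end

      defoverridable Handler
    end
  end

  @callback handle(any) :: :ok | {:error, any}
  @optional_callbacks handle: 1
end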

Any thoughts?

Imho it is: why duplicate the delegate functionality if it’s not implementation-dependent?

defmodule Handler do
  def delegate(impl, args) do
    case impl.handle(args) do
      :ok ->
        :ok

      {:error, _reason} = result ->
        do_something(result)
        :ok
    end
  end

  defp do_something(_) do
    # Log stuff or whatever
  end

  @callback handle(any) :: :ok | {:error, any}
  @optional_callbacks handle: 1
end

defmodule Implementation do
  @behaviour Handler

  @impl true
  def handle(_) do
    :ok
  end
end

defmodule OtherImplementation do
  @behaviour Handler

  @impl true
  def handle(_) do
    {:error, "something went wrong"}
  end
end
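Since delegate/2 now exists only once, in Handler, and impl.handle(args) is a dynamic call whose return type dialyzer can no longer narrow per implementation, the pattern match warnings should disappear. A hypothetical call site would look like:

Handler.delegate(Implementation, %{some: "args"})
#=> :ok

Handler.delegate(OtherImplementation, %{some: "args"})
#=> :ok, after do_something/1 was called with {:error, "something went wrong"}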

Good point. For the example I provided, it is definitely cleaner to not override functions and just use behaviours. Maybe the question makes more sense in the full context: all of the implementations have to implement a certain behaviour (in my case it is PlugRest.Resource), and having a “default” implementation which different modules can use and override to their liking is an easy way to reuse code. But I am starting to feel that these dialyzer warnings are an unavoidable cost of trying to use “inheritance” in Elixir…

If implementations fall back to a default then I’d still make that explicit:

defmodule SomeImpl do
  @behaviour PlugRest.Resource

  def some_callback(args), do: DefaultImpl.delegate(args)
  
  def another_callback(_) do
    {:error, "something went wrong"}
  end
end
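And if the default module exports the callback under the same name, defdelegate keeps the explicit fallback even shorter (same hypothetical module and callback names as above):

defmodule SomeImpl do
  @behaviour PlugRest.Resource

  # Explicitly forward this callback to the shared default implementation.
  defdelegate some_callback(args), to: DefaultImpl

  def another_callback(_) do
    {:error, "something went wrong"}
  end
end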