Overridable functions in protocol implementations?

I recently learned that it’s possible to share common protocol logic by using defdelegate inside each implementation, but I’m curious whether the same is possible with a macro that provides overridable functions.

The following code works, but shows a warning (see comment below):

defprotocol MyProtocol do
  def foo(data)
  def bar(data)
end

defmodule MyProtocol.Defaults do
  defmacro __before_compile__(_env) do
    quote do
      def bar(data) do
        IO.puts "Default implementation of bar, data: #{data}"
      end
      defoverridable [bar: 1]
    end
  end
end

defimpl MyProtocol, for: Integer do
  @before_compile MyProtocol.Defaults

  def foo(data) do
    IO.puts "int implementation of foo, data: #{data}"
  end

  # Is it possible to override `bar` from before_compile without this warning:
  #
  # warning: this clause cannot match because a previous clause at line 28 always matches
  # my_protocol.exs:17
  def bar(data) do
    IO.puts "int implementation of bar, data: #{data}"
  end
end

Do I get this warning because before_compile’s return value is injected at the end of the module? Is there any way to get around this?

2 Likes

I’m not sure I agree with this approach to begin with, but if you were to go this way, I think it makes more sense to use a __using__ macro instead of a __before_compile__ hook. Then the function is injected at the top of the module, and you override it later in your defimpl block. As it is, the default is always injected at the bottom, which is likely what prevents you from overriding it.
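
A minimal sketch of that, reusing the modules from the question (the only assumption is that the defaults module switches from __before_compile__ to __using__):

defmodule MyProtocol.Defaults do
  defmacro __using__(_opts) do
    quote do
      # Injected at the top of the defimpl body, so a later `def bar/1` can override it.
      def bar(data) do
        IO.puts "Default implementation of bar, data: #{data}"
      end

      defoverridable bar: 1
    end
  end
end

defimpl MyProtocol, for: Integer do
  use MyProtocol.Defaults

  def foo(data) do
    IO.puts "int implementation of foo, data: #{data}"
  end

  # No warning: this clause overrides the default injected by `use` above.
  def bar(data) do
    IO.puts "int implementation of bar, data: #{data}"
  end
end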

2 Likes

@benwilson512

What’s the idiomatic way to do this?

Yes, this works. So I guess the problem with before_compile here is that it’s appended to the end of the module body. It makes sense, then. Thanks!

You can explicitly delegate to the function you want to:

defimpl MyProtocol, for: Integer do
  def foo(data) do
    MyProtocol.Default.foo(data)
  end

  ...
end

You can even use defdelegate to make it more compact:

defimpl MyProtocol, for: Integer do
  defdelegate foo(data), to: MyProtocol.Default
  ...
end
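
For either form to work, MyProtocol.Default would just need to be a plain module holding the shared implementations as regular functions. A minimal sketch, reusing the output from the question above:

defmodule MyProtocol.Default do
  def foo(data) do
    IO.puts "Default implementation of foo, data: #{data}"
  end

  def bar(data) do
    IO.puts "Default implementation of bar, data: #{data}"
  end
end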

The question you want to ask yourself is: are you really expecting to write the code above so many times that it justifies the indirection and meta-programming? The answer is likely no.

5 Likes

My thoughts precisely.

Can a case be made for delegating to the Any implementation when a function is missing from a defimpl?

defprotocol Duration do
  @fallback_to_any true

  def seconds(a)
  def compare(a, b)
end

defimpl Duration, for: Any do
  def seconds(_), do: raise "Not yet implemented"
  def compare(_, _), do: raise "Not yet implemented"
end

defimpl Duration, for: NaiveDateTime do
  def compare(a, b) do
    NaiveDateTime.compare(a, b)
  end

  # we could implement `seconds` here as seconds since the epoch, but we could
  # also skip its implementation if we decide there is no semantic need for it
end

The fact that we have @fallback_to_any true means it is not unreasonable to expect missing implementations to default to the Any implementation, so it is still explicit. Currently a warning is emitted for missing implementations; the fallback above would get rid of that warning while clearly communicating that the implementation has been omitted deliberately (which could have several reasons, among them that the domain simply does not require it).

Cons to the above suggestion:

  • Backwards compatibility of the language (this thread is like 6 years old, lol).
  • A protocol should always be fully implementable. If parts of it remain unimplemented, the protocol is too broad and should be made smaller. Counterargument: naming things is hard, and sometimes it is not worthwhile to split existing protocols into more granular ones.
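
Until something like that exists, the same intent can already be expressed explicitly by delegating the omitted function to the Any implementation. A sketch reusing the Duration example above (Duration.Any is the module name that defimpl generates for the Any implementation):

defimpl Duration, for: NaiveDateTime do
  def compare(a, b) do
    NaiveDateTime.compare(a, b)
  end

  # Delegating to the Any implementation instead of leaving seconds/1 out
  # silences the missing-implementation warning and keeps the omission explicit.
  defdelegate seconds(a), to: Duration.Any
end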