Use defoverridable with care. If you need to define multiple modules with the same
behaviour, it may be best to move the default implementation to the caller, and check if a callback exists via Code.ensure_loaded?/1 and function_exported?/3.
The disclaimer doesn’t explain why this is. What is the reason for this disclaimer? Why is it preferable to query the module for an implementation rather than use defoverridable? Is it a performance thing? A readability thing? Something else?
Most often I see defoverridable used with a using macro - does that come into play with this disclaimer?
defmodule NameBehaviour do
@callback name :: String.t()
@optional_callbacks [name: 0]
end
APPROACH 1
With defoverridable it would be:
defmodule Greeter do
defmacro __using__(_opts) do
quote do
@behaviour NameBehaviour
defdelegate name, to: unquote(__MODULE__)
defoverridable name: 0
end
end
def name do
"stranger"
end
def greet(module) do
"Hello " <> module.name()
end
end
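For reference, here is a usage sketch of Approach 1 (Alice and Anon are invented names; NameBehaviour and Greeter are repeated from above so the snippet is self-contained):

```elixir
defmodule NameBehaviour do
  @callback name :: String.t()
  @optional_callbacks name: 0
end

defmodule Greeter do
  defmacro __using__(_opts) do
    quote do
      @behaviour NameBehaviour
      # Inject the default by delegating back to Greeter itself.
      defdelegate name, to: unquote(__MODULE__)
      defoverridable name: 0
    end
  end

  def name, do: "stranger"

  def greet(module), do: "Hello " <> module.name()
end

# An implementation that overrides the default name/0...
defmodule Alice do
  use Greeter
  def name, do: "Alice"
end

# ...and one that keeps the injected default.
defmodule Anon do
  use Greeter
end
```

With this, Greeter.greet(Alice) returns "Hello Alice" while Greeter.greet(Anon) falls back to "Hello stranger", with no runtime function_exported? check in greet/1.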
APPROACH 2
Without defoverridable (as docs suggest) it would be:
defmodule Greeter do
defmacro __using__(_opts) do
quote do
@behaviour NameBehaviour
end
end
def greet(module) do
if Code.ensure_loaded?(module) and function_exported?(module, :name, 0) do
"Hello " <> module.name()
else
"Hello stranger"
end
end
end
Personally, I think “Approach 1” looks much better.
Surely it is not good to have this Code.ensure_loaded? and function_exported? runtime evaluation every time I call this function?
Update: function_exported? is inlined by the compiler but I don’t know if Code.ensure_loaded? is.
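One way around the per-call cost, if it matters, is to resolve the implementation once and reuse the result, so Code.ensure_loaded?/1 and function_exported?/3 only run at resolution time. A hypothetical sketch (resolve_name_fun/1 and the module names are invented, not from the docs):

```elixir
defmodule Greeter do
  # Resolve the callback once, outside the hot path.
  def resolve_name_fun(module) do
    if Code.ensure_loaded?(module) and function_exported?(module, :name, 0) do
      # Capture the implementation's function once.
      &module.name/0
    else
      # Fall back to the default implementation.
      fn -> "stranger" end
    end
  end

  # The hot path takes the already-resolved function, no checks needed.
  def greet(name_fun) when is_function(name_fun, 0) do
    "Hello " <> name_fun.()
  end
end
```

The caller resolves once (for example at supervision-tree startup or in init/1) and passes the captured function around afterwards.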
I believe the issue is the high coherence of modules in the former case, forcing the compiler to recompile the world upon any change to the “main” Greeter module (defoverridable involves passing an AST through global compiler ETS tables.)
The latter case decreases the coherence of implementations.
I’m skeptical that this would be unique to the defoverridable way of doing things. If you have an interface like this, odds are Greeter is going to provide the behaviour at the very least, and probably some other utility as well, such as the “default” implementation that the caller will call if the overridable function isn’t defined in the implementation module.
I think both scenarios are likely to cause modules that rely on the specified behaviour to be recompiled, regardless of the use of defoverridable.
There is no need to be sceptical. Since v1.14, IIRC, there has been a lot of work on the compiler to decrease coherence.
You might check it yourself by creating a project with the following two files
❯ cat lib/test_beh.ex lib/test_impl.ex
defmodule TestBeh do
@callback ok :: :ok
end
defmodule TestImpl do
@behaviour TestBeh
@impl TestBeh
def ok, do: :ok
end
I am also skeptical/sceptical about this argument.
If there is to be a “default function” available, then presumably TestImpl (in above example) will have to do something like use TestBeh instead of simply @behaviour TestBeh.
This means the default function is defined inside a quote block as part of defmacro __using__/1.
It is considered bad practice to have a long quote block, so eventually, as we add more default functions, we have no choice but to defdelegate (or simply call a function) from another module, which results in the same “compiling the world” situation.
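To make that concrete, here is a minimal sketch of the pattern being described (Greeter.Defaults is an invented module name):

```elixir
defmodule NameBehaviour do
  @callback name :: String.t()
  @optional_callbacks name: 0
end

# Keep the quote block short by putting defaults in a plain module...
defmodule Greeter.Defaults do
  def name, do: "stranger"
end

# ...and delegating to it from __using__/1.
defmodule Greeter do
  defmacro __using__(_opts) do
    quote do
      @behaviour NameBehaviour
      defdelegate name, to: Greeter.Defaults
      defoverridable name: 0
    end
  end
end
```

Every module that calls use Greeter now picks up a compile-time dependency on the module holding the defaults, which is the “compiling the world” concern.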
@mudasobwa If you have time, could you provide a more complete example, with a default function?
I don’t have any constructive thoughts to offer other than that I’m also curious, and it’s an example of what I was talking about a few weeks ago on a similar thread: there are several instances of similar advice given in the docs without a “why.”
Yes! Not too much, I guess, but it is a request to the code server.
In the case of gen_server, the two functions checked for implementation are terminate, which is called only once for the lifetime of the process, and handle_info, which can be called many more times. The gen_server module keeps a local cache of which callbacks are implemented, but it does not seem to be for speed. I am not sure why.
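As a hedged sketch of that caching idea (Notifier and the format/1 callback are made-up names, not gen_server’s actual internals): a process-based behaviour can do the exported-function check once in init/1 and store the answer in its state, instead of re-checking on every message.

```elixir
defmodule Notifier do
  use GenServer

  def start_link(callback_module) do
    GenServer.start_link(__MODULE__, callback_module)
  end

  @impl true
  def init(callback_module) do
    # Check once, at process startup, and cache the result in state.
    has_format? =
      Code.ensure_loaded?(callback_module) and
        function_exported?(callback_module, :format, 1)

    {:ok, %{module: callback_module, has_format?: has_format?}}
  end

  @impl true
  def handle_call({:notify, msg}, _from, state) do
    # No code-server round trip here, just a boolean in state.
    text = if state.has_format?, do: state.module.format(msg), else: to_string(msg)
    {:reply, text, state}
  end
end
```

This keeps the optional-callback ergonomics while paying the lookup cost once per process rather than once per call.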
It leads to broad interfaces because you can define default functions that are “inherited” by everyone. While this is also possible with callbacks, the fact a callback is optional means you cannot rely on it as part of your API. This is part of an overall principle where you generally want to keep your interfaces narrow.
Overridable functions come from meta-programming, which is harder to debug, and an overridden implementation can call its previous definition using super, which is not something you can turn on or off and which creates additional coupling, whether intentional or not. super can also cascade, which leads to more issues during troubleshooting.
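For illustration, a minimal sketch of the coupling super creates (Describable and Loud are invented names): the override’s result silently depends on whatever the injected default happens to return.

```elixir
defmodule Describable do
  defmacro __using__(_opts) do
    quote do
      # Default implementation, injected into the caller.
      def describe, do: "a thing"
      defoverridable describe: 0
    end
  end
end

defmodule Loud do
  use Describable

  # super/0 calls the overridden default definition; changing the
  # default in Describable silently changes Loud.describe/0 too.
  def describe, do: String.upcase(super())
end
```

Here Loud.describe() returns the upcased default, so the two modules are coupled through super even though nothing in Loud names Describable’s implementation directly.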
There are situations where defoverridable is preferred, and I think @slouchpie gave a good example, where you don’t want to invoke Code.ensure_loaded? on every call. But for process-based behaviours or behaviours with clear entry points, I’d recommend optional callbacks. I will improve the documentation.
As I said at the time, saying there are several instances does not move the needle, because the Elixir team doesn’t know where they are. My biggest concern is that, without quantifying them, we may be simply repeating information which is no longer true. Especially because, if there are examples such as this discussion, then I am certain they will be addressed.
Certainly. Last time I asked if it was purposeful that they were left out and if I should bother submitting PRs when I come across them. I haven’t come across one since so this comment was more, “Hey, I’m not crazy, there are more examples!” Though it’s debatable if I’m crazy or not.
You are definitely not crazy and I am sure there were several instances in the past. I only wonder if it is still accurate, as documentation is always improving.