__using__ modules to share private functions... ordering functions?

This is a code style question (mostly?). I'm using use and the __using__ macro to share private functions between modules. The functions have the same name and arity: the idea is that they get called recursively to validate input and accumulate errors. This works well, EXCEPT for one thing that looks… smelly: because function clauses are matched in the order they are defined, the most specific clauses have to be declared first and the most general catch-alls last. What this means is that my modules have to put the use clause BELOW the specific clauses, like this:

defmodule MyThing do
  defp foo(%{very: "specific", match: "clause"} = args, acc) do
    # implementation... then call the "shared" functions
    foo(args, acc)
  end

  use FunctionsInSharedModule  # <--- down here!
end
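
For context, a minimal sketch of what a shared module along these lines might look like (the validation bodies here are purely illustrative):

defmodule FunctionsInSharedModule do
  defmacro __using__(_opts) do
    quote do
      # These clauses are injected wherever `use` appears, so they only
      # act as catch-alls if `use` sits below the specific clauses.
      defp foo(%{email: nil} = args, acc) do
        foo(Map.delete(args, :email), [:missing_email | acc])
      end

      defp foo(_args, acc), do: acc
    end
  end
end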

My question is: is this bad form? Some style guides state that use statements should appear near the top of the module declaration, and it’s easy to miss a use tossed into the bottom of the module. Am I being too sensitive? I’m trying to think of a more elegant way to structure this.

Your thoughts are welcome!

I believe you can replace the injected code with this:

defp foo(%{very: "specific", match: "clause"} = args, acc) do
  # implementation... then call the "shared" functions
  foo(args, acc)
end

defp foo(args, acc) do
  CommonValidations.foo(args, acc)
end

Then CommonValidations.foo gets to be a proper function instead of an injected one, with all the benefits of better error messaging.
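
A minimal sketch of what CommonValidations might look like under that approach (the module name comes from above, the clause bodies are illustrative):

defmodule CommonValidations do
  @moduledoc false

  # Public entry point holding the shared, most general clauses.
  def foo(%{name: nil} = args, acc) do
    foo(Map.delete(args, :name), [:missing_name | acc])
  end

  def foo(_args, acc), do: acc
end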


That assumes that CommonValidations.foo is a public function, yes?
My use case is slightly more complex (I tried to keep things simple): I’m actually use-ing multiple modules to compose the set of functions/rules that makes sense for each use case, so I have to think through how that might be affected…


Function calls are the easiest way to compose functions. If you want to indicate that a module or its functions are internal, that’s what @doc false or @moduledoc false is for. Macros, and injecting functions with macros, have a place, but generally only when something about those functions would make them difficult or problematic to call directly. Really though, this should be a last resort, and I think it is far more likely to be needed in something like a library than in your application code.

If you are finding yourself needing use because you want to call other private functions, then you need to take a serious look at your module and figure out if you are treating it like a class.

If you still want some of your common functions to call back into functions within the specific module, use a behaviour and pass the module name in to your common module, e.g.:

def foo(%{very: "specific", match: "clause"} = args, acc) do
  # specific stuff here
  foo(Map.drop(args, [:very, :match]), acc)
end

def foo(args, acc) do
  Common.foo(__MODULE__, args, acc)
end
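
A rough sketch of the Common side, assuming a behaviour with a single callback (the callback name handle_rest is made up here):

defmodule Common do
  # The specific module implements this callback; Common calls back into it.
  @callback handle_rest(args :: map(), acc :: list()) :: list()

  def foo(module, args, acc) do
    # shared validation here, then defer to the specific module
    module.handle_rest(args, acc)
  end
end

The specific module then declares @behaviour Common and defines handle_rest/2.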

While I agree with what Ben is saying (this may not be a good use of macros at all), even if it is, it is not a good use of use, which is syntactic sugar for require plus a direct macro call and is expected at the top of the file. However, if you made it a more explicit macro call, it would reduce the “smell”.

For example, replace:

use FunctionsInSharedModule

with something like:

FunctionsInSharedModule.foo_fallbacks()

and add the necessary require FunctionsInSharedModule at the top of the file.
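
A minimal sketch of that, assuming foo_fallbacks/0 is a macro in FunctionsInSharedModule that injects the same catch-all clauses __using__ would have injected:

defmodule FunctionsInSharedModule do
  defmacro foo_fallbacks do
    quote do
      # injected exactly where the macro is called
      defp foo(_args, acc), do: acc
    end
  end
end

defmodule MyThing do
  require FunctionsInSharedModule

  defp foo(%{very: "specific", match: "clause"} = _args, acc) do
    # specific handling here
    acc
  end

  FunctionsInSharedModule.foo_fallbacks()
end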


A more idiomatic way to get the intended behavior (“add these clauses to the end of the module that called use FunctionsInSharedModule”) is the @before_compile callback. For instance, it’s used to declare a catch-all render function in Phoenix.View.
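
A minimal sketch of that approach, assuming the catch-all clauses live in the __before_compile__/1 callback (names are illustrative):

defmodule FunctionsInSharedModule do
  defmacro __using__(_opts) do
    quote do
      @before_compile FunctionsInSharedModule
    end
  end

  defmacro __before_compile__(_env) do
    quote do
      # Appended at the very end of the calling module, so the
      # injected clauses always land after the specific ones.
      defp foo(_args, acc), do: acc
    end
  end
end

With this, use FunctionsInSharedModule can go back to the top of MyThing and the fallback clauses still end up last.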
