Functional equivalent of OO parent/child code sharing

This is more of a philosophical question hoping for some best-practices or practical suggestions to help write better code.

I’m working with behaviours and callbacks, which provide a sensible way to organize code into different implementations, e.g. modules that implement vendor-specific business rules.

Sometimes I want multiple implementations to refer to a shared function because it does something onerous or complicated. This makes sense from an OO perspective, where you have an abstract base class and child classes: the child classes can call methods on the parent class, so they don’t have to re-write code.

In a functional language, however, I feel like it is a potential smell when the implementation calls shared functions. I would prefer that the implementations are NOT dependent on anything else: it would be cleaner if those modules were completely isolated.

What are ways to share/reuse functions that perform complex tasks?

  • Option 1: each implementation lists a dependency
  • Option 2: each implementation returns a more complex result and then the dispatching module can interpret the results and perform the more complex operations there.
  • Others?

The whole ‘parent/child’ thing implies an inheritance hierarchy. Although you can model that in functional languages, it is usually poor form to do so; composition is what should be done instead.

Hmm, this is rather open ended; I’m not sure what “implementation” means in this context, or what “lists a dependency” means.

Unsure on implementation again, but dispatching can be done in quite a variety of ways from witnesses to protocols to far far more. It really depends on ‘what’ is being done.

It might be good to give precise examples so they can be converted into a more functional style that would be a good comparison. :slight_smile:

You can have a behaviour with a “standard” implementation, much along the same lines as how GenServer, Behaviour, and GenStage work.

As an example:
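A minimal runnable sketch of that idea (the module and function names here are illustrative, not from the thread): the behaviour ships a default implementation through its `__using__` macro, and a callback module may override it.

```elixir
defmodule OrderHandling do
  @callback do_something(term()) :: term()

  defmacro __using__(_opts) do
    quote do
      @behaviour OrderHandling

      # Default implementation; callback modules may override it.
      def do_something(arg), do: {:default, arg}

      defoverridable do_something: 1
    end
  end
end

defmodule VendorOne do
  use OrderHandling

  # Vendor-specific override of the default.
  def do_something(arg), do: {:vendor_one, arg}
end

defmodule VendorTwo do
  # Uses the default implementation as-is.
  use OrderHandling
end

IO.inspect(VendorOne.do_something(:order))  # {:vendor_one, :order}
IO.inspect(VendorTwo.do_something(:order))  # {:default, :order}
```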


Sometimes I want multiple implementations to refer to a shared function because it does something onerous or complicated.

Simply calling that function from each module that needs it would satisfy the need for re-use. Are you asking how you should organize shared functions? Because I’m also interested in how people choose to organize shared functions.

Personally I’m fine just having a Utils module or similar that encapsulates the often-reused functions. An often-reused function, to me, implies a generic function, and as such it seems fine to organize generic functions under a generic namespace.


To provide an example, consider the following:

  • An OrderHandling module that defines a @callback named do_something().
  • VendorOne module implements the do_something() callback.
  • VendorTwo module implements the do_something() callback.

Both VendorOne and VendorTwo modules do vendor-specific stuff, but they both wish to call a function named complex_operation(). So the question (and the reason for this post) is "where to put the complex_operation() function?"

Option 1: rewrite the complex_operation() function in each vendor’s module. This keeps VendorOne and VendorTwo highly modular but it violates DRY (don’t repeat yourself).

Option 2: put the complex_operation() function somewhere that both VendorOne and VendorTwo can access it, e.g. inside a Utils module. This follows DRY, but now VendorOne and VendorTwo have a “dependency” on the Utils module.
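To make Option 2 concrete, here is a hedged sketch of the shape described above (the `Utils` name comes from the post; the function bodies are dummies):

```elixir
defmodule OrderHandling do
  # The contract every vendor module implements.
  @callback do_something(term()) :: term()
end

defmodule Utils do
  # Option 2: the shared, expensive function lives in one place.
  def complex_operation(x), do: {:complex, x}
end

defmodule VendorOne do
  @behaviour OrderHandling

  @impl true
  def do_something(order), do: {:vendor_one, Utils.complex_operation(order)}
end

defmodule VendorTwo do
  @behaviour OrderHandling

  @impl true
  def do_something(order), do: {:vendor_two, Utils.complex_operation(order)}
end
```

Both vendors stay DRY, at the cost of the `Utils` dependency the post mentions.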

Now imagine that the VendorOne and VendorTwo modules were written and deployed as separate applications inside an umbrella app. If we want to develop these vendor apps separately (outside of the umbrella app), how can they share the complex_operation() code?

1 Like

You may want something like this:

defmodule Shared.OrderHandling do
  @callback do_something(foo :: term(), bar :: term()) :: term()

  defmacro __using__(opts) do
    quote location: :keep, bind_quoted: [opts: opts] do
      @behaviour Shared.OrderHandling

      def do_something(foo, bar), do: ...   # your default implementation here
      def complex_operation, do: ...        # your implementation here
      def i_can_also_override(bar), do: ... # your implementation here

      defoverridable i_can_also_override: 1, do_something: 2
    end
  end
end

Then in your AppOne application:

defmodule AppOne.VendorOne do
  use Shared.OrderHandling

  # Now you can override the implementations of i_can_also_override and do_something
  def do_something(foo, bar), do: ...
end

Same for VendorTwo.

1 Like

As written, one could argue that complex_operation() should be part of OrderHandling (you started out with subclasses using functionality from the abstract base class), and that the callback should be respecified as do_something(complex_operation_fun) so that the callback module can use the functionality provided by the behaviour module. In that case do_something simply becomes a higher-order function.
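That higher-order respecification might be sketched like this (names and dummy bodies are illustrative): the behaviour passes its own complex_operation function into the callback, so the vendor module uses the shared functionality without naming any shared module.

```elixir
defmodule OrderHandling do
  # The callback now receives the shared function as an argument.
  @callback do_something(term(), (term() -> term())) :: term()

  def complex_operation(x), do: {:complex, x}

  # The dispatcher injects complex_operation/1 into the callback module.
  def run(callback_module, order) do
    callback_module.do_something(order, &complex_operation/1)
  end
end

defmodule VendorOne do
  @behaviour OrderHandling

  @impl true
  def do_something(order, complex_fun), do: {:vendor_one, complex_fun.(order)}
end

IO.inspect(OrderHandling.run(VendorOne, :order))
# {:vendor_one, {:complex, :order}}
```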

1 Like

Here’s what I’d personally do:

I’d put complex_operation/0 in the OrderHandling module or a specialty module depending on the transformations it should do.

No point: it doesn’t do anything vendor-specific, so it should go into a single module. If you really, really want it callable from those modules then just defdelegate it in.
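For reference, a minimal sketch of that defdelegate approach (module and function names are illustrative):

```elixir
defmodule OrderHandling do
  # The shared, non-vendor-specific function lives in one module.
  def complex_operation(x), do: {:complex, x}
end

defmodule VendorOne do
  # Makes complex_operation/1 callable as a local function,
  # while the single implementation stays in OrderHandling.
  defdelegate complex_operation(x), to: OrderHandling
end

IO.inspect(VendorOne.complex_operation(:order))  # {:complex, :order}
```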

I’d probably put it on OrderHandling itself if it does something complex related to order handling, otherwise in some other, more specifically named module; Utils is way too generically named, so not that. I see nothing bad about a dependency on another module, as modules are just namespaced static functions (to use a C++/Java-ism).

Then just have the library that supplies their behaviours also supply that function. :slight_smile:
A dependency is only loaded once regardless of the amount of things depending on it, like object linkages in C++.

Eh I would not recommend that unless you really really want it implemented entirely internally, which is almost certainly poor form and harms maintainability.

1 Like

Thanks for the response. Yes: I have tended to put things like the complex_operation() in the “parent”-ish class (OrderHandling in this case) for exactly the reason you articulated: because it is NOT specific to a vendor.

I guess what has been bothering me is that the VendorOne and VendorTwo implementations have a “dependency” of sorts on the OrderHandling module because each vendor implementation will specify @behaviour OrderHandling.

It may sound crazy, but I’m wondering if I should NOT explicitly list that @behaviour. In the case where I want multiple vendor integrations to exist as separate and independent apps, I don’t want them to have to know or care about the OrderHandling module. I could totally write and test the vendor specific code without ever concerning myself with the OrderHandling module.

How crazy would it be to loosely couple the dynamic dispatching in OrderHandling to the vendor-specific modules? I.e. do that WITHOUT a behaviour or @callbacks. Sure, everything would fall apart if your VendorThree module didn’t implement specific functions, but it could be developed and tested without ever referencing the OrderHandling module. The vast majority of the time, I’d say it’s a bad idea to forgo the benefits of a contract, but if there is a pressing need to isolate implementations into their own applications, then it might make sense to skip it.
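A hedged sketch of such behaviour-less dispatch, guarding against missing implementations at runtime with function_exported?/3 (the module and function names are assumptions):

```elixir
defmodule VendorThree do
  # No @behaviour declared; it just happens to export the expected function.
  def process_order(order), do: {:vendor_three, order}
end

defmodule Dispatcher do
  # Dispatch by convention: check at runtime that the module
  # is loaded and exports process_order/1 before calling it.
  def dispatch(module, order) do
    if Code.ensure_loaded?(module) and function_exported?(module, :process_order, 1) do
      module.process_order(order)
    else
      {:error, :not_implemented}
    end
  end
end

IO.inspect(Dispatcher.dispatch(VendorThree, :order))  # {:vendor_three, :order}
```

The trade-off stated above still holds: nothing checks the contract at compile time.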

This scenario reminds me of how Go implements interfaces implicitly (not explicitly). In Go, as long as your type has the proper methods, it is considered a viable implementation of an interface. In this one use case, I can see a distinct benefit to that approach.

If you have a dependency – your vendor modules should be usable as callback modules for OrderHandling – why are you trying so hard to hide that? What do you get from your vendor modules not knowing about OrderHandling if they need to implement its callbacks to work in the first place? If that’s not the intention, then have VendorOne be a separate module altogether and only have VendorOneOrderHandling be the callback module. It could even just delegate to VendorOne where possible, and you’d have OrderHandling and VendorOne fully separated.


Hmm, then how else would you specify the API that they should follow? Otherwise I guess you can pray-and-hope that they follow identical interfaces so you can call them the same? :-/

But why wouldn’t you want to reference the OrderHandling module?

But again, why skip it depending on OrderHandling?

1 Like

Want to have duck typing in Elixir?

{module, data} = value 
new_value = module.some_function(value)

However if it is polymorphism you’re after, stick with protocols - even if you have to be explicit about implementing them.
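For comparison, a minimal protocol sketch (the protocol and struct names are illustrative): dispatch is driven by the data’s struct, and each implementation is declared explicitly.

```elixir
defprotocol OrderProcessor do
  # Polymorphic dispatch on the type of `order`.
  def process(order)
end

defmodule VendorOneOrder do
  defstruct [:id]
end

defimpl OrderProcessor, for: VendorOneOrder do
  def process(order), do: {:vendor_one, order.id}
end

IO.inspect(OrderProcessor.process(%VendorOneOrder{id: 1}))  # {:vendor_one, 1}
```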

1 Like

The main drive in my thinking here is from having observed a large dev team bloat even simple applications into monolithic monstrosities. Maintaining as much modularity and separation as possible has been one effective way to help confine the bloat: keep services/apps as small as possible. In other words: if you can develop the functionality in one app in isolation and then use that component in a larger whole, that has generally equated to a win: faster dev/testing, and easier outsourcing and onboarding of devs. That is really the primary motivation in attempting to avoid the dependency on the “parent” OrderHandling being referenced as a @behaviour in the various vendor modules. Maybe attempting to obfuscate the relationship here is the wrong remedy to the problem.

Via dynamic dispatching, I can define a config value something like this:

config :my_app, :vendor_dispatch_mapping, %{
  "vendor_one" => MyApp.VendorOne,
  "vendor_two" => MyApp.VendorTwo
}

And then in my OrderHandling module I do a dynamic dispatch like this:

def dispatch_vendor_processing(vendor_id, order_data) do
  case Map.fetch(Application.get_env(:my_app, :vendor_dispatch_mapping), vendor_id) do
    {:ok, vendor_module} -> vendor_module.process_order(order_data)
    :error -> {:error, "Unmapped vendor"}
  end
end
I’m trying to think through how protocols might work here when the OrderHandling and VendorOne modules may be in separate applications and the OrderHandling may not yet know the implementation details… I’m probably overthinking this.

That sounds all the more important to depend on OrderHandling! You need to keep the APIs the same so you know how to call them, and they can program to those APIs so they know when they work or don’t, especially with tests against the API. ^.^

1 Like

When you use inheritance to share code you still have to state the child/parent dependency. The fact is there is a dependency. I’m not sure what is to be gained or even how it could be possible to hide this fact (in OO or functional). If you simply want to avoid the function call sites in VendorOne/VendorTwo knowing about where the shared function lives then you could use

defmodule AppOne.VendorOne do
  import OrderHandling, only: [complex_function: 1]

  def do_something(foo, bar) do
    # complex_function/1 can now be called without the OrderHandling prefix
    ...
  end
end
1 Like

If you want to keep dependencies small, then extract OrderHandling into its own package and make your main application, as well as VendorOne and VendorTwo, depend on it. If you no longer need OrderHandling you can easily scrap it from all three packages. Also, your vendors don’t depend on your main application, but they can still be used together based on the OrderHandling behaviour.
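For illustration, the dependency in each app’s mix.exs might look like this (the package name and path are assumptions, not from the thread):

```elixir
# In AppOne/mix.exs, AppTwo/mix.exs, and the main app's mix.exs alike:
defp deps do
  [
    # Local path dependency while developing inside the umbrella...
    {:order_handling, path: "../order_handling"}
    # ...or a regular Hex dependency once the package is published:
    # {:order_handling, "~> 0.1"}
  ]
end
```

Mix loads each dependency once, so all three apps share the same compiled OrderHandling code.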