Commandex - Make complex actions a first-class data type

I’ve found myself implementing the command pattern in Elixir many times across various projects, often with slightly different implementations. This library standardizes the best practices I’ve found, on a foundation that can grow into a really powerful tool.

What is the command pattern? Only one of the most awesome patterns ever. Imagine wrapping all of your relevant params, data, and errors into a struct that can be easily piped. Related business logic can live together in a single module, instead of dirtying up Phoenix controllers or Ecto models. This library closely resembles Plug and Ecto changesets, but with one key difference: because everything is a module struct, you can easily implement all kinds of protocols for different flows of business logic.

Example

Getting started is easy. A command module might look like this…

defmodule RegisterUser do
  import Commandex

  command do
    param :email
    param :password

    data :password_hash
    data :user

    pipeline :hash_password
    pipeline :create_user
    pipeline :send_welcome_email
  end

  # ...pipeline functions go here
end

The command/1 macro automatically defines new/1 and run/1 functions on the module, as well as a struct with the attributes given in the block. Params are what you pass to new/1, which accepts either a keyword list or a map with string or atom keys. Data fields are values generated over the course of running a command, and can be set with put_data/3.
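
For example, both of these build the same command struct:

RegisterUser.new(email: "user@example.com", password: "hunter2")
RegisterUser.new(%{"email" => "user@example.com", "password" => "hunter2"})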

Pipeline functions take three arguments (command, params, data), and must return a command. Structuring it this way makes for super simple pattern matching:

  def hash_password(command, %{password: nil} = _params, _data) do
    command
    |> put_error(:password, :not_given)
    |> halt()
  end

  def hash_password(command, %{password: password} = _params, _data) do
    put_data(command, :password_hash, Pbkdf2.hash_pwd_salt(password))
  end

Pipeline functions are run in the order in which they are defined. And remember, like Plug, Commandex will continue running through the pipeline unless you call halt/1. This allows for intelligent error handling further down the pipe.
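
To sketch what that can look like (assuming the error field starts as an empty map), a step like send_welcome_email could match on errors recorded by earlier steps:

def send_welcome_email(%{error: error} = command, _params, _data)
    when map_size(error) > 0 do
  # An earlier step recorded an error; skip the email and stop the pipeline.
  halt(command)
end

def send_welcome_email(command, _params, %{user: user}) do
  # No errors so far; safe to send.
  Mailer.deliver_welcome(user) # hypothetical mailer call
  command
end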

Calling a function outside of the module? That’s easy. The following three definitions are equivalent…

pipeline :hash_password
pipeline {RegisterUser, :hash_password}
pipeline &RegisterUser.hash_password/3

If a command is fully run without calling halt/1, it will have success: true marked on the struct. Usage might look like this:

%{email: "example@example.com", password: "asdf1234"}
|> RegisterUser.new()
|> RegisterUser.run()
|> case do
  %{success: true, data: %{user: user}} ->
    # Success! We've got a user now

  %{success: false, error: %{password: :not_given}} ->
    # Respond with a 400 or something

  %{success: false, error: _error} ->
    # I'm a lazy programmer that writes catch-all error handling
end

Future Plans

There are many different directions this project could take, but two that I have in mind are automatic validation/casting and saga rollbacks.

Because of the way attributes are defined, adding types would be easy:

command do
  param :email, :string
  data :user, User
end

This would allow intelligent casting of Phoenix params, as well as raising errors if put_data/3 received something you did not expect.

Sagas might be a bit more difficult, and while I might not strive for something as complex and powerful as Sage, rollbacks could be as easy as:

pipeline :create_user, rollback: :delete_user
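
Here, delete_user might be an ordinary three-argument function that undoes the step’s work. Purely illustrative, since none of this exists yet:

def delete_user(command, _params, %{user: user}) do
  # Hypothetical rollback: would run only if a later pipeline step fails.
  Repo.delete!(user)
  command
end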

Feedback

What kind of API would you like to see? Is the command macro straightforward? I’m open to ideas. I’ve begun converting many of my old custom command implementations to commandex, and it works really well for my use cases.

6 Likes

A few notes / questions, mostly around naming or convenience:

  • If I end up using such a library I’d appreciate it if I had a convenience function that does both new() and run(). Something like create_and_run(), maybe? Or make_and_run()? Or simply start?
  • Not sure how intuitive the naming of params and data is. I’d be a bit more formal and christen them input and output.
  • Ditto for pipeline. I get that you probably want Plug users to feel at home, but IMO something like action is clearer.
  • In my eyes {:ok, %{user: user, password_hash: "123"}} and {:error, error_object_or_message} are more idiomatic Elixir. (Note that Ecto.Multi partially follows your pattern but still uses tuples.)

To recap, this is how I’d find the whole thing more readable:

defmodule RegisterUser do
  import Commandex

  command do
    input :email
    input :password

    output :password_hash
    output :user

    action :hash_password
    action :create_user
    action :send_welcome_email
  end

  # ...action functions go here
end

Don’t take my criticism seriously if you like your naming. I am only giving you a pure anecdote on what I’d find more intuitive / readable.


Lastly, in the spirit of friendly competition, you can take a look at Opus (they seem to like your pipeline naming :wink: ).

3 Likes

Thanks for the feedback!

I thought ahead before release, and new/1 isn’t actually necessary (though it’s more straightforward for teaching others about the package). If you give params to run/1, it will automatically invoke new/1 for you. Down the road, new/1 would likely be where parameter typing is enforced, so there would be many cases where you’d want new/1 without immediately invoking run/1.
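
In other words, these two are equivalent:

RegisterUser.run(%{email: "user@example.com", password: "hunter2"})

%{email: "user@example.com", password: "hunter2"}
|> RegisterUser.new()
|> RegisterUser.run()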

Regarding naming, I’ll likely stick with the current ones, but I am open to changing it if there’s enough pushback.

You touched on an important issue that I didn’t mention, but perhaps it’s time to discuss. :sweat_smile: I’m starting to consider tagged tuples an antipattern for complex data types. Why? Maps and structs can easily annotate their own success/errors, and the tagged-tuple convention doesn’t play nicely with protocols and pipes. It’s partly why I built this library: to enforce a consistent success/error convention that can be pattern matched on. Don’t get me wrong, I’m still a fan of them for primitives ({:ok, pid} and the like).

It could easily be a topic for another thread. I do like some of the things I see in Opus, but it’s unlikely this package would ever return :ok/:error tuples in its final result.

Why not go a step further and introduce your own Result struct? It would give you even more type safety and make dialyzer happier.

Heh, I’ve thought about it, but it breaks protocols in the way I envision them being used.

Commandex currently lets you write a theoretical Phoenix JSON API like this:

def register_user(conn, params) do
  params
  |> RegisterUser.run()
  |> Respondable.to_json(conn)
end

def login_user(conn, params) do
  params
  |> LoginUser.run()
  |> Respondable.to_json(conn)
end

Where the protocol implementation might look something like:

defimpl YourApp.Respondable, for: YourApp.RegisterUser do
  import Phoenix.Controller
  import Plug.Conn, only: [put_status: 2]

  def to_json(%{success: true, data: %{user: user}}, conn) do
    json(conn, %{token: Auth.sign_token(user), user: user})
  end

  def to_json(%{success: false, error: %{user: :already_exists}}, conn) do
    conn
    |> put_status(:bad_request)
    |> json(%{error: "Email is already taken."})
  end
end

defimpl YourApp.Respondable, for: YourApp.LoginUser do
  import Phoenix.Controller
  import Plug.Conn, only: [put_status: 2]

  def to_json(%{success: true, data: %{user: user}}, conn) do
    json(conn, %{token: Auth.sign_token(user), user: user})
  end

  def to_json(%{success: false, error: %{login: :unauthorized}}, conn) do
    conn
    |> put_status(:unauthorized)
    |> json(%{error: "Invalid password given."})
  end
end

The trick is having a consistent naming structure while maintaining the flexibility of different kinds of business logic in protocols. Both of those protocol implementations could call out to a shared Response.successful_auth_response(conn, user) function. But they certainly don’t have to, and more importantly, you aren’t sticking all of this logic in the controller.
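
For illustration, that shared helper might look like this (YourApp.Response is just a hypothetical module name; Auth.sign_token is from the examples above):

defmodule YourApp.Response do
  import Phoenix.Controller

  # A single happy-path response that several command protocols can share.
  def successful_auth_response(conn, user) do
    json(conn, %{token: Auth.sign_token(user), user: user})
  end
end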

This looks elegant but somewhat generic, and it doesn’t tell the reader much. I’d stay away from protocols; I never did like the idea of their runtime overhead (even in prod builds). A plain old behaviour should be fine, unless you want to go as far as possible in terms of dialyzer checks and runtime guarantees? (But I think, and might be wrong, that protocols don’t give you anything extra there.)

The name Respondable will net you a blank stare from me. I’d much prefer more granular names, like Commands.Users.respond_json for example; that might not be the best name either, but at least you know it deals with commands working with users (and that it returns JSON). :slight_smile:
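
To make that concrete, here’s a sketch of the kind of plain function I have in mind (module and function names hypothetical):

defmodule Commands.Users do
  import Phoenix.Controller
  import Plug.Conn, only: [put_status: 2]

  # Same result as the protocol version, but the call site names
  # exactly which module and function handle the response.
  def respond_json(%RegisterUser{success: true, data: %{user: user}}, conn) do
    json(conn, %{token: Auth.sign_token(user), user: user})
  end

  def respond_json(%RegisterUser{success: false}, conn) do
    conn
    |> put_status(:bad_request)
    |> json(%{error: "Unable to register."})
  end
end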

If I’m doing a code review of a PR containing code like yours, I’ll immediately return it with “I would like this code to tell me what it does without me having to dig into what Respondable.to_json does and where the code for this particular invocation lives”.

I agree that controllers should be lean. Contexts, domain business logic objects, or whatever anyone wants to call them, are the much better way to go. What’s more, none of them should deal with Ecto directly either, but that’s another thread.

I think we just have a difference in taste. Behaviours are wonderful, and I use them often, but I may not necessarily want to import and work with Phoenix-specific functions in the same command module.

I’ve always loved protocols because the implementation blocks are concise and to the point, and can be stuck in a location that makes sense for your project, whether that’s:

  1. At the bottom of the controller file
  2. In the defprotocol's file
  3. At the bottom of your command file

I come from a strong iOS background, where protocols rule the day, which is why I tend to favor them. Even so, it can definitely be taken too far. POOP is a very adequate name for it.

Also yes, Respondable is a terrible name. :sweat_smile: You could very easily rename it to CommandResponse.respond_json, UserResponse.respond_json, or anything else you find more understandable.

The very fact that there are different, well-reasoned opinions on which direction to take things has kept me skeptical about whether building a library is worthwhile, vs. educational materials on the pattern itself.

3 Likes

Fair point. It’s worth keeping in mind that you don’t have to use protocols with this package, but they’re now an option, whereas any other return structure wouldn’t support them.

There’s value in a library, but there are always tradeoffs involved. I’m probably not thinking creatively enough, but I’m not seeing a lot of value in protocols for this, so the value of keeping that option open is lost on me.

I’ve been building out a custom command pattern in my primary work system. I have it set up to return {:ok, command_struct} or {:error, exception_struct}, and I’m liking the affordances that offers.

Perhaps there’s some middle ground to be had if the library offered both bang and non-bang new and run functions. Protocols would only work with the bang versions, but for something more bespoke, you could give that up and use the non-bang versions. OTOH, that might tip the library into the realm of too complex ¯\_(ツ)_/¯
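
Purely to sketch the shape (hypothetical signatures):

RegisterUser.run!(params)  # => %RegisterUser{}, pattern-matchable and protocol-friendly
RegisterUser.run(params)   # => {:ok, %RegisterUser{}} | {:error, exception}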

2 Likes

New updates out for v0.3.0!

Parameter Defaults

You can now set param defaults like so:

command do
  param :limit, default: 20
  param :offset, default: 0
end
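
Assuming a hypothetical ListUsers command defined with the params above, anything omitted picks up its default:

command = ListUsers.new(%{})
command.params
# => %{limit: 20, offset: 0}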

new/0

The params argument to new/1 is now optional, giving you new/0!

Pipelines support 1-arity functions

You can now specify a 1-arity function that takes only the command as an argument. Useful for quickly inspecting the command, or for passing it to other similar protocols.

command do
  pipeline &IO.inspect/1
end
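
Presumably any capture that takes the command and returns it slots in the same way, e.g. a hypothetical logging step:

command do
  pipeline :create_user
  # Hypothetical: any 1-arity capture that returns the command works.
  pipeline &MyApp.Log.command/1
end
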
1 Like