JSON REST with variable input structure in Phoenix?

Hello!

I’m facing an issue that I don’t even know what to search on :slight_smile:
I want to integrate my Phoenix app with an external app that offers webhook integration, so it can send various messages to my endpoint. Now, while those messages follow a general schema, like {"event": ..., "timestamp": ..., "payload": {...}}, the payload format depends on the event type.
I can imagine that going low-level, intercepting the full request body, copying it and analyzing it manually is an option. My question is: can I do something more generic via Phoenix? What would be the pathway?

1 Like

I would not say there’s a recommended way and/or special Phoenix mechanism; controller actions are just plain old Elixir functions.

So just deconstruct via normal pattern-matching to your heart’s content.

One thing you might consider, though, is a JSON Schema library: if you know the possible data shapes for a fact, you can construct a schema that is valid only for them and enforce it.

But I would advise against that in your current initial experimentation stage. IMO just record everything like you already figured and then apply common sense. You'll do fine.
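For later, when the shapes are known, the JSON Schema idea could look something like this. This is a hedged sketch assuming the ex_json_schema library; the schema contents and the handle_event/reject_event functions are made up for illustration:

```elixir
# Sketch: validate an incoming webhook body against a JSON Schema
# using the ex_json_schema library (add {:ex_json_schema, "~> 0.10"} to deps).
schema =
  %{
    "type" => "object",
    "required" => ["event", "timestamp", "payload"],
    "properties" => %{
      "event" => %{"type" => "string"},
      "timestamp" => %{"type" => "string"},
      "payload" => %{"type" => "object"}
    }
  }
  |> ExJsonSchema.Schema.resolve()

case ExJsonSchema.Validator.validate(schema, params) do
  :ok ->
    # params has one of the expected shapes
    handle_event(params)

  {:error, errors} ->
    # errors describes exactly which fields were wrong
    reject_event(errors)
end
```

The nice part is that the {:error, errors} branch tells you which fields failed, which plain pattern matching cannot do.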

First off it sounds like you need a post action in your router.

scope "/webhook", MyApp do
  pipe_through :api

  post "/svcname", WebhookController, :hook
end

Then in your controller handle the action. The body is already decoded in params so just pattern match on the shapes that you recognise to process it.

  def hook(conn, %{"event" => "type1", "timestamp" => timestamp, "payload" => payload}) do
    # process timestamp and payload, then respond
    send_resp(conn, 200, "")
  end

  def hook(conn, %{"event" => "type2", "timestamp" => timestamp, "payload" => payload}) do
    # ...
  end

  # etc...

typed on my phone so code may not compile.

1 Like

This will crash the request process with a FunctionClauseError when no clause matches (which Phoenix turns into a 500). For a REST API, a 400 Bad Request might be a better design, which you could do with a catch-all clause, but you wouldn’t be able to provide feedback about which data was bad. Complex matches in the function head are also an anti-pattern :smiling_imp:

A project I’m involved with experimented with using Ecto embedded schemas (with no backing table) and Ecto.Changeset.apply_action/2 for this, but it’s quite a lot of boilerplate and feels like the wrong tool for the job. It’s also awkward for expressing things like “field must be present even if nil/empty”.
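For completeness, the embedded-schema approach mentioned above looks roughly like this. A hedged sketch only; the module name and fields are illustrative, and Ecto is assumed as a dependency:

```elixir
# Sketch: validating a webhook body with an Ecto embedded schema
# and Ecto.Changeset.apply_action/2 (no database involved).
defmodule MyApp.WebhookEvent do
  use Ecto.Schema
  import Ecto.Changeset

  @primary_key false
  embedded_schema do
    field :event, :string
    field :timestamp, :utc_datetime
    field :payload, :map
  end

  # Returns {:ok, %MyApp.WebhookEvent{}} or {:error, changeset}
  def parse(params) do
    %__MODULE__{}
    |> cast(params, [:event, :timestamp, :payload])
    |> validate_required([:event, :timestamp, :payload])
    |> apply_action(:insert)
  end
end
```

The {:error, changeset} branch carries per-field errors you can render back to the caller, which is the upside over a bare pattern match; the downside is the boilerplate noted above.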

1 Like

It is pretty trivial to have a catch-all function clause for things you don’t recognise and return an HTTP 400 error; that is left as an exercise for the reader, I just outlined a general approach. There was no mention of full sanitisation checking of JSON payloads. If that is needed then a JSON Schema validator can be used.
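For the record, the catch-all could be a final clause like the following sketch (the error message is made up, and this assumes the usual Phoenix controller imports):

```elixir
# Final clause: anything that didn't match a known shape gets a 400.
def hook(conn, _params) do
  conn
  |> put_status(:bad_request)
  |> json(%{error: "unrecognised webhook payload"})
end
```

Clause order matters here: this must come after the specific clauses, or it will shadow them.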

I STRONGLY disagree that function-head matching in this instance is an anti-pattern. It is not complex, and it matches the expectations provided in the original question.

Again I must say there were long threads about code smells being interpreted as holy scripture and lorded by dogmatic types, I hope the community doesn’t go down this path as I support guidance but not dogma.

PATTERN MATCHING ON EXPECTATIONS IS GOOD

3 Likes

Let’s not start with this again, please. Every individual or team decides this for themselves. I am a big fan of pattern matching in function heads, though I’ve seen some monstrosities – spanning 55 lines if memory serves – that even I wouldn’t write. At the end of the day the complexity has to go somewhere, and this technique is one of several. There’s no right or wrong here. Doing 10+ Map.has_key?/2 or Map.fetch/2 calls isn’t very sightly or easy to visually parse either.

Yep, and that’s still one of my top 5 reasons to still stick with Elixir.

And I’ve heard people arguing “but what if the pattern does not match and we get an error?” Well, you have match?/2 to avoid a runtime error, and you can also have catch-all clauses, so I’m really not sure what the fuss is about.
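To make that concrete, here is a tiny plain-Elixir illustration of match?/2 (the message map is made up):

```elixir
# match?/2 returns a boolean instead of raising when the pattern fails.
msg = %{"event" => "ping", "timestamp" => "2024-01-01T00:00:00Z", "payload" => %{}}

match?(%{"event" => _, "payload" => %{}}, msg)
#=> true

match?(%{"event" => _}, %{"unexpected" => "shape"})
#=> false
```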

1 Like