32) ElixirConf 2017: Plugging the Security Holes in Your Phoenix Application

After having watched the talk, I’m wondering if this would also be a good opportunity to gather examples / tips on how to prevent or mitigate the mentioned issues. For the session issue, for example, I’d expect there are already examples out there. I’m especially curious about the last topic, mass assignment: how do you handle changesets which are allowed to change things like admin flags, so that at best they’re not accidentally usable from any frontend forms?


I’ll comment with some of my thoughts a bit later when I have some more time. Until then, I wanted to say thanks for posting, and I’m happy to answer questions about the content if anyone has any!


I’ll start by adding my captain-obvious solution to prevent accidental changes to admin flag fields: don’t handle them through the params sent to changeset/2, but only allow them to be changed by custom functions.

def make_admin(%User{} = user) do
  user
  |> Ecto.Changeset.change(%{is_admin: true})
  |> Repo.update()
end

def put_admin_flag(%Ecto.Changeset{} = changeset) do
  Ecto.Changeset.change(changeset, %{is_admin: true})
end

This way, in each place where setting that flag should indeed be possible, it has to be done explicitly, independent of any changeset/2 functions/params.


The nice thing about most of the vulnerabilities I talked about is that Phoenix does a good job of pushing you in the right direction, so there often aren’t too many ways to make mistakes. This means that 1) Once you are aware of the issues, they become much easier to avoid, and 2) A lot of these issues are easy to detect in an automated fashion, so a tool like Sobelow can simply/reliably help you avoid them.

That leaves “logical” issues like session/auth handling and “mass assignment” type vulns. These tend to be less Phoenix-specific, and are common across applications. Mass Assignment vulnerabilities are a pretty classic Rails issue, and have been fairly highly publicized, but the Rails community still has issues with it. It’s a pretty hard thing to mitigate.

I think part of the problem people have w/ the changeset functionality is that, because the changeset function is generated for you (and because it’s called “changeset”), it feels very “official” and safe.

^^ This is definitely a solid way to start dealing with sensitive fields! Another is to have a bunch of unique, very explicit changesets. E.g. registration_changeset, admin_changeset.

Also, when possible, explicitly passing params is a good move. For example, something like:

User.registration_changeset(%{email: email, username: username})

People tend to like this recommendation less b/c it can cause a lot of duplication. But I like it, because even if something unfortunate changes w/in the changeset function, you’re still safe :stuck_out_tongue:


One person’s duplication is another’s de-coupling. :slight_smile:


I am trying out this approach to the “mass assignment” issue. I haven’t merged it in to my project yet so this is a good time for feedback :slight_smile:

I have a plug called WhitelistParams that takes the name of the parameter to be filtered, and a list of acceptable fields for that param. So my UserController has this at the top:

plug WhitelistParams, %{param: "user", list: User.public_fields} when action in [:update]

User exposes its public_fields through a function which returns the @public_fields module attribute. The @allowed_fields attribute is a concatenation of the @public_fields and @private_fields lists. I can certainly see an argument that the public_fields list is not the responsibility of the model, but it seemed convenient, since the fields must be listed in the model anyway for casting changesets.
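To make the attribute wiring concrete, here’s a minimal, self-contained sketch of how a schema module might expose those lists. The module name and all field names are invented for illustration, not taken from the post above:

```elixir
defmodule MyApp.User do
  # Fields safe to accept from any frontend form (illustrative names).
  @public_fields [:email, :username]
  # Fields that must only be set through dedicated code paths.
  @private_fields [:is_admin, :password_reset_code]
  # Everything the schema is allowed to cast.
  @allowed_fields @public_fields ++ @private_fields

  def public_fields, do: @public_fields
  def allowed_fields, do: @allowed_fields
end
```

One wrinkle worth checking: raw Phoenix params have string keys, while cast/3 takes atoms, so the plug may need a string version of the list (e.g. `Enum.map(MyApp.User.public_fields(), &Atom.to_string/1)`).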



Good idea, do you have a hex package up of it, or want to PR it to Plug? :slight_smile:

I don’t yet. Assuming our team merges it, I’ll see how it goes here and apply it in some other cases. If it is easy to work with then I will put something together. I think there will be some edge cases in terms of handling parameters of different types - right now I’ve only tried it on a single map, such as:

  "user": {
    "field1": "value1",
    "field2": "value2"

That’s exactly what changeset does, just as a plug rather than part of ecto, right?


I think in the case of the User I have some fields that should not be available to the update route, but do need to make it into the changeset from other places. For example, User has a password_reset_code field that is set by a changeset, but I don’t want that field to be editable by passing it into update. Filtering that out in the controller seemed like the best solution.

I think it is working the same way as changesets. It takes only the whitelisted fields from the map and replaces the existing param with those.

def call(%{params: params} = conn, %{param: param_name, list: list}) do
  filtered_params = Map.take(params[param_name], list)
  merged_params = Map.put(params, param_name, filtered_params)
  %{conn | params: merged_params}
end
Nothing too fancy.
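For anyone who wants to poke at the filtering step without a full Phoenix app, the core of it can be reproduced as a plain function (the params and field names here are invented):

```elixir
# Framework-free version of the whitelisting step: keep only the allowed
# keys inside the named sub-map, leaving all other params untouched.
whitelist = fn params, param_name, allowed ->
  filtered = Map.take(params[param_name], allowed)
  Map.put(params, param_name, filtered)
end

params = %{
  "user" => %{"email" => "a@example.com", "is_admin" => "true"},
  "page" => "1"
}

whitelist.(params, "user", ["email"])
# => %{"page" => "1", "user" => %{"email" => "a@example.com"}}
```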

You would define multiple changeset functions in that case. In general, in ecto, almost no configuration or validation is “global”. There is nothing special in the function called changeset/2.

I’m sorry, but I really don’t see a problem this is trying to solve. It’s, of course, different if you don’t use ecto, but if you do, I can’t see an advantage this offers.


I’m sorry if I came across as hostile (and reading my answers, I could see this being the case). This was not my intention; I’m just trying to understand if there is some gap in how ecto works and if so, how we could fix it.


No worries - I appreciate the input. This is just something I’m trying out right now, so I don’t even know if it works well.

So the multiple changeset option in this example would be something like:

def update_changeset(model, params) do
  model
  |> cast(params, @public_fields)
  |> validate_common()
end

def reset_password_changeset(model, params) do
  model
  |> cast(params, [:password_reset_code, :password_reset_sent_at])
  |> validate_common()
end
If that is the case I would probably prefer that. An advantage of the plug approach would be to avoid the extensive refactoring I probably have to do to get to the point where I can take advantage of these custom changesets.


I think the multiple changeset option is ideal. Though, I do get the impulse for a Plug! Having the whitelisted parameters declared right in the controller can make it easier to reason about what data you’re actually passing around, similar to Rails’ strong params. That’s why I like the explicit parameter passing approach (combined with multiple changesets), though it can get a bit unwieldy with lots of values.


I think the multiple changeset option is ideal.

+1, I also like this approach. To avoid duplication I often have a base_changeset for stuff that every generic change can do that is used (in case of User schema) by admin_changeset, form_registration_changeset, edit_self_changeset etc.

Works pretty well for me, never had any problems.
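A rough sketch of that layering, assuming an Ecto schema. All field names, validations, and the specific changeset bodies here are invented for illustration, not taken from the post above:

```elixir
import Ecto.Changeset

# Shared casts/validations that every kind of change goes through.
def base_changeset(user, params) do
  user
  |> cast(params, [:email, :username])
  |> validate_required([:email])
end

# Only admin-facing code calls this; it is the sole path to :is_admin.
def admin_changeset(user, params) do
  user
  |> base_changeset(params)
  |> cast(params, [:is_admin])
end

# Registration form: base fields plus password handling.
def form_registration_changeset(user, params) do
  user
  |> base_changeset(params)
  |> cast(params, [:password])
  |> validate_length(:password, min: 8)
end
```

The key property is that the frontend-facing changesets never list the sensitive fields, so no shape of incoming params can touch them.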


I keep coming back to the session cookie issue. Wouldn’t it be quite simple to just generate a random key for each user on creation and put it into the session alongside the user id? Then, in the authentication plug, simply search the db for the user by the key and the id. In case a cookie is compromised, one could update that user’s key without affecting other users.

That is what I’ve been doing for a long long time.

I think the ideal solution is pretty close to that, which is to generate a unique, secure-random session_id for the user. Then, on every login/logout you update the value. The auth plug would function like you say, where it will fetch the user by the secure token (basically how people do it now, except with a session_id instead of user_id).

In the event of a compromised cookie, all a user needs to do is log out!
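Generating such a token is a one-liner with OTP’s :crypto module; a minimal sketch (the 32-byte length and URL-safe Base64 encoding are choices, not requirements):

```elixir
# Generate a fresh, unguessable session token. Store it on the user row
# and in the session cookie; rotating it invalidates all existing cookies
# that still carry the old value.
new_session_id =
  :crypto.strong_rand_bytes(32)
  |> Base.url_encode64(padding: false)
```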

Probably only on logout, though, because otherwise logging in from a new device would invalidate previous sessions as well.
