Transforming data and Elixir syntax style

While writing Elixir, you are encouraged to start your pipe chain with a raw value. Credo has a check for this at https://github.com/rrrene/credo/blob/master/lib/credo/check/refactor/pipe_chain_start.ex

However, I think blindly applying this rule decreases the readability of the code. Here are a few examples:

Instances where this rule makes sense

# 1
list
|> Enum.take(5)
|> Enum.shuffle()
|> pick_winner()

# 2
%User{}
|> User.changeset(%{"email" => "danny@m.com"})
|> Repo.insert()

Instances where this rule does not make sense

# 3
User
|> Repo.all()
|> Enum.map(fn user -> csv_fields(user) end)
|> CSV.encode(headers: true)


# 4
20
|> :crypto.strong_rand_bytes()
|> Base.encode16(case: :lower)

I think the question boils down to whether you are transforming something. In the case of 20 |> :crypto.strong_rand_bytes, you are not really transforming 20 into bytes; you are getting 20 random bytes. Just because a value is an argument to a function doesn’t mean it is being transformed. What are your thoughts on this?
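For contrast, here is the form of example #4 that the check would flag (assuming I'm reading the check's behavior correctly, since it wants chains to start with a raw value), even though many people find it the more natural reading:

```elixir
# The chain starts with a function call rather than a raw value, so
# Credo's pipe_chain_start check would warn here - but the intent
# ("get 20 random bytes, then hex-encode them") reads directly.
token =
  :crypto.strong_rand_bytes(20)
  |> Base.encode16(case: :lower)

# 20 random bytes hex-encode to 40 lowercase characters.
byte_size(token)
```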

After writing this, I looked to see whether there was a similar discussion elsewhere and found https://github.com/rrrene/credo/issues/431#issuecomment-348917013

2 Likes

I don’t think enforcing that style is sensible; I just follow the rule loosely.
The following example is what I dislike most about it. Strictly following the rule would mean splitting up arguments that really belong together, and there isn’t even a hierarchy among those arguments.

SpeedDate.new(user1, user2)
|> SpeedDate.start()
|> SpeedDate.triggerHints()
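To make the contrast concrete, here is a sketch of what strict compliance would look like. The stub module below is a throwaway stand-in I wrote so the snippet compiles; the real SpeedDate module from the quoted example is not shown in the thread.

```elixir
# Throwaway stub standing in for the real (unshown) SpeedDate module.
defmodule SpeedDate do
  defstruct [:users, started: false, hints: false]
  def new(user1, user2), do: %SpeedDate{users: {user1, user2}}
  def start(date), do: %{date | started: true}
  def triggerHints(date), do: %{date | hints: true}
end

# Strictly rule-compliant version: user1 is arbitrarily promoted to the
# chain's starting value even though it is no more important than user2.
date =
  "alice"
  |> SpeedDate.new("bob")
  |> SpeedDate.start()
  |> SpeedDate.triggerHints()
```

The compliant version forces a hierarchy onto two arguments that have equal standing, which is exactly the objection above.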
4 Likes

I consider this issue a “perfect is the enemy of good” case: the rule clearly has lots of exceptions, but it never makes the code truly unreadable. For me, the idea is to make the code somewhat cleaner and more consistent, and to avoid having to think about each particular case.

Because of this, I just let Credo warn me and blindly comply with the rule, which lets me concentrate on other things :slight_smile:

1 Like

Doesn’t it hamper readability? I find it a bit jarring when I see something like 20 |> :crypto.strong_rand_bytes. It may be because of the way I read the pipe operator: I always read it as “transform 20 using this function”.

It does, and your examples are valid. My point is that it’s not always confusing, and never in a big way; for me, the benefit of having a consistent way to do things outweighs this cost.

How I read it is “pass (the result of) 20 to this function”, which might be a bit easier on the brain :slight_smile:

1 Like

I agree with you and I have disabled that check in Credo because I don’t care for the resulting pipelines in many cases.

2 Likes

Depends on your perspective.

I initially didn’t like the style at all and always wanted to start the pipeline with a fully populated function. The way I made my peace with it was to accept that the pipeline is in effect emulating (lambda calculus style) function composition, something that only works when all the functions involved only have a single parameter.

|> Enum.take(5)

is suggestive of a partial application to bring Enum.take/2 down to an arity of 1. In the case of

|> :crypto.strong_rand_bytes

:crypto.strong_rand_bytes is already at an arity of 1, so it is ready to be composed as is.

This in effect places more emphasis on the composition than the value that enters it.

2 Likes

I agree; I generally disable this check in Credo.

4 Likes