It would work for all code blocks, hence the ... in the original description (and all macro `do-end` blocks, as long as they compile to one of Elixir’s code blocks too), but it won’t work across functions (anonymous or otherwise) or modules.
All good!
With the understanding that this is adding “local accumulators” instead of general mutability, I’m for this! I think I’m not alone in wishing I didn’t have to create ad hoc tuples in order to accumulate multiple values.
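For context, the “ad hoc tuples” pattern being referred to looks something like this (a minimal sketch; the data and variable names are my own):

```elixir
# Accumulating two values in one pass means threading a tuple
# through Enum.reduce/3 and destructuring it on every step.
{evens, total} =
  Enum.reduce([1, 2, 3, 4], {0, 0}, fn x, {evens, total} ->
    evens = if rem(x, 2) == 0, do: evens + 1, else: evens
    {evens, total + x}
  end)

IO.inspect({evens, total})
# => {2, 10}
```

Each extra accumulator widens the tuple and every clause that touches it, which is exactly the ceremony the proposal wants to remove.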
All I want to say is that I think a different syntax is crucial. I’m going to try to circumvent the moratorium on syntax by saying that I don’t care what it is, so long as it’s different. If `x = x + 1` can change the value of `x` even outside its current scope, I’d like to know just by looking at it!
As others have said, F# has this. And they have a special syntax for reassigning mutable variables:

```fsharp
let mutable x = 1
x <- x + 1
```
If Elixir gains this power, I think you should have to opt into it with syntax. Otherwise, accidental mutation accumulation will happen.
Cool, that’s good then. Thank you for reminding me, and sorry about not addressing it in my comment.
In this case I believe you should aim for specialized syntax for this particular scenario and not introduce something that can and will be misinterpreted as general mutability (albeit with a limited scope). People come to Elixir already expecting stuff they are used to, so let’s not exacerbate this problem.
I like @bit4bit’s idea the most: let’s have a `for` option that provides special treatment of the variables listed in it (including nested `for` blocks). Whether it’s called `state` or `accumulators` or `mutable` or `iterables` (or `iterators`) remains to be discussed, but IMO you should stick to this being specialized syntactic sugar that makes `for` more intuitive.
EDIT: Though if you plan on this being used not only in `for` constructs, then yes, your original idea seems the most reasonable, with some syntax amendments to avoid ambiguity.
The concept is interesting. It is cool to get ideas from experimental languages. However, I’m sceptical about adding this to Elixir now, since it won’t bring performance benefits. I already see a new credo rule for blocking local mutability, because this is a readability-only issue. I almost didn’t work with mutable languages during the past 5 years, so I don’t even think in terms of “I need mutability here, since it is more readable”. When I have an opportunity to write some JS code, I write it in a functional style even if it takes more lines (usually it doesn’t); this is easier for me. Let’s think about the opposite: what will the code look like if mutability is overused? Will we have a new anti-pattern page in the docs?
I may be wrong, but introducing mutability opens another topic - arrays.
Thanks everyone for your inputs. I will close this proposal soon and resubmit it without mentioning mutability. There are two aspects to decide:

- What should we call them? Local state? Local accumulators?
- We definitely need explicit syntax to declare them, such as the `mut` “keyword”; do we also need explicit syntax to reassign them? I am personally undecided. F# does use a different syntax, as @billylanchantin mentioned, but in F# it is literally mutable and it crosses the local scope, which is why you need a different syntax. One of the reasons not to have a different syntax is: what if you write `{foo, bar} = some_fun()` where `foo` is mutable (as in this proposal) but `bar` is not? A separate operator would not capture this nuance. A sigil such as `$foo` is an alternative, but it has a much bigger footprint on the language. Anyway, we don’t need to decide this now, but it is food for thought.
I’d say “only” here is a red herring; readability is extremely important, and it should be enough reason.
- What should we call them? Local state? Local accumulators?
Since this is a different scoping rule for variables (we can already re-assign them, just not in a nested scope), how about something with “scope” like “local scoped variables”?
- We definitely need explicit syntax to declare them, do we also need explicit syntax to reassign them?
I wondered the same, especially from the point of view of metaprogramming or static analysis: would having a separate operator help distinguish these variables at the AST level, or shouldn’t this matter?
I rewrote your earlier example using `$foo`, just to see what it looked like, and a few things caught my eye:
```elixir
let $section_counter = 0
let $lesson_counter = 0

for section <- sections do
  if section["reset_lesson_position"] do
    $lesson_counter = 0
  end

  $section_counter = $section_counter + 1

  lessons =
    for lesson <- section["lessons"] do
      $lesson_counter = $lesson_counter + 1
      Map.put(lesson, "position", $lesson_counter)
    end

  section
  |> Map.put("lessons", lessons)
  |> Map.put("position", $section_counter)
end
```
One very noticeable change is that it’s immediately more apparent where in the code the “local accumulators” are being accessed and modified, and I think this is sort of a plus. They operate differently from normal values but they are necessarily intermingled with regular values, and you can tell at a glance you’re dealing with one of these “special” variables without having to go trace up to where it was defined.
The other bit that became clear, though, is that I think we still need to keep the `let` (or similar) up-front binding to establish where the scope for it lives.
I don’t have a strong opinion about `$` vs. some other signifier. `$` does seem somewhat fitting, though.
The reduce form of comprehensions allows carrying the necessary state; is it just a matter of education on the idioms?
```elixir
{sections, _, _} =
  for section <- sections, reduce: {[], 1, 1} do
    {acc, section_position, lesson_position} ->
      lesson_position = if section["reset_lesson_position"], do: 1, else: lesson_position

      {lessons, lesson_position} =
        for lesson <- section["lessons"], reduce: {[], lesson_position} do
          {acc, lesson_position} ->
            lesson = Map.put(lesson, "position", lesson_position)
            {[lesson | acc], lesson_position + 1}
        end

      section = Map.put(section, "position", section_position)
      {[%{section | "lessons" => Enum.reverse(lessons)} | acc], section_position + 1, lesson_position}
  end

sections = Enum.reverse(sections)
```
This has been covered the first time such a proposal was discussed. It’s not a skill issue, it’s a readability issue. I personally don’t mind `Enum.map_reduce` and using tuples that carry multiple accumulators, but I’ll agree with the others that this can become unwieldy and hard to read pretty quickly.
I’ve solved this for myself by devising a special accumulator struct – or simply using a map so the combined accumulator’s goals are more apparent (since, you know, a struct / map has named keys) – but I understand if people would just prefer some mutable[-like] syntax to make the whole thing easier to read and maybe take fewer lines of code (debatable).
Another way I am keeping complexity at bay in these cases is to liberally use small private functions with descriptive (usually long) names.
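A sketch of the map-accumulator approach mentioned above (the key names are illustrative, not from the original posts):

```elixir
# A map as the combined accumulator gives each counter a named key,
# so the reducer body documents itself instead of relying on tuple order.
result =
  Enum.reduce([10, 20, 30], %{count: 0, sum: 0}, fn x, acc ->
    %{acc | count: acc.count + 1, sum: acc.sum + x}
  end)

IO.inspect(result)
# => %{count: 3, sum: 60}
```

Adding a third accumulator is then just a new key, with no repositioning of tuple elements in every clause.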
This definitely looks better but I fear that somewhere in the near future we might really need the dollar sign for something else (say, the upcoming type checker) and it will be taken.
Though that makes me wonder… how many free special symbols Elixir still has at its disposal for future syntax additions? And I mean only those in the English alphabet and the QWERTY keyboards, not the entirety of Unicode.
Unused ones are `` ` `` and `$`. However, some are used only in very specific situations, such as `%`, `~`, and `\`, which may still be available for other use cases. We will also free `'` once Elixir v2.0 arrives (as single-quotes for charlists are deprecated). However, I don’t think this feature is important enough to justify “spending” `$` on it (if a sigil such as `$` is a requirement, I’d consider it a deal breaker for now).
EDIT: there is also the option of using something like `@@foo` for this.
The proposal talks about this, and the previous discussion did as well. For this particular use case, there is so much verbosity around immutability and the lack of reassignment that the code using `reduce` (or `map_reduce`) gets overloaded with the mechanics of how you pass the data around, which makes it much harder to figure out the actual “business work” you need to do.
Give an Elixir developer both Elixir and Python code and ask them to summarize what the algorithm is doing, and I wouldn’t be surprised if the majority figured out the Python one quicker than the `reduce` one.
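For comparison, the same sections/lessons problem can be written with `Enum.map_reduce/3`; this is a sketch that reconstructs the data shape from the comprehension examples above (the sample data is mine):

```elixir
# Number sections sequentially, and number lessons across sections,
# resetting the lesson counter when a section asks for it.
sections = [
  %{"title" => "A", "reset_lesson_position" => false,
    "lessons" => [%{"name" => "a1"}, %{"name" => "a2"}]},
  %{"title" => "B", "reset_lesson_position" => true,
    "lessons" => [%{"name" => "b1"}]}
]

{sections, _counters} =
  Enum.map_reduce(sections, {1, 1}, fn section, {section_pos, lesson_pos} ->
    lesson_pos = if section["reset_lesson_position"], do: 1, else: lesson_pos

    {lessons, lesson_pos} =
      Enum.map_reduce(section["lessons"], lesson_pos, fn lesson, pos ->
        {Map.put(lesson, "position", pos), pos + 1}
      end)

    section =
      section
      |> Map.put("lessons", lessons)
      |> Map.put("position", section_pos)

    {section, {section_pos + 1, lesson_pos}}
  end)

IO.inspect(sections)
```

The mapping and the accumulator threading are still interleaved in every clause, which is the verbosity being discussed, even though `map_reduce` avoids the manual list-building and `Enum.reverse` of the `reduce:` version.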
Despite the fact that comprehensions can be used for this problem, the `Enum.map_reduce` form is still easy to read for those familiar with Elixir (IMO).
It does beg the question: if `Enum.map_reduce` is sufficient and reads nicely (as per Jose’s original post), do we really need to change anything?
Do we need a better example problem that is a lot more clumsy to solve without comprehensions and requires “local accumulators” / “function scope rebinding”?
I feel that the problem with comprehensions is that they have an almost imperative style, which is more familiar to those coming from non-functional languages, but not to those who have made the mental switch to a functional approach.
The illusion of that imperative style also starts to break down because, in an immutable functional language, we can’t do things the same way and must shift our thinking. I’d hazard a guess this is why the uptake on comprehensions is low: once you shift your thinking to the functional way, it just feels strange to use an imperative-looking style, even though `for` is still built on recursion.
As for the function-scope mutability, I would never want to use it, and would ensure credo checks to weed out any use of it, because it is unnecessary and reeks of an imperative style.
I would entertain a more constrained form, such as deliberate binding of comprehension state variables from/to the outer scope, and compiler checks to raise an error on any rebinding/assignment to a variable that shadows a variable/binding in the outer scope and which has not been specifically bound to the comprehension state.
I did previously suggest a syntax such as pinning, but that would be confusing; a better syntax might be `inner <-> outer` to indicate the binding goes both into and out of the comprehension.
To me it is sufficient but apparently other people disagree. I am not trying to change their mind.
To me: yes, we need a better and more comprehensive problem statement. But again, seems like people are already convinced the problem is bad enough to merit such an addition.
I’d constrain this even more. As mentioned above, I’d have something like `for item <- items, accumulators: [a: outside_var_a, b: outside_var_b]`..., which would just alias `outside_var_a` to `a` and `outside_var_b` to `b` only inside the `for` block, and be done with it.
But admittedly I am not on board with why this change is even necessary, so I might be shooting at entirely the wrong target.
I think you are correct on this; the Elixir code does take more cognition to decode what it’s doing.
Which suggests we need a better way to describe the state that gets “carried” within comprehensions and the state that gets returned to the outer scope.
It doesn’t follow that we need pervasive function-scope-level rebinding (`mut`), however.
Yes, basically what I was suggesting with the `inner <-> outer` syntax. Given that we already have state “flowing” into the comprehension with `<-`, it didn’t seem a stretch to have a similar concept for “bidirectional” state bindings using `<->`.
I am not sure “better” is the right word here. We already have a mechanism to do so, because functional programming wants the state to be explicit; the issue is that it is too verbose. We need a more concise mechanism, but the big question is which parts to hide.
Definitely, which is why it is one of several proposals on this topic. Your suggestions are similar in semantics to the previous one, except with a different syntax, so I really recommend reading the past thread.
I am not saying we should accept this because the previous one was rejected. After all, we can still leave the solution space open, but we should avoid rehashing non-accepted solutions.
I kinda disagree. This was already extensively discussed over two previous proposals. The problem has been described, perhaps you just don’t consider it enough to warrant any of the requested language changes.
Very possible, it’s been a while and I don’t remember well anymore and you are not mandated to repeat yourself – hence, I am not contesting the need for the extra syntax.
Nah, it is totally fine to contest it. Even if just a gut feeling check because even if we accept it now, we will still have to explain the same rationale for the next several years!