I am strongly against this proposal.
My points stay mostly the same as in the previous post:
- Purposes of the proposal are unclear
- This feature will bring more confusion than benefits
- Existing solution is okay
- There are other solutions and approaches
- The problem is not important
This post restates the first two points, while the last three points are available here: Koka-inspired local mutable variables for cleaner comprehensions - #67 by hissssst (for the existing solutions point I'd also like to mention the Iteraptor solution and the update_and_reduce suggestion from @mudasobwa).
Purpose is unclear
It is still unclear what problem it is solving. The comparison of the Elixir and Python solutions leads me to think that this proposal aims at lowering the learning curve for devs coming from mutable languages, but that is just a guess. I am not the only one who doesn't understand what this proposal solves in the first place (for example, @dimitarvp and @mudasobwa also stated that the original problem is unclear in the previous thread).
So let's enumerate our guesses:
- This proposal lowers the learning curve for devs coming from mutable languages
- This proposal makes writing and reading algorithms easier
- This proposal improves the performance of loops
In the following sections I'll explain why this proposal won't achieve any of these goals.
Scope is unclear
It is called "local accumulators for cleaner comprehensions", but it is expected to be implemented in `if`, `case`, `cond` and `receive` too. These constructs have nothing to do with comprehensions. I sense that they were mentioned to add a feeling of consistency to the feature, but it will never be consistent, since most enumerable traversals are already implemented with higher-order functions. (this is a fact)
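To make that last point concrete, here is my own illustration (not taken from the proposal): everyday traversals are already written with `Enum`/`Stream` higher-order functions rather than `for`, and as far as I can tell the proposed accumulators would not reach into the anonymous functions they take, so code written in this style stays outside the feature:

```elixir
# Typical Elixir traversals today; none of them use `for`, so (as far as I can tell)
# none of them would be touched by the proposed accumulators:
total = Enum.reduce([1, 2, 3], 0, &+/2)
evens = Enum.filter(1..10, &(rem(&1, 2) == 0))
```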
This will bring more confusion
1. Context dependency
2. Unclear edge cases
3. It will be misused
4. Exotic performance penalties
Context dependency
This feature depends on the technical implementation of the traversal of an enumerable, while it is implemented at the level of the language semantics. If you have a solution written with `for` and you want to translate it into `Stream` or `Enum`, you'll have to change the semantics of the algorithm (one uses out-of-the-context accumulators (aka local mutable variables), and the other one does not). (it is a fact)
Inconsistency, where the same algorithm is solved with two functions doing exactly the same thing, but one version of the algorithm has compiler-built-in magic and the other one does not, is an exception by definition, which negatively impacts the learning curve and the ease of writing and reading code. (it is a conclusion)
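A minimal sketch of this point, using the `@@` syntax from the examples in this thread for the proposed half (so that part is not valid Elixir today):

```elixir
list = [1, 2, 3]

# Proposed-feature version (not valid Elixir today, syntax as used in this thread):
#
#   @@sum = 0
#   for x <- list do
#     @@sum = @@sum + x
#   end
#   @@sum
#
# Translating it to Enum means threading the accumulator explicitly,
# i.e. the semantics of the algorithm changes, not just its surface syntax:
sum = Enum.reduce(list, 0, fn x, acc -> acc + x end)
# sum == 6
```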
Unclear edge-cases
- Context puns. Breaking immutability in some cases leads to context-inheritance puns. Here is an example:
```elixir
list = [123]
@@acc = 0

for i <- list do
  [
    @@acc = @@acc + 100500,
    @@acc = @@acc + 1
  ]
end
```
It is unclear what it will return, and moreover, devs coming from OO languages will expect it to behave like regular mutability (which it is not), thus it will make the learning curve higher and it will make reading and writing this code harder. (this is a conclusion)
- Loop break. Here's an example with an early return from the loop (which is a pretty common case in languages with this kind of mutability):
```elixir
@@acc = 0

try do
  for i <- 1..100 do
    @@acc = @@acc + i
    if i >= 10, do: throw(:break)
  end
catch
  :break -> @@acc
end

@@acc
```
What will it return? It will return `0`, but a developer would expect it to return `55`.
Please note that the two examples above can't be easily fixed with the current state of Elixir's compiler, and would require an extremely huge rewrite of the Elixir compiler in the places dealing with exception handling and the instantiation of lists and maps. (that's my strong opinion, which I am 100% sure about, since I am an Elixir compiler developer)
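For comparison, here is my own sketch (not part of the proposal) of how the loop-break case is usually written today: with `Enum.reduce_while/3` the accumulator is explicit and the result is the `55` the developer expects.

```elixir
# Explicit accumulator with an early halt; returns 55 as expected.
acc =
  Enum.reduce_while(1..100, 0, fn i, acc ->
    acc = acc + i
    if i >= 10, do: {:halt, acc}, else: {:cont, acc}
  end)
```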
It will be misused
Developers coming from mutable languages will expect it to behave like regular mutability (which it is not), thus they will write code like the above expecting it to behave differently from how it will actually behave. The early return and context puns examples are applicable here. (this argument is a second conclusion of the examples above)
Exotic performance penalties
Take a look at this example:
```elixir
@@acc1 = 0
@@acc2 = 1
@@acc3 = 1

for item <- list do
  if extremely_rare_case(item) do
    @@acc1 = @@acc1 + item
    @@acc2 = @@acc2 + item * item
    @@acc3 = @@acc3 + item ** 3
  end
end
```
Every developer will expect it to behave like a regular mutable variable, but depending on the frequency of positive results from `extremely_rare_case`, it might even be faster to use the pdict to store these values instead of using this feature (which will introduce a tuple build and match on every iteration). Right now this example is optimized by the Erlang compiler, but we can just add a few remote function calls to make it completely impossible for Elixir's and Erlang's compilers to analyse. (it is a fact)
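To make the "tuple build and match on every iteration" concrete, here is one plausible desugaring of the example above into today's Elixir (an assumption about how such a feature could be lowered, not the proposal's actual implementation; it reuses `list` and `extremely_rare_case/1` from the example):

```elixir
# All three accumulators get packed into a tuple that is constructed and
# destructured on every iteration, even when extremely_rare_case/1 is almost
# always false and the accumulators almost never change.
{acc1, acc2, acc3} =
  Enum.reduce(list, {0, 1, 1}, fn item, {acc1, acc2, acc3} ->
    if extremely_rare_case(item) do
      {acc1 + item, acc2 + item * item, acc3 + item ** 3}
    else
      {acc1, acc2, acc3}
    end
  end)
```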
This makes it extremely hard to reason about this feature in terms of performance in tight loops without knowing deep details of its implementation in the language. (it is my strong opinion)