Having written a lot more Phoenix templates as of late, I’m doing a lot more attribute checks than I would like. In Ruby 2.3 we could use the #try method or the safe-navigation operator &. to navigate possibly-nil nested values. Here, if address is nil, it won’t try to access city:
Ruby: user.try(:address).try(:city) or user&.address&.city, as opposed to user && user.address && user.address.city or writing nested if expressions/statements.
Is there a nicer way to navigate nested maps or structs in Elixir, so that the code I write in EEx can be more terse? Or is there a better pattern that avoids raising an exception when I access a nested field whose parent field is nil?
I spoke too soon, as both your examples would raise an UndefinedFunctionError saying User does not implement the Access behaviour.
However, I believe the current best alternative is: user |> Map.get(:address) |> Map.get(:city), and if I import Map, only: [get: 2], then it becomes user |> get(:address) |> get(:city)
Doesn’t work either: (ArgumentError) Access is not a protocol, cannot derive Access for User. However, I agree that it’s probably better to use a named function, and that’s likely the more idiomatic solution I’m looking for…
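For what it’s worth, the named-function route can be sketched with plain pattern matching. Note that Map.get/2 itself raises a BadMapError when handed nil, while a catch-all clause cannot; the Safe module and city/1 function names below are made up for illustration:

```elixir
defmodule Safe do
  # One clause for the happy path, a catch-all returning nil.
  # `Safe` and `city/1` are hypothetical names, not part of any library.
  def city(%{address: %{city: city}}), do: city
  def city(_), do: nil
end

Safe.city(%{address: %{city: "Portland"}})
# => "Portland"
Safe.city(%{address: nil})
# => nil
Safe.city(nil)
# => nil
```

Because the fallback clause matches anything, a nil user and a nil address both fall through to nil instead of raising.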
It has been brought up, but the decision is pretty final (changing it would not be backwards compatible) because they wanted it to use ‘argument’ syntax, as the for special form does too. I personally think both are pretty big warts on Elixir’s syntax; they should have been blocks like everything else of their kind in Elixir. So instead of this:
for n <- 1..4,
    times <- 1..n,
    into: "",
    do: "#{n} - #{times}\n"
with {:ok, result} <- do_something(),
     {:ok, result} <- do_something_more(result),
     result = result * 4,
     {:ok, result} <- do_even_more(result) do
  final_thing(result)
else
  {:error, error} -> report_error(error)
  _ -> throw :UNHANDLED_ERROR
end
Instead I think it really should be this, which can be easily done via a macro if the for/with keywords were not already corrupted by being special forms…
for into: "" do
  n <- 1..4
  times <- 1..n
  "#{n} - #{times}\n"
end
with do
  {:ok, result} <- do_something()
  {:ok, result} <- do_something_more(result)
  result = result * 4
  {:ok, result} <- do_even_more(result)
  final_thing(result)
else
  {:error, error} -> report_error(error)
  _ -> throw :UNHANDLED_ERROR
end
These both read much more clearly to me, are easier to copy/paste, are easier to compose, are just better in every way in my opinion.
Really the only special forms should be case, cond, and defmacro; everything else can be built up efficiently from those as macros or functions. Even def and defmodule and so many others could be as well (those by calling into the base compiler, which would shrink the needed Erlang base of Elixir by even more). ^.^
End Side Tangent
But back to your thing, you could easily make a macro for it, maybe something like:
defmodule SafeNilGet do
  defmacro sng(ast) do
    Process.put(:SafeNilGetDepth, __CALLER__.line * 10000) # Stupid `case` leaking bindings...
    res = do_sng(ast)
    Process.delete(:SafeNilGetDepth)
    res
  end

  def do_sng({_v, _meta, context} = ast) when is_atom(context), do: ast

  def do_sng({:., meta, [v, k]}) when is_atom(k) do
    maybe_map = do_sng(v)
    depth = Process.get(:SafeNilGetDepth)
    Process.put(:SafeNilGetDepth, depth + 1)
    # Have to do this annoyance because Elixir's `case` leaks bindings out of its scope...
    var = {String.to_atom("$SafeNilGet$#{depth}"), meta, nil}

    quote do
      case unquote(maybe_map) do
        nil -> nil
        unquote(var) -> Map.get(unquote(var), unquote(k), nil)
      end
    end
  end

  def do_sng({call, _meta, []}), do: do_sng(call)
end
Or named whatever, and could be used like:
iex> import SafeNilGet
SafeNilGet
iex> map = %{blah: %{blorp: %{bleep: 42}}}
%{blah: %{blorp: %{bleep: 42}}}
iex> sng map.blah.blorp.bleep
42
iex> sng map.blah.wrong.bleep
nil
iex> sng(map.blah.blorp.bleep) # I personally like the parenthesis because commas...
42
def, defmacro, and defmodule are already plain macros.
Most people working on the compiler would love to get rid of this too, but it’s a backwards-incompatible change. So we have to live with it until 2.0.
You can easily introduce a new lexical scope by wrapping in try do <code> end.
Furthermore, Elixir has hygienic macros, so variables won’t leak between contexts. You’re building the variable AST by hand, working really hard to get around the regular (hygienic) macro mechanisms, and then complaining it’s not hygienic.
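A minimal sketch of that try do ... end scoping trick; the variable names are invented:

```elixir
# `try do ... end` introduces a new lexical scope: `inner` below is not
# visible after the block, while the block's value still escapes normally.
result =
  try do
    inner = 21
    inner * 2
  end

result
# => 42
# Referencing `inner` at this point would be an undefined-variable error.
```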
What’s more, you can rebind variables, so it’s perfectly fine to have something like this in reduce:
case unquote(maybe_map) do
  nil -> nil
  map -> Map.get(map, unquote(k), nil)
end
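To make the rebinding point concrete, here is a runnable (non-macro) analogue; the data and names are invented. The case’s value becomes the new accumulator, and the m bound inside the case stays local to it:

```elixir
total =
  Enum.reduce([%{v: 1}, nil, %{v: 2}], 0, fn maybe_map, acc ->
    # The case's value becomes the new accumulator; `m` stays local to the case.
    case maybe_map do
      nil -> acc
      m -> acc + Map.get(m, :v, 0)
    end
  end)

total
# => 3
```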
This leads us to yet another realisation: we never bind any variable using =, so nothing will ever leak, even with the current implementation:
iex(1)> case %{} do
...(1)> map -> map
...(1)> end
%{}
iex(2)> map
** (CompileError) iex:2: undefined function map/0
This means the whole thing can be simplified to:
defmodule SafeNilGet do
  defmacro sng({var, _meta, ctx} = ast) when is_atom(var) and is_atom(ctx) do
    ast
  end

  defmacro sng({{:., _, [v, k]}, _, []}) when is_atom(k) do
    quote do
      case sng(unquote(v)) do
        nil -> nil
        map -> Map.get(map, unquote(k), nil)
      end
    end
  end
end
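To sanity-check that simplified version, here it is as a self-contained snippet (the SafeNilGet module is reproduced from the post; the SngDemo module and its data are invented):

```elixir
defmodule SafeNilGet do
  defmacro sng({var, _meta, ctx} = ast) when is_atom(var) and is_atom(ctx), do: ast

  defmacro sng({{:., _, [v, k]}, _, []}) when is_atom(k) do
    quote do
      # `map` here is hygienic: it neither clashes with nor leaks into
      # the caller's bindings, even across the recursive expansions.
      case sng(unquote(v)) do
        nil -> nil
        map -> Map.get(map, unquote(k), nil)
      end
    end
  end
end

defmodule SngDemo do
  import SafeNilGet

  def run do
    map = %{blah: %{blorp: %{bleep: 42}}}
    {sng(map.blah.blorp.bleep), sng(map.blah.wrong.bleep)}
  end
end

SngDemo.run()
# => {42, nil}
```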
Yes, def/defmacro are currently macros that generate special AST syntax, which the defmodule macro takes and feeds into one of the base compiler’s Erlang calls to generate the module from the data. I’m just of the opinion that defmacro could be enhanced a bit, which would require it to be a special form; but that special form would then be able to handle everything else the language could possibly do, and even case and cond would not need to be special forms either and could just be macros if defmacro were enhanced in such a way.
I so very much cannot wait! ^.^
Hmm, I’ve not checked. I know that adding a try scope on the BEAM has a non-free cost (and many of my macros exist for performance reasons), but does Elixir remove the try scope entirely when compiled if there is no catch clause, while retaining the non-leaking property?
Force of habit, and building the variable AST by hand is because recursive calls while building an AST would make all the variables the same; e.g. sng thing.blah.blorp would make something like:
case thing do
  nil -> nil
  the_variable ->
    case Map.get(the_variable, :blah, nil) do
      nil -> nil
      the_variable -> Map.get(the_variable, :blorp, nil)
    end
end
In this specific example that does not matter, as the variable is not used ‘after’ (except maybe in recursive calls to the macro itself from the outer scope?). But it definitely matters in other macros I’ve written, where the leaking bindings caused thousands of warnings in the style of ‘using a binding that was defined within a case’. That was caused by recursively building an AST while defining a var before and assigning it from the result, with the same done inside as well, then being used; the warning hell it created was rather painful to fix. So to prevent that I just always create unique vars regardless, even when it’s not necessarily an issue, such as in this specific case.
Sooo, hurry up with Elixir 2.0!
/me really Really REALLY hates the bindings leaking from cases and such; why was that ever even considered to be a thing in the first place?!? Nothing has caused me more pain in Elixir than that
It does when you have a macro generating recursive code like:
case parse_something_1(context) do
  %{__exception__: _} = exc -> exc
  context ->
    context = update_value(context, 42)
    case parse_something_2(context) do
      %{__exception__: _} = exc -> exc
      context ->
        context = update_value(context, 49)
        context
    end
    context = parse_something_3(context)
    context = update_value(context, 53)
    set_success(context)
end
To see an example of recursive macros where Elixir starts spewing the binding warning messages, see my SpiritEx library (which I’ve not fixed yet; it works, but wow, the warning spew) and create a moderately complex grammar in it.
What I do not get is why bindings ever leaked the case scope in the first place. Erlang does not act that way; not even C acts that way. It just makes no sense, especially in a language where a binding either has a value or does not exist (whereas Elixir does the magical wtf’ness of assigning nil if execution did not go down that branch).
I have lost so much time, and gained another few hundred grey hairs, because of Elixir leaking bindings. In a few cases it utterly made my code do not what was expected, because a binding suddenly looked like it was changing when it should not have. This is why I really, really love Erlang’s lack of re-binding; it would entirely prevent this class of errors. I really hate that Elixir does that…
Yes, this specific case could, but not all cases could, and this one could easily grow into a version where that does not work and unique names would be required.
Either way, I was having fun and expanded it to have more functionality:
defmodule SafeNilGet do
  defmacro sng(ast) do
    Process.put(:SafeNilGetDepth, __CALLER__.line * 10000) # Stupid `case` leaking bindings...
    res = do_sng(ast)
    Process.delete(:SafeNilGetDepth)
    res
  end

  defp do_sng({_v, _meta, context} = ast) when is_atom(context), do: ast

  defp do_sng({:__aliases__, _ameta, [v, :All]}), do: do_sng(v)

  defp do_sng({:., _dotmeta, [{:__aliases__, _ameta, [v, :All]}]}), do: do_sng(v)

  defp do_sng({:., dotmeta, [{:__aliases__, ameta, [v, :All]}, k]}) when is_atom(k) do
    maybe = do_sng(v)
    var = new_var(dotmeta)
    ivar = new_var(dotmeta)

    quote do
      case unquote(maybe) do
        nil -> nil
        unquote(var) ->
          Enum.map(unquote(var), fn unquote(ivar) -> unquote(do_sng({:., ameta, [ivar, k]})) end)
          |> Enum.filter(&(&1 !== nil))
      end
    end
  end

  defp do_sng({:., meta, [v, k]}) when is_atom(k) do
    maybe = do_sng(v)
    var = new_var(meta)

    # Keys like `_2` mean positional access into a list or tuple.
    idx =
      case to_string(k) do
        "_" <> pidx ->
          case Integer.parse(pidx, 10) do
            {idx, ""} -> idx
            _ -> -1
          end

        _ -> -1
      end

    quote do
      case unquote(maybe) do
        nil -> nil
        unquote(var) when is_map(unquote(var)) -> Map.get(unquote(var), unquote(k), nil)
        unquote(var) when is_list(unquote(var)) -> Enum.at(unquote(var), unquote(idx), nil)
        # Tuples are not Enumerable, so convert before Enum.at instead of calling it directly.
        unquote(var) when is_tuple(unquote(var)) -> unquote(var) |> Tuple.to_list() |> Enum.at(unquote(idx), nil)
      end
    end
  end

  defp do_sng({:., _meta, [v]}), do: do_sng(v)

  defp do_sng({v, meta, [fun_ast]}) do
    maybe = do_sng(v)
    var = new_var(meta)

    quote do
      case unquote(maybe) do
        nil -> nil
        unquote(var) -> Enum.filter(unquote(var), unquote(fun_ast))
      end
    end
  end

  defp do_sng({v, _meta, []}), do: do_sng(v)

  # Have to do this annoyance because Elixir's `case` leaks bindings out of its scope...
  defp new_var(meta) do
    depth = Process.get(:SafeNilGetDepth)
    Process.put(:SafeNilGetDepth, depth + 1)
    {String.to_atom("$SafeNilGet$#{depth}"), meta, nil}
  end
end
So yeah, it can operate on tuples, lists, and maps; you can get All of a set and work over each child; you can filter by putting an anonymous function (or a binding to one) in parentheses between the dots; etc. And it could all be expanded further quite easily. ^.^
1> case #{} of
1> Map ->
1> X = Map
1> end.
#{}
2> X.
#{}
It’s even weirder: a variable can be bound or not depending on which branch you took:
1> case #{} of
1> Int when is_integer(Int) ->
1> X = Int;
1> Other ->
1> ok
1> end.
ok
2> X.
* 1: variable 'X' is unbound
3> case 1 of
3> Int when is_integer(Int) ->
3> X = Int;
3> Other ->
3> ok
3> end.
1
4> X.
1
Exactly - in the snippet you’re using =, so there’s a possibility of leaks. Without = there are no leaks possible.
The updated code you posted also does not need this dance with variable names; it won’t generate any warnings even though it always uses the same variable. My refactoring is in the gist sng.ex · GitHub.
Also - you can silence warnings in generated code by marking it with quote generated: true do
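A small sketch of what that looks like; the Gen/GenDemo modules and the wrapped macro are invented names for illustration:

```elixir
defmodule Gen do
  defmacro wrapped(expr) do
    # `generated: true` marks the quoted code as compiler-generated, so
    # warnings (e.g. unused or shadowed variables) are not reported against it.
    quote generated: true do
      case unquote(expr) do
        nil -> nil
        value -> value
      end
    end
  end
end

defmodule GenDemo do
  require Gen
  def run, do: Gen.wrapped(1 + 1)
end

GenDemo.run()
# => 2
```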
Not that I recall? You have to bind it in all branches for it to ‘leak’ out; you do not get a magic value out of non-taken branches (rather, it does not leak out at all in those cases). Even if I had bound it in all branches (which my macros did not), it would still not have corrupted my base values, as I ensured I always had a branch that bound no variables, as is commonly done in Erlang to ensure scope. It also would not overwrite a pre-existing binding, so setting a context in my code, then case’ing, then using the context would have still worked, since it would not have hoisted the binding out of the case to overwrite it.
That is in the shell, the shell is very… incomplete in Erlang, it has a lot of issues. ^.^
Try it in a module.
Also try putting a Y = 42 before the case in your example module; you will find that it will not stomp it. It will only create a new binding; it cannot replace an existing one (this is what makes it so surprising in Elixir; I really wish Elixir had SSA-style semantics like Erlang, as this issue would not exist then).
I did that initially, but I got bitten when my context variable actually ‘changed’ and started returning wtf values; that was when I started making sure they were all unique… ^.^;
Yep, the shell has a lot of issues compared to compiled code. There is a better shell, which I think was made by rvirding with a bit different syntax (erl2 or something), that I played with a decade+ ago; it fixed those issues very well (and some other issues that base erlc had too).
Also a cool bit of info: Core Erlang does not ‘leak’ at all, even in matching bindings. A translation pass in the Erlang compiler converts to Core by returning the changed bindings, so your example module above turns into this Core Erlang:
'test'/1 =
    %% Line 5
    fun (_cor0) ->
        let <_cor5,Y> =
            case _cor0 of
              %% Line 7
              <Map> when call 'erlang':'is_map'(_cor0) ->
                  %% Line 8
                  <Map,Map>
              %% Line 9
              <_X_Other> when 'true' ->
                  %% Line 11
                  <'ok','ok'>
            end
        in %% Line 13
            Y
Or, more readably, in Elixir’ish it gets turned into this:
def test(x) do
  {_returned, y} =
    case x do
      map when is_map(map) -> {map, map}
      _x_other -> {:ok, :ok}
    end

  y
end
This is one of many many reasons that I love Core (I really should write a Core generator again, it is such a pleasant language to generate for).
Elixir really should compile to Core instead of Erlang as well, it is a much more sensible language and target.
The problem with that is that we lose a lot of tools - dialyzer, cover, debugger to name a few. There’s an opening to change this with OTP 20 and the new debug info format.
Aww, I’d never tested but I hoped those worked on the BEAM files rather than any kind of source files…
Ooo really? Any docs on that? (I’m still mentally stuck in the OTP 17 world as that is the last time I delved in to the engine… >.>)
EDIT1:
Hmm, the dialyzer docs say it can work from debug-compiled BEAM bytecode, and the Core Erlang I gave had the debug annotations in it, so it would work: Erlang -- dialyzer
Cover seems like it can work fine without the source files but you have to add in the extra decorations yourself (which a language compiling to Core could do itself): Erlang -- cover
And I’m pretty sure the debugger works fine with debug-compiled BEAM bytecode, otherwise various embedded setups would not work? Checking though, and yep, it works fine with debug-compiled BEAM files (in fact it only touches beam files, not source): Erlang -- Debugger
So yeah, it should work fine and would open up more abilities and a more simple generator (other than adding cover annotations, does not seem too hard overall).
EDIT2: Huh, actually it looks like :cover can take an existing non-cover debug-enabled BEAM file and create a new BEAM file with the correct cover annotations, so that is easy too. http://erlang.org/doc/man/cover.html#compile_beam-1
Yes, they all work fine with debug-compiled files. But debug-compiled means you store Erlang AST in the beam file (that’s what the old Abs chunk stores). So if you’re compiling straight to core, you have no Erlang AST to store and no debug info.