It’s not all that funny. You just quickly add a route, and then every time you change something, even to fix a small typo, something like 1/3 of your modules recompile (controllers often come in masses) without any real reason to. Depending on the size of your project, this can take a while.
There are a few talks by Renan Ranelli on the topic of recompilation and how quickly things can snowball into a real problem rather than just a second or two here and there.
That seems to cut the other way though: I rarely add new routes in web apps. And when I do, it’s very important — a new route is a new feature, after all.
I guess I also don’t see why adding a new external link symbol reference to a routes file would cause every controller to need recompilation. I’d think the dependency would run the other way: Only the newly referred to controller would require recompilation.
I think you could extend your project and add a new compiler to ensure all routes point to real modules, or fail. You could also make a custom credo check for that.
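A minimal sketch of such a check, assuming a Phoenix project (all module names here are placeholders, not from this thread); it walks the routes with `Phoenix.Router.routes/1` and verifies each route's plug module with `Code.ensure_compiled/1`:

```elixir
# Hypothetical Mix task: fail the build if any route's plug module is
# missing. MyAppWeb.Router is an assumed placeholder name.
defmodule Mix.Tasks.VerifyRoutes do
  use Mix.Task

  @shortdoc "Fails if a route points at a module that doesn't exist"
  def run(_args) do
    Mix.Task.run("compile")

    for %{path: path, plug: plug} <- Phoenix.Router.routes(MyAppWeb.Router),
        match?({:error, _}, Code.ensure_compiled(plug)) do
      Mix.raise("Route #{path} points at missing module #{inspect(plug)}")
    end
  end
end
```

You'd then run it as `mix verify_routes` in CI. Because the check happens at runtime of the task, it doesn't add any compile-time dependencies of its own.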
You’re right, I messed up the direction of the compile-time dependency. It’s not router changes that make the controllers recompile; rather, any change to a controller makes the router recompile. As @hauleth described, this is really bad, as it easily snowballs into recompiling basically all of your views, which depend on the router's route setup to generate URLs/paths. So a change in one controller can easily result in recompiling hundreds of modules.
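One way to see this snowballing locally is Mix's `xref` task, which can show the files holding a compile-time dependency on a given file (the path below is a placeholder for your own project layout):

```shell
# List the files that must recompile when this file changes,
# restricted to compile-time edges:
mix xref graph --sink lib/my_app_web/controllers/page_controller.ex --label compile
```

If the router (and through it the views) shows up here for a plain controller, you have the cascade described above.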
The big problem here is that a compile-time dependency exists as soon as a valid module name is seen at macro expansion time. Since the compiler doesn't know how the macro uses that module, any change in the module could change what the macro expands to. So the compiler needs to recompile the module using the macro whenever any module seen during its macro expansion changes.
This would happen to the router if Phoenix were using the full module names of controllers instead of just the namespaced versions.
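For illustration, this is the namespaced pattern meant here (a sketch with placeholder module names): inside the `scope` block the controller is written as a bare alias, and Phoenix combines it with the scope's namespace, so the full module name is never written out in the router itself:

```elixir
# Placeholder names; the point is writing `PageController`,
# not `MyAppWeb.PageController`, inside the scope:
defmodule MyAppWeb.Router do
  use Phoenix.Router

  scope "/", MyAppWeb do
    get "/", PageController, :index
  end
end
```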
There’s no “ensure the module exists” in macros. Either you recompile on each change of the modules seen, or you don’t depend on them at compile time at all.
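What does exist is a runtime check: `Code.ensure_compiled/1` answers “does this module exist?” without creating the compile-time dependency, because the module name is just an atom until the call actually runs. A standalone sketch:

```elixir
# Runtime existence check: the module name is only data here,
# so no compile-time dependency on it is created.
check = fn module ->
  case Code.ensure_compiled(module) do
    {:module, mod} -> {:ok, mod}
    {:error, reason} -> {:missing, reason}
  end
end

IO.inspect(check.(Enum))           # {:ok, Enum}
IO.inspect(check.(No.Such.Module)) # {:missing, :nofile}
```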
A simple controller test would also catch this type of error and only add a few milliseconds to your test suite.
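For reference, a minimal sketch of such a test (the module names, the `ConnCase` helper, and the route are assumptions about a typical Phoenix project, not taken from this thread):

```elixir
defmodule MyAppWeb.PageControllerTest do
  # MyAppWeb.ConnCase is the conventional Phoenix test case
  # module; a placeholder name here.
  use MyAppWeb.ConnCase, async: true

  test "GET / reaches a real controller", %{conn: conn} do
    conn = get(conn, "/")
    assert conn.status == 200
  end
end
```

If the route pointed at a nonexistent controller module, this would fail at request time instead of silently shipping a dead route.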
I agree! But (coming from Ruby) I’ve gotten tired of writing tests for things that other languages’ tooling simply handles for you.
Another issue is that a failing test isn’t as good as a link-time or type error, because the test failure doesn’t tell you why it failed.
Yep, and this is where I feel like Elixir/Phoenix isn’t a good fit for my projects. Because for me, “code upgrades” are always easy, and actually helped by compile-time checks. In other words, I’m not deploying code that must handle live hot fixes.
You aren’t forced to do such updates, but some people are (e.g. when you have an application that does live streaming of music or video calls), and you shouldn’t prevent them from using language features. And as I said earlier, you can always define a module at runtime, so there is no way to have static typing for that.
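Defining a module at runtime looks like this (a standalone sketch using only the standard library; `RuntimeDefined` is a made-up name). No static check can know about this module before the code runs:

```elixir
# Build a module at runtime from quoted code; nothing about
# RuntimeDefined exists until Module.create/3 executes.
contents =
  quote do
    def hello, do: :world
  end

Module.create(RuntimeDefined, contents, Macro.Env.location(__ENV__))

IO.inspect(RuntimeDefined.hello()) # :world
```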
I have posted this on another thread but, for completeness, if you don’t want your code to compile in those cases, you can enable warnings as errors:
[elixirc_options: [warnings_as_errors: true]]
Or if you want to enable it only for xref:
[aliases: ["compile.xref": "compile.xref --warnings-as-errors"]]
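For context, here is how those options sit inside `mix.exs` (a sketch; the app name is a placeholder, and you would normally pick one of the two options, not both):

```elixir
defmodule MyApp.MixProject do
  use Mix.Project

  def project do
    [
      app: :my_app,
      version: "0.1.0",
      # turn all compiler warnings into errors:
      elixirc_options: [warnings_as_errors: true],
      # or scope the behaviour to xref only:
      aliases: ["compile.xref": "compile.xref --warnings-as-errors"]
    ]
  end
end
```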
…but a few minutes or more to my developer time — the resource which is in short supply.
I also replied elsewhere, but I wasn’t very clear: just getting a warning would be awesome. An actual error would be a cherry on top.