Well, it goes both ways xD If I had started thinking about how to optimize for comprehensions (instead of quitting), I’d have stumbled upon your solution about 1 month ago. It’s a natural consequence of trying to do things at compile time, because we can take advantage of the fact that for is a special form that can’t be overridden, so it can be analyzed statically.
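To illustrate that point (this is just a toy demonstration, not code from either compiler): because for is a special form, a comprehension always shows up in the quoted AST under the :for atom, so a template compiler can pattern-match on it at macro-expansion time without worrying about user code rebinding it.

```elixir
# Quote a comprehension and look at its AST shape.
ast =
  quote do
    for item <- list, do: item.name
  end

# The head of the node is literally :for — no user code can override it,
# which is what makes static analysis of comprehensions safe.
{:for, _meta, [_generator, _body]} = ast

IO.inspect(elem(ast, 0))
# => :for
```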
But I believe we may have started with different priorities.
You had @chrismccord’s example from the LiveView talk, and maybe optimizing such highly dynamic templates was a priority for you.
I started thinking about (mostly static) forms and how to optimize those, because those are the ones I saw myself using the most, especially the part about having real-time form validation without writing any JS. That naturally brought me close to the idea of inlining static parts and merging adjacent binaries together. Yes, live tables that could update in response to typing in a search field were cool, but I had already given up on those.
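The “merge adjacent binaries” idea can be sketched roughly like this (Merge and merge_statics are hypothetical names for illustration, not from either codebase): given a list of template segments where literal strings are static and everything else is dynamic, collapse each run of consecutive literals into a single binary, so only the dynamic parts remain as separate entries.

```elixir
defmodule Merge do
  # Collapse runs of adjacent literal binaries in a segment list.
  # Static parts become one binary each; dynamic parts pass through.
  def merge_statics(segments) do
    segments
    |> Enum.chunk_by(&is_binary/1)
    |> Enum.flat_map(fn
      [bin | _] = chunk when is_binary(bin) -> [Enum.join(chunk)]
      dynamics -> dynamics
    end)
  end
end

IO.inspect(
  Merge.merge_statics(["<form>", "<input ", {:dynamic, :value}, ">", "</form>"])
)
# => ["<form><input ", {:dynamic, :value}, "></form>"]
```

Done at compile time, this is what keeps the static skeleton as a handful of large binaries, so only the dynamic slots ever need to travel over the wire.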
I’m the first to praise the simplicity of your implementation, but I think most of the complexity with my approach is unavoidable. It’s the result of implementing an optimizing compiler (although a very simple one) that tries to separate the static and dynamic parts in a very obsessive way.
The other main source of complexity with my approach is probably the attempt to maintain backward compatibility with Phoenix.HTML, which contains some things that are not very easy to optimize statically. The
tag() function, in particular, is a bit weird, and I should probably break compatibility so that it can be a better building block for the rest. If I can encapsulate most of the complexity inside the
tag() macro (because I need it to be a macro), things will probably be much simpler in the rest of the code base.
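The reason it needs to be a macro is roughly this kind of compile-time specialization (TagSketch is a hypothetical sketch, not Phoenix.HTML’s actual tag implementation): when the tag name and attributes are literals, the macro can emit a single static binary at compile time; otherwise it falls back to building the tag at runtime.

```elixir
defmodule TagSketch do
  # Shared runtime fallback for when the inputs are only known at runtime.
  def runtime_tag(name, attrs) do
    "<#{name}" <>
      Enum.map_join(attrs, "", fn {k, v} -> " #{k}=\"#{v}\"" end) <> ">"
  end

  # Hypothetical macro: if everything is a compile-time literal, inline
  # the whole tag as one static binary; otherwise defer to runtime_tag/2.
  defmacro tag(name, attrs \\ []) do
    if is_atom(name) and Macro.quoted_literal?(attrs) do
      static =
        "<#{name}" <>
          Enum.map_join(attrs, "", fn {k, v} -> " #{k}=\"#{v}\"" end) <> ">"

      quote do: unquote(static)
    else
      quote do: TagSketch.runtime_tag(unquote(name), unquote(attrs))
    end
  end
end
```

With a shape like that, the rest of the compiler can treat the output of tag() with literal arguments as just another static binary to merge with its neighbors.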
I don’t consider this “premature” optimization, because it’s essential for my main goal, which is to send the absolute minimum amount of data over the network for things like forms.