Elm vs Vue/React which one do you prefer to use with Phoenix, and why?

I haven't worked with JSX yet, but at a glance I get flashbacks from my days of PHP templates, where people made basically the same arguments against template engines like Twig. :smile:

Ooh, I was not trying to make arguments; I was even trying to avoid them, because it's really a personal choice. But to be fair to JSX, I put up the code equivalent to @bjunc's example, so anyone can make their own judgement :slight_smile:


I think that Vue can work well with Phoenix forms like so:

<%= text_input f, :name, class: "form-control", v_model: "message" %>

Generally I just include vue.js from a CDN on the form pages I need it on and make the form or its container my ‘app’.

I’m definitely not a vue expert, but this has been helpful for simple bindings etc.
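For note, the underscored key works because Phoenix.HTML's tag helpers convert underscores in attribute keys to dashes, so `v_model:` comes out as `v-model=` in the generated HTML. A toy sketch of that conversion (plain JS, my own illustration, not Phoenix's actual code):

```javascript
// Toy sketch (not Phoenix code): Phoenix.HTML's tag helpers turn
// underscores in attribute keys into dashes, which is why the
// `v_model:` option above ends up as `v-model=` in the HTML.
function attrName(key) {
  return key.replace(/_/g, "-");
}

console.log(attrName("v_model"));     // prints: v-model
console.log(attrName("data_toggle")); // prints: data-toggle
```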

If I'm not mistaken, you are penalizing the client in more ways than one:

  • The Full (Runtime + Compiler) version has to be downloaded instead of just the Runtime version.
  • Vue.js then has to compile the templates on the client - further delaying when the page will be ready.
  • If you are using the in-DOM HTML of the mounting DOM element as the template (aka DOM template), the browser will attempt to parse features only intended for Vue.js which can lead to issues.

See: Why You Should Avoid Vue.js DOM Templates and DOM Template Parsing Caveats
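To make the first two bullets concrete: the full build takes a template string like `<p>{{ msg }}</p>` and compiles it into a render function in the browser, whereas a build step can ship that render function pre-compiled so only the runtime is needed. A toy illustration (my own minimal `h`, not Vue's actual internals):

```javascript
// Toy illustration (not Vue's real API): a template like
// "<p>{{ msg }}</p>" gets compiled into a render function. The full
// build does that compilation in the browser on every page load;
// the runtime-only build expects the render function pre-compiled.
function h(tag, children) {
  return { tag, children }; // minimal stand-in for a virtual node
}

// Roughly what a build step would emit for "<p>{{ msg }}</p>":
function render(state) {
  return h("p", state.msg);
}

const vnode = render({ msg: "hello" }); // a virtual node for <p>hello</p>
```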


Thanks for posting! I’m very new to Vue.js and hadn’t considered this. I’ve only been peppering Vue components into a few places trying to answer the same question as OP. Thanks!


HTH! :-))


Thanks a lot for your in-depth review of the front-end frameworks. I wonder if performance is ever a deciding factor in which framework you use, considering that the Vue.js core is only a few kilobytes and there's a general assumption that it's faster than the other frameworks, with Angular being the slowest (though I've seen lots of fast Angular websites anyway).

Do you consider performance, or was your React preference mostly due to the extra features you mentioned (ReasonML / ReasonReact)?

Because I don't think it has been posted in here yet:


It's a benchmark (which should, as all benchmarks, be taken with a grain of salt) showing that Elm in certain cases is twice as fast as React (and 1.5 times as fast as Angular), while having smaller compressed JS file sizes as well.

I still stand by my former opinion: writing a front-end application in Elm results in a much more understandable and maintainable application, while React/Vue/Angular are better suited for quick prototyping because of their dynamic natures. But I would not want to use them for a large project that should live more than a couple of months. I have yet to find two React teams with the same architectural setup in their soup of libraries, because React only does the rendering part. What do you use for promises? Remote requests? Event handling? Redirection? State management? JSON parsing with humanly readable error messages? Working with websockets? BigDecimal numbers? Transpiling to support older browsers? Etc.

I think that even if you already know JS, learning Elm, with its clear documentation and a well-defined architecture that is the same across all apps, is less of a time and effort investment than learning that whole soup of JS libraries. Also read How it feels to learn JS in 2016. :slightly_smiling_face:
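That "same architecture across all apps" is The Elm Architecture: a model, messages, a pure update function, and a view. Its update step can be sketched as a toy reducer in JS (all names here are mine, just for illustration):

```javascript
// Toy sketch of The Elm Architecture's update step in JS.
// In Elm, `update : Msg -> Model -> Model` is a pure function;
// the runtime feeds it messages and re-renders the view from the model.
const init = { count: 0 };

function update(msg, model) {
  switch (msg.type) {
    case "Increment":
      return { ...model, count: model.count + 1 };
    case "Decrement":
      return { ...model, count: model.count - 1 };
    default:
      return model;
  }
}

let model = init;
model = update({ type: "Increment" }, model);
model = update({ type: "Increment" }, model);
console.log(model.count); // prints: 2
```

Every Elm app has this same shape, which is a big part of why codebases stay navigable across teams.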


What do you think about the stated problems with JS library interop, e.g. in Elm vs Vue/React which one do you prefer to use with Phoenix, and why?, and other criticism like Elm - General Discussion, Blog Posts, Wiki (there is more to be found on this forum)? What do you think about ReasonML compared to Elm?

I have no extensive experience with ReasonML (or any variant of OCaml; I have only followed a basic guide so far), so I can only say that Elm's purity is both a blessing and a curse:

  • It is a blessing for writing maintainable software, because there is no way to put in ‘cheats’ or ‘duct tape solutions’ that would break referential transparency.
  • It is a curse for quick prototyping.

OCaml is not pure, which means that while I think it would be better for prototyping, I would expect projects written in it to also become messier over time. But again, I have no experience with OCaml, so I cannot make an accurate comparison here.

JS interop indeed is difficult, and for a reason: it makes you ask, every time, whether it would not be nicer to perform the work in the pure, strongly typed environment, rather than in a separate, asynchronously running, untyped one.
The main 'difficulty' with JS interop is that you (1) need to be aware that the code will run asynchronously, and (2) that the result might be any JS type, so you will need to check whether the response is indeed what you expected once you enter Elm-land again.
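To make point (2) concrete, here is a minimal sketch of the JS side of a port; the port name `fromJs` and the message shape are entirely hypothetical:

```javascript
// Hypothetical sketch: checking a value before it crosses a port
// boundary into Elm. At runtime the value could be any JS type, so
// the Elm side would additionally run it through a decoder.
function validateMessage(value) {
  if (typeof value !== "object" || value === null) return null;
  if (typeof value.text !== "string") return null;
  return { text: value.text };
}

// With a real Elm app this would be wired up roughly like:
//   app.ports.fromJs.send(validateMessage(someUntrustedValue));

console.log(validateMessage({ text: "hi" })); // the validated record
console.log(validateMessage(42));             // prints: null
```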

I think there is no way around this methodology if you want to keep the nice properties that pure functional programming brings (along with the powerful reasoning and optimizations the compiler can do for you).

The main critique of what Elm currently does is that a feature known as 'task ports' does not exist (yet): writing a round trip to JS requires two bits of boilerplate rather than one. But libraries like Porter exist to remove (almost all of) that boilerplate.
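The round trip that Porter-style libraries automate boils down to pairing each outgoing message with its reply via a correlation id. A sketch with plain callbacks standing in for the two one-way ports (all names here are mine, not Porter's API):

```javascript
// Sketch of request/response over two one-way channels, the way a
// Porter-style library pairs an outgoing message with its reply via
// a correlation id. Plain callbacks stand in for real Elm ports.
function makeChannel() {
  let handler = null;
  return {
    subscribe(fn) { handler = fn; },
    send(msg) { if (handler) handler(msg); },
  };
}

const toJs = makeChannel();  // "Elm" -> JS (requests)
const toElm = makeChannel(); // JS -> "Elm" (responses)

// JS side: answer each request, echoing the correlation id back.
toJs.subscribe(({ id, payload }) => {
  toElm.send({ id, payload: payload.toUpperCase() });
});

// "Elm" side helper: send a request, route the matching reply.
let nextId = 0;
const pending = new Map();
toElm.subscribe(({ id, payload }) => {
  pending.get(id)(payload);
  pending.delete(id);
});
function request(payload, onReply) {
  const id = nextId++;
  pending.set(id, onReply);
  toJs.send({ id, payload });
}

let result;
request("hello", (reply) => { result = reply; });
console.log(result); // prints: HELLO
```

The two `subscribe`/`send` pairs are the "two bits of boilerplate"; the id bookkeeping is what the library hides.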

To respond to the points that @OvermindDL1 wrote in his post from nearly two years ago:

This is not something I have benchmarked, so I cannot make any claims about it, other than that Elm’s compilation has so far never been too slow for me.

NPM being the crowded place of mediocre JS libraries that it is, I like Elm's simple installation wrapper, but even more I like the fact that you can search, Hoogle-style, through all published packages.

:man_shrugging: Your Mileage May Vary. In two years' time, a lot more packages have definitely been written.

This claim was made when Elm 0.18 was still the latest version. Most of the larger compiler bugs at that time have been fixed in the current version of Elm, 0.19.

Yes, Elm’s package management is currently bound to GitHub. This is also something that I’d like to see change in the future.

But my TL;DR is still: not having a way to cheat really helps the maintainability of your project, and I think Elm will win that category over OCaml-based systems, based on that inherent difference between the two stacks.


Thanks. My friend Google adds a lot of answers too, e.g. https://stackoverflow.com/questions/51015850/reasonml-vs-elm


At the time I circled back around to React because that seemed to be the foundation of any number of functional frameworks like reagent (re-frame), Om, thermite, and of course ReasonReact.

I think we're at a point now where nobody ever got fired for choosing React.js. Unless you're at Google, but they've got Jason Miller. And it's somewhat ironic that React is being increasingly used with TypeScript (that was supposed to be Facebook Flow's job).

where Angular is the slowest (but I’ve seen lots of fast Angular websites anyway).

I've always felt that AngularJS was primarily targeting enterprise desktop browsers. As mobile has gotten more prevalent, Angular has tried to get leaner, but its inherent design isn't that lean to begin with. More recently I've decided that Google's "browser people" make a lot more sense than Google's "framework people". The browser-centric viewpoint is much more aware of the web's constraints and limitations, while the framework-centric mindset often makes questionable abstractions in a futile attempt to hide the intrinsic constraints of distribution (some think that Facebook is actively trying to abstract the Web away).

I wonder if performance is ever a deciding factor

The Google-centric view is that the quantity of JavaScript shipped to the browser is a major concern (The Cost Of JavaScript In 2018), so JavaScript-heavy applications have to employ tactics (code splitting, asynchronous imports, SSR) which increase the solution complexity even more in order to deliver an adequate user experience.

But other performance characteristics can vary wildly between frameworks:

Squoosh is Google's current performance PWA demonstration, described in Complex JS-heavy Web Apps, Avoiding the Slow (Chrome Dev Summit 2018). They claim that Angular, React and Vue were too heavy for their budget, ultimately arriving at Preact, and even then some of the heavy lifting is outsourced to vanilla web components. In this case "performance" seems to have required a pool of fairly high-calibre talent and a significant level of effort, probably more than many ventures are willing to engage in (or have access to).

That discussion kind of flared up again here: ReasonML vs Elm.
It leaves one with the impression that right now isn't the time to get into ReasonML/ReasonReact.

clojureD 2019: “Our Journey from Elm and Elixir to Clojure” by Martin Kavalar describes a case where a project went to (isomorphic?) Clojure(Script):

  • initially to escape the constraints of Elm’s JS interoperability
  • and then to gain full unfettered access to the capabilities of Datomic

For note, I added bucklescript-tea to that same benchmark and it profiled 'slightly' faster than Elm. ^.^

Having a mutable reference type does not mean it is not pure; the base language itself is pure. Its "escape hatch" is the external type, which for BuckleScript lets you call out to JavaScript via a typesafe call. It's like TypeScript in that way, and it is a huge boon in actually getting real work done. :slight_smile:

You do have to remain typesafe the entire time; there is no real escape hatch from that (a pox on those that use Obj.magic! *coughs*), and projects do not become messier over time in it; the type system enforces that to an extreme.

Both TypeScript and OCaml allow you to call JavaScript in a type-safe way by declaring its type. This is more like Rust's unsafe: you declare the types the unsafe code should take in and give back, but whatever it does inside of that is up in the air, and if it fails then you know it is failing in one of those 'unsafe blocks', which gives you a very, very restricted area where you need to check for crash bugs. :slight_smile:
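That "restricted area" idea can be sketched in plain JS (my own illustration, not any library's actual API): confine the untyped call behind one wrapper that checks inputs and outputs, so failures can only surface at the boundary:

```javascript
// Sketch of the "typed boundary" idea in plain JS: the wrapper is
// the one place where types get checked, the way an OCaml `external`
// declaration or a Rust `unsafe` block confines where bugs can hide.
function checkedParseInt(untypedFn) {
  return function (input) {
    if (typeof input !== "string") {
      throw new TypeError("boundary: expected string input");
    }
    const out = untypedFn(input);
    if (typeof out !== "number" || Number.isNaN(out)) {
      throw new TypeError("boundary: expected numeric output");
    }
    return out;
  };
}

const parse = checkedParseInt((s) => parseInt(s, 10));
console.log(parse("42")); // prints: 42
// parse("nope") throws at the boundary, not deep inside typed code.
```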

Also, Elm's compiler doesn't do much in the way of optimization as of yet; that's been a 'coming feature' for years now.

Last I ran any Elm (0.17 I think?) it still took over 40 seconds to compile my work project. That same project, ported to OCaml/BuckleScript and even larger now than it was then, compiles from a full clean in, hmm, let's test:

╰─➤  node_modules/.bin/bsb -clean
Cleaning... 230 files.
╰─➤  time node_modules/.bin/bsb        
node_modules/.bin/bsb  1.08s user 0.28s system 169% cpu 0.803 total

And incremental recompiles, let me change one file, and run build again:

╰─➤  time node_modules/.bin/bsb
node_modules/.bin/bsb  0.10s user 0.04s system 68% cpu 0.194 total

So it seems quite fast enough to keep up with phoenix reloading as fast as I can save and alt-tab. :slight_smile:

NPM being the crowded place of mediocre JS libraries that it is, I like Elm's simple installation wrapper, but even more I like the fact that you can search, Hoogle-style, through all published packages.

OPAM then. ^.^

:man_shrugging: Your Mileage May Vary. In two years' time, a lot more packages have definitely been written.

Actually, as each new Elm version is backwards-incompatible with the older libraries, I've heard that there are fewer usable libraries overall now than there used to be?

This claim was made when Elm 0.18 was still the latest version. Most of the larger compiler bugs at that time have been fixed in the current version of Elm, 0.19.

I've been keeping a list of reported Elm 0.19 compiler bugs that happen to pop up on my radar (I hear about them a lot for some reason): all kinds of things, from debugger bugs, not handling DOM changes from browser extensions cleanly, TCO failing when pipes are used, 0.19 breaking matching on a negative number, the compiler generating bad code when the GitHub library owner's name starts with numbers, subscriptions not updating upon first load, a function-generation ordering bug in some cases, the inequality operator returning the wrong type, breaking CSS variables, the compiler crashing with certain file modification timestamps, port runtime errors with certain names, a compiler access violation with some code, the module dependency graph not being calculated properly, etc. And those are just the ones I've heard about on IRC (with GitHub issue links; I have a list here if you want it). As it stands, 0.19 having 'more' bugs than 0.18 is the reason quite a number of people have cited for porting from Elm to bucklescript-tea (there's a tool that helps auto-convert most code too now, made by the tea community ^.^).

So far based on IRC usage and each respective area’s Discourse forums, ReasonML/OCaml/Bucklescript/JSOO is significantly more popular than Elm and only getting more so as time goes on.

Honestly, I don't see either taking over though; rather, I see things moving to wasm and JavaScript slowly dying as DOM-integration features are added to the standard wasm browser lib. And yes, OCaml has a wasm backend in development, though personally I think Rust is quite a bit better there.

Preact is pretty cool actually, from the bit I've looked through; it's fast and slim. :slight_smile:


I think Elm's marketing about runtime exceptions is wrong; the JavaScript ecosystem has many tools to avoid common pitfalls. I have only seen unfair comparisons arguing how JavaScript has those kinds of exceptions, but that's because they are comparing valid-but-poor JavaScript code against Elm, a type-checked language. You can even get rid of some errors by writing proper JavaScript. Also, the benchmark is a bit misleading: it compares a plain virtual DOM implemented in Elm against libraries that offer many other advantages.

Personally, I wouldn't drop my JavaScript skills to learn another language and do the same thing. Still, it's a cool language for someone who loves FP.

Like all package managers, then. NPM is a mess in the sense of package dependencies, but saying that it is "the crowded place of mediocre JS libraries" is a bit disrespectful to authors; some packages may be intended for personal use. At some point Elm will have those kinds of packages too, and that's totally fine; not all libraries are meant to be relevant.


Statistics predict that there are excellent libraries as well. And indeed there are.

Point taken; it was not my intention to be disrespectful. And you are right: if Elm becomes very large, there will no doubt also be packages of low quality in its package manager.

However, this does not change the fact that (1) Elm prevents you from publishing a package without a correct Semantic Versioning number, (2) packages have auto-generated documentation that is immediately readable from within the package browser, and (3) that you can search through all functions in all packages using a dedicated search engine which makes it very easy to find the package you are looking for.
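That semver enforcement follows a simple rule, as I understand it: removing or changing anything exposed forces a major bump, adding something forces at least a minor bump, otherwise it's a patch. A toy sketch of the decision (my own simplification, treating an API as a map from exposed names to type-signature strings):

```javascript
// Toy sketch of the rule Elm's package manager enforces on publish:
// removed/changed exposed values -> major, additions -> minor,
// no API change -> patch. (My simplification, not the real tool.)
function requiredBump(oldApi, newApi) {
  const removedOrChanged = Object.keys(oldApi).some(
    (name) => newApi[name] !== oldApi[name]
  );
  if (removedOrChanged) return "major";
  const added = Object.keys(newApi).some((name) => !(name in oldApi));
  return added ? "minor" : "patch";
}

const v1 = { map: "(a -> b) -> List a -> List b" };
console.log(requiredBump(v1, { ...v1, filter: "(a -> Bool) -> List a -> List a" })); // prints: minor
console.log(requiredBump(v1, { map: "different type" })); // prints: major
console.log(requiredBump(v1, { ...v1 })); // prints: patch
```

Because the bump is computed from the API diff rather than chosen by the author, a version number actually tells you something.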

This gets very close to a No True Scotsman fallacy.
I am speaking from my own experience of working on two large React projects and two large (plus one smaller) Elm projects in the last two years. 'Large' here means: multiple people worked on them over a timespan of more than two months. In both of the React cases, it was very difficult to properly expand the architectural structure of the application and keep it well-behaved at all times. Refactoring became more and more necessary, but also increasingly difficult, as the application grew.
In Elm, this has not been a problem for us at all.

Of course, your mileage may very well vary, depending on what values you exactly prefer in a language.



The browser APIs have always been specified with reference to JavaScript, and I don't see that changing. So while the integration may become more "lightweight", I don't see even new APIs becoming more "typing friendly" - for JavaScript they don't have to be.

How does WebAssembly fit into the web platform?

So JavaScript “slowly dying” seems unrealistic at best (aside: Gary Bernhardt: The Birth & Death of JavaScript (2014))

Preact is pretty cool actually, from the bit I've looked through; it's fast and slim.

As (dangerously) lean as it is, it's still vDOM-based. hyperHTML and lit-html provide an alternative approach that works with the browser's capabilities, rather than creating something from scratch (in pure JavaScript). vDOM was great in 2013, but it shouldn't be considered the default now.
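The core trick behind hyperHTML/lit-html, as I understand it, is that tagged template literals hand the engine the static strings separately from the dynamic values, and the static array is the same object on every call from the same site, so only the values ever need re-checking. A toy version (my own sketch, not either library's real implementation):

```javascript
// Toy illustration of the tagged-template trick behind lit-html /
// hyperHTML: the tag function receives the static parts separately
// from the dynamic values, and JS guarantees the `strings` array is
// the *same* object for every call from the same template site, so
// an engine only needs to diff the values.
function html(strings, ...values) {
  return { strings, values };
}

const greet = (name) => html`<p>Hello, ${name}!</p>`;

const a = greet("Ada");
const b = greet("Grace");

console.log(a.strings === b.strings); // prints: true (statics are shared)
console.log(a.values); // the dynamic parts: [ 'Ada' ]
```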

Doesn’t stop this being a great quote from one of those authors who deserve respect:
Small modules: it’s not quite that simple:

I offer an additional explanation: that we in the JavaScript world have a higher tolerance for nonsense and dreck.


Wasm was designed to interact with simple layers, and both Firefox and Chrome are building (or, in Firefox's case, have already built) ways to pass more direct function pointers into wasm itself, reducing the overhead of a DOM call down to a single virtual dispatch (and even the virtual bit can be optimized out). It's a really cool direction they are going in. ^.^

Lol, even he says (comedian though he is) that wasm will just take over everything down to the OS level, with just an HTML GUI built on top to interact with. ^.^

Google's lit-html actually looks really cool as well; I've been meaning to look into it but haven't got around to it yet. It seems to be used in their latest Polymer library as well.

Some things on the DOM are slow, but in general, yeah. A VDOM's benefit isn't just that it doesn't touch the DOM (meaning fewer cache misses!), but also that its comparison code is really tightly packed and aligned, and thus really quick to iterate over and test for changes.

Honestly, I'm not actually a fan of VDOMs; I only made tea to port over my old big Elm app with minimal work. I'm personally a fan of observational changes, i.e. you hold a mapping of data -> view structures, and when you update the data, the mappers are called that transform the data as necessary and apply it to the exact DOM nodes, via direct references to them. It's hard to get faster than that: no VDOM iteration, nothing touched that is not specifically being changed, etc. There are a few good (really tiny) libraries out there that follow this pattern. :slight_smile:
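A toy sketch of that pattern (plain objects standing in for DOM nodes, and all the names here are mine, not from any particular library):

```javascript
// Toy sketch of the "observational" pattern: keep direct references
// from data keys to the exact nodes they affect, and on update run
// only the mappers for the key that changed. No tree diffing at all.
// A plain object stands in for a real DOM node here.
function createStore(data) {
  const bindings = {}; // key -> [mapper(value)]
  return {
    bind(key, fn) {
      bindings[key] = bindings[key] || [];
      bindings[key].push(fn);
      fn(data[key]); // apply the initial value immediately
    },
    set(key, value) {
      data[key] = value;
      (bindings[key] || []).forEach((fn) => fn(value));
    },
  };
}

const node = { textContent: "" }; // stand-in for a real DOM node
const store = createStore({ name: "world" });

// The mapper closes over a direct reference to its node:
store.bind("name", (v) => { node.textContent = `Hello, ${v}!`; });
console.log(node.textContent); // prints: Hello, world!

store.set("name", "Elm");
console.log(node.textContent); // prints: Hello, Elm!
```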

Care to link a few? It sounds interesting.