Elm vs Vue/React which one do you prefer to use with Phoenix, and why?

HTH! :-))

2 Likes

https://medium.com/@Pier/vue-js-the-good-the-meh-and-the-ugly-82800bbe6684

3 Likes

Thanks a lot for your in-depth review of the front-end frameworks. I wonder if performance is ever a deciding factor in which framework you use, considering that the VueJS core is only a few kilobytes and there’s a general assumption that it’s faster than the other frameworks, where Angular is the slowest (but I’ve seen lots of fast Angular websites anyway).

Do you consider performance, or was your React preference mostly due to the extra features you mentioned (ReasonML / ReasonReact)?

Because I don’t think it has been posted here yet:

https://elm-lang.org/blog/blazing-fast-html-round-two

It’s a benchmark (which should, as all benchmarks, be taken with a grain of salt) which shows that Elm in certain cases is twice as fast as React (and 1.5 times as fast as Angular), while also producing smaller compressed JS files.


I still stand by my former opinion: writing a front-end application in Elm results in a much more understandable and maintainable application. React/Vue/Angular are better suited for quick prototyping because of their dynamic natures, but I would not want to use them for a large project that should live for more than a couple of months. I have yet to find two React teams with the same architectural setup in their soup of libraries, because React only does the rendering part. What do you use for promises? Remote requests? Event handling? Redirection? State management? JSON parsing with humanly readable error messages? Working with websockets? BigDecimal numbers? Transpiling to support older browsers? Etc.

I think that even if you already know JS, learning Elm with its clear documentation and well-defined architecture that is the same across all apps is less of a time and effort investment than learning that whole soup of JS libraries. Also read How it feels to learn JS in 2016. :slightly_smiling_face:

3 Likes

What do you think about the stated problems with JS library interop (e.g. Elm vs Vue/React which one do you prefer to use with Phoenix, and why?) and other critiques like Elm - General Discussion, Blog Posts, Wiki (there is more to be found on this forum)? What do you think about ReasonML compared to Elm?

I have no extensive experience with ReasonML (or any variant of OCaml; I have only followed a basic guide so far), so I can only say that Elm’s purity is both a blessing and a curse:

  • It is a blessing for writing maintainable software, because there is no way to put in ‘cheats’ or ‘duct tape solutions’ that would break referential transparency.
  • It is a curse for quick prototyping.

OCaml is not pure, which means that while I think it would be better for prototyping, I would expect projects written in it to also become messier over time. But again, I have no experience with OCaml, so I cannot make an accurate comparison here.

JS interop is indeed difficult, and deliberately so: it forces you to ask, every time, whether the work you want to do would not be nicer to perform in a pure, strongly typed environment rather than in a separate, asynchronously running, untyped one.
The main ‘difficulty’ with JS interop is that you (1) need to be aware that the code will run asynchronously and (2) that the returned result might be any JS type, so you will need to check whether the response is indeed what you expected once you enter Elm-land again.
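As a rough illustration of point (2), here is what that kind of check looks like in plain JavaScript (in real Elm this role is played by JSON decoders; the names below are made up for the example):

```javascript
// Sketch: an untyped value has come back from JS-land, and we must verify
// its shape before treating it as typed data again. Illustrative only.
function decodeUser(value) {
  if (
    typeof value === "object" && value !== null &&
    typeof value.name === "string" &&
    typeof value.age === "number"
  ) {
    return { ok: true, user: { name: value.name, age: value.age } };
  }
  return { ok: false, error: "expected { name: string, age: number }" };
}
```

The important part is that the unhappy path is a value, not an exception: code back in Elm-land can only proceed once the shape has actually been confirmed.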

I think there is no way around this methodology if you want to keep the nice properties that pure functional programming provides (along with the insane reasoning and optimizations the compiler can do for you).

The main critique of what Elm currently does is that a feature known as ‘Task ports’ does not exist (yet): writing a round trip to JS requires two bits of boilerplate rather than one. But libraries like Porter exist to mitigate the need for (almost all of) that boilerplate.
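For the curious, the trick such libraries use can be sketched in a few lines of JavaScript: tag each outgoing message with an id, and use that id to route the reply back to the right caller. This is only the general correlation-id pattern, not Porter’s actual API:

```javascript
// Sketch of the correlation-id pattern for turning two one-way channels
// (e.g. an outgoing and an incoming Elm port) into one request/response
// round trip. All names are illustrative.
function makeRoundTrip(send) {
  let nextId = 0;
  const pending = new Map();
  return {
    // Caller side: send a payload and register a callback for the reply.
    request(payload, onReply) {
      const id = nextId++;
      pending.set(id, onReply);
      send({ id, payload });
    },
    // Responder side: the id on the reply routes it to the right caller.
    respond(id, payload) {
      const onReply = pending.get(id);
      pending.delete(id);
      onReply(payload);
    },
  };
}
```

With two Elm ports, `send` would feed the outgoing port and `respond` would be wired to the incoming one, so application code only ever sees one request/response call.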


To respond to the points that @OvermindDL1 wrote in his post from nearly two years ago:

This is not something I have benchmarked, so I cannot make any claims about it, other than that Elm’s compilation has so far never been too slow for me.

NPM being the crowded place of mediocre JS libraries as it is, I like Elm’s simple installation wrapper, but even more do I like the fact that you can search, Hoogle-style, through all published packages.

:man_shrugging: Your Mileage May Vary. In two years of time, a lot more packages have definitely been written.

This claim was still made when Elm 0.18 was the latest version. Most of the larger compiler-bugs at that time were fixed in the current version of Elm, 0.19.

Yes, Elm’s package management is currently bound to GitHub. This is also something that I’d like to see change in the future.


But my TL;DR is still: not having a way to cheat really helps the maintainability of your project, and I think Elm will win in that category over OCaml-based systems because of that inherent difference between the two stacks.

2 Likes

Thanks. My friend Google adds a lot of answers too, e.g. https://stackoverflow.com/questions/51015850/reasonml-vs-elm

1 Like

At the time I circled back around to React because that seemed to be the foundation of any number of functional frameworks like reagent (re-frame), Om, thermite, and of course ReasonReact.

I think we’re at a point now where nobody ever got fired for choosing React.js. Unless you’re at Google, but they’ve got Jason Miller. And it’s somewhat ironic that React is being increasingly used with TypeScript (that was supposed to be Facebook Flow’s job).

where Angular is the slowest (but I’ve seen lots of fast Angular websites anyway).

I’ve always felt that AngularJS was primarily targeting enterprise desktop browsers. As mobile has become more prevalent, Angular has tried to get leaner, but its inherent design isn’t that lean to begin with. More recently I’ve decided that Google’s “browser people” make a lot more sense than Google’s “framework people”. The browser-centric viewpoint is much more aware of the web’s constraints and limitations, while the framework-centric mindset often makes questionable abstractions in a futile attempt to hide these intrinsic constraints of distribution (some think that Facebook is actively trying to abstract the Web away).

I wonder if performance is ever a deciding factor

The Google-centric view is that the quantity of JavaScript shipped to the browser is a major concern - The Cost Of JavaScript In 2018. So JavaScript-heavy applications will have to employ tactics (code splitting, asynchronous imports, SSR) which increase the solution complexity even more in order to deliver an adequate user experience.

But other performance characteristics can vary wildly between frameworks:

Squoosh is Google’s current performance PWA demonstration, which is described in Complex JS-heavy Web Apps, Avoiding the Slow (Chrome Dev Summit 2018). They claim that Angular, React, and Vue are too heavy for their budget, ultimately arriving at Preact - and even then some of the heavy lifting is outsourced to vanilla web components. In this case “performance” seems to have required a pool of fairly high-level talent and a significant level of effort - probably more than many ventures are willing to engage in (or have access to).


That discussion kind of flared up again here - ReasonML vs Elm
Leaves one with the impression that right now isn’t the time to get into ReasonML/ReasonReact.


clojureD 2019: “Our Journey from Elm and Elixir to Clojure” by Martin Kavalar describes a case where a project went to (isomorphic?) Clojure(Script):

  • initially to escape the constraints of Elm’s JS interoperability
  • and then to gain full unfettered access to the capabilities of Datomic
2 Likes

For note, I added bucklescript-tea to that same benchmark and it profiled ‘slightly’ faster than Elm. ^.^

Having a mutable reference type does not mean it is not pure; the base language itself is pure. Its “escape hatch” is the external type, which in BuckleScript lets you call out to JavaScript via a typesafe call. It’s like TypeScript in that way, and it is a huge boon in actually getting real work done. :slight_smile:

You do have to remain typesafe the entire time; there is no real escape hatch from that (a pox on those that use Obj.magic! *coughs*), and projects do not become messier over time in it; the type system enforces that to an extreme.

Both TypeScript and OCaml allow you to call JavaScript in a type-safe way by declaring its type. This is more like Rust’s unsafe: you declare the types the unsafe code should take in and give back, but whatever it does inside is up in the air, and if it fails you know it is failing in one of those ‘unsafe blocks’, which gives you a very, very restricted area where you need to check for crash bugs. :slight_smile:

Also, Elm’s compiler doesn’t do much in the way of optimization as of yet, that’s still a ‘coming feature’ for years now.

Last I ran any Elm (0.17 I think?), it still took over 40 seconds to compile my work project. That same project, ported to OCaml/BuckleScript and even larger now than it was then, compiles from a full clean in, hmm, let’s test:

╰─➤  node_modules/.bin/bsb -clean
Cleaning... 230 files.
╰─➤  time node_modules/.bin/bsb        
...snip...
node_modules/.bin/bsb  1.08s user 0.28s system 169% cpu 0.803 total

And incremental recompiles, let me change one file, and run build again:

╰─➤  time node_modules/.bin/bsb
...snip...
node_modules/.bin/bsb  0.10s user 0.04s system 68% cpu 0.194 total

So it seems quite fast enough to keep up with phoenix reloading as fast as I can save and alt-tab. :slight_smile:

NPM being the crowded place of mediocre JS libraries as it is, I like Elm’s simple installation wrapper, but even more do I like the fact that you can search, Hoogle-style, through all published packages.

OPAM then. ^.^

:man_shrugging: Your Mileage May Vary. In two years of time, a lot more packages have definitely been written.

Actually, as each new Elm version is backwards-incompatible with the older libraries, I’ve heard that the overall number of usable libraries is smaller now than it used to be?

This claim was still made when Elm 0.18 was the latest version. Most of the larger compiler-bugs at that time were fixed in the current version of Elm, 0.19.

I’ve been keeping a list of reported Elm 0.19 compiler bugs that happen to pop up on my radar (I hear about them a lot for some reason): all kinds of things, from debugger bugs, to not handling DOM changes from browser extensions cleanly, TCO failing when pipes are used, 0.19 breaking matching on a negative number, the compiler generating bad code when the GitHub library owner’s name starts with numbers, subscriptions not updating upon first load, a function-generation ordering bug in some cases, the inequality operator returning the wrong type, breaking CSS variables, the compiler crashing with certain file modification timestamps, port runtime errors with certain names, a compiler access violation with some code, the module dependency graph not being calculated properly, etc. And those are just the ones I’ve heard about on IRC (with GitHub issue links; I have a list here if you want it). As it stands, 0.19 having ‘more’ bugs than 0.18 is the reason quite a number of people cite for having ported from Elm to bucklescript-tea (there’s a tool made by the tea community that helps auto-convert most code now, too ^.^).

So far based on IRC usage and each respective area’s Discourse forums, ReasonML/OCaml/Bucklescript/JSOO is significantly more popular than Elm and only getting more so as time goes on.

Honestly, I don’t see either taking over, though; rather, I see things moving to wasm and JavaScript slowly dying as DOM integration features are added to the standard wasm browser lib. And yes, OCaml has a wasm backend in the works, though personally I think Rust is quite a bit better there.

Preact is pretty cool actually at the bit I’ve looked through, it’s fast and slim. :slight_smile:

2 Likes

I think Elm’s marketing about runtime exceptions is wrong; the JavaScript ecosystem has many tools to avoid common pitfalls. I have only seen unfair comparisons arguing how JavaScript has those kinds of exceptions, but that’s because they are comparing valid but poor JavaScript code against Elm, a type-checked language. You can even get rid of some errors by writing proper JavaScript. The benchmark is also a bit misleading: they are comparing a plain virtual DOM implemented in Elm against libraries that offer many other advantages.

Personally, I wouldn’t drop my JavaScript skills to learn another language and do the same thing. Still, it is a cool language for someone who loves FP.

Like all package managers, then. NPM is a mess in the sense of package dependencies, but saying that it is “the crowded place of mediocre JS libraries” is a bit disrespectful to authors; some packages may be intended for personal use. At some point Elm will have those kinds of packages too, and that’s totally fine: not all libraries are meant to be relevant.

2 Likes

Statistics predict that there are excellent libraries as well. And indeed there are.

Point taken; it was not my intention to be disrespectful. And you are right: if Elm becomes very large, there will no doubt also be packages of low quality in its package manager.

However, this does not change the fact that (1) Elm prevents you from publishing a package without a correct Semantic Versioning number, (2) packages have auto-generated documentation that is immediately readable from within the package browser, and (3) that you can search through all functions in all packages using a dedicated search engine which makes it very easy to find the package you are looking for.
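Point (1) can be illustrated with a toy version of the rule: compare the old and new public API and derive the minimum allowed version bump. This is only a sketch of the idea; Elm’s real comparison works on full type signatures, not the flat maps used here:

```javascript
// Toy semver rule: removed or changed exports force a major bump, new
// exports force at least a minor bump, otherwise a patch is enough.
// APIs are modeled as { exportName: typeSignatureString }.
function requiredBump(oldApi, newApi) {
  const breaking = Object.keys(oldApi).some(
    (name) => !(name in newApi) || newApi[name] !== oldApi[name]
  );
  if (breaking) return "major";
  const added = Object.keys(newApi).some((name) => !(name in oldApi));
  return added ? "minor" : "patch";
}
```

Because the package manager can compute this mechanically from the published API, an author simply cannot ship a breaking change under a minor version number.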

This gets very close to a no true scotsman fallacy.
I am speaking about my own experiences working on two large React projects and two large (plus one smaller) Elm projects over the last two years. ‘Large’ here means multiple people worked on them over a timespan of more than two months. With both React applications, it was very difficult to properly expand the architectural structure and keep them well-behaved at all times. Refactoring them became more and more necessary, but also increasingly difficult, as the applications grew.
In Elm, this has not been a problem for us at all.

Of course, your mileage may very well vary, depending on what values you exactly prefer in a language.

:man_shrugging:

3 Likes

The browser APIs have always been specified with reference to JavaScript and I don’t see that changing. So while the integration may become more “lightweight”, I don’t see even new APIs being more “typing friendly” - for JavaScript they don’t have to be.

How does WebAssembly fit into the web platform?

So JavaScript “slowly dying” seems unrealistic at best (aside: Gary Bernhardt: The Birth & Death of JavaScript (2014))

Preact is pretty cool actually at the bit I’ve looked through, it’s fast and slim.

As (dangerously) lean as it is, it’s still vDOM-based. hyperHTML and lit-html provide an alternative approach that works with the browser’s capabilities, rather than creating something from scratch (in pure JavaScript). vDOM was great in 2013, but it shouldn’t be considered the default now.

Doesn’t stop this being a great quote from one of those authors who deserve respect:
Small modules: it’s not quite that simple:

I offer an additional explanation: that we in the JavaScript world have a higher tolerance for nonsense and dreck.

1 Like

Wasm was designed to interact with simple layers, and both Firefox and Chrome are building (or in Firefox’s case, have already built) ways to pass more direct function pointers into wasm itself, reducing the overhead of a DOM call down to a single virtual dispatch (and even the virtual bit can be optimized out). It’s a really cool direction they are going in. ^.^

Lol, even he says (comedian though he is) that wasm will just take over everything down to the OS level with just an HTML GUI built on top to interact with. ^.^

Google’s lit-html actually looks really cool as well; I’ve been meaning to look into it but haven’t gotten around to it yet. It seems to be used in their latest Polymer library as well.

Some things on the DOM are slow, but in general, yeah. A vDOM’s benefits aren’t just that it doesn’t touch the DOM (meaning fewer cache misses!) but also that its comparison code is really tightly packed and aligned, and thus really quick to iterate over and test for changes.

Honestly, I’m not actually a fan of vDOMs; I only made tea to port over my old big Elm app with minimal work. I’m personally a fan of observational changes, i.e. you hold a mapping of data → view structures, and when you update the data, the mappers are called, transforming the data as necessary and applying it to the exact DOM nodes via direct references to them. It’s hard to get faster than that: no vDOM iteration, nothing touched that is not specifically being changed, etc. There are a few good (really tiny) libraries out there that follow this pattern. :slight_smile:
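The pattern being described can be sketched in a handful of lines; in a real library the watcher functions would hold direct references to DOM nodes, but the shape is the same (all names here are made up for the example):

```javascript
// Minimal "observational" cell: a piece of data that knows exactly which
// update functions depend on it, so a change runs only those functions,
// with no vDOM diffing pass in between.
function cell(initial) {
  let value = initial;
  const watchers = [];
  return {
    get: () => value,
    // Register an updater; in a DOM setting this would write into a node
    // it holds a direct reference to. Called once immediately so the view
    // starts in sync with the data.
    watch(fn) { watchers.push(fn); fn(value); },
    set(next) {
      if (next === value) return; // unchanged: touch nothing at all
      value = next;
      for (const fn of watchers) fn(next);
    },
  };
}
```

The early return on an unchanged value is the whole point: work is proportional to what actually changed, not to the size of the view tree.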

Care to link a few? It sounds interesting.

I’m really bad about keeping links around (or rather I keep too many around with too little context…). ^.^;

Let me see… I think redom was one of the ones I liked on a cursory look; it wasn’t my favorite that I have found so far (which of course is no doubt incomplete considering all the JS stuff out there), but it was good. CycleJS was a really good looking one as well. There was one that I can’t remember that was my favorite though, can’t recall the name and not finding it on google. Google’s Polymer is of course another though it comes with so much stuff that it is a bit heavier. Can’t recall the one that I really liked though (hmm, it used a single function to register a new data endpoint, to link data nodes together, to push updates to specific DOM elements, etc… etc… really simple, and tiny)…

There are a number of libraries for C++, OCaml, Rust, etc… that have similar updating features (though not DOM related, rather just more generic) as well.

1 Like

I tried to learn a front end framework after not doing front end for a year or so.

I did ember and angular 1 before this.

I did my research and chose Vue.js. It seemed better for me because I have tons of experience in JavaScript. Elm didn’t seem transferable in skillset, and front-end dev moves too quickly for my taste.

Currently I’m just using jQuery, and I’m so happy with it. Vue.js and the front-end ecosystem move too fast for me, and as a full-stack dev I can’t be a master of it all unless I have no life and don’t want a family. If I were to stick with mastering Vue.js, I would force myself to learn TypeScript; it’s the direction the library is moving in version 3. I’ll come back when Vue.js development slows down. Also, when web components come out and stabilize, all these front-end web frameworks are going to change, and it’ll make it a hell of a lot easier to do web dev (knock on wood).

You should try not using jQuery, just plain ol’ vanilla JavaScript. jQuery was a crutch for the days of IE8 and Netscape Navigator; it is entirely unnecessary on the modern web. Take a look at the performance difference as well: jQuery is significantly slower because of all the backwards-compatibility checks within it. For example, unpoly.js swapped jQuery out for normal JavaScript calls and got a significant performance boost along with a substantial reduction in packaged code. :slight_smile:

3 Likes

Currently I’m using Bootstrap 4, so I’m sticking with jQuery until Bootstrap 5 is released and matures. Basically, I’m too lazy to invest time in the front end for my side projects >___<.

Have a look at a (no-frills) lit-html with unistore combo (unlit.js)

lit-html (like hyperHTML before it) accepts JS tagged template literals as “HTML templates”. Once parsed (by the browser), it stores the resulting template element in a cache indexed by the template literal that created it. So the template literal is only ever parsed once, and creating DOM elements from a template element is fast by design (at least that is how I understand it).
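The reason that cache works is a JavaScript guarantee: a tagged template’s strings array is the same object every time a given call site runs, so it can serve as a cache key by identity. A tiny sketch of the idea (not lit-html’s real implementation, which caches a parsed `<template>` element):

```javascript
// Sketch of caching by template-literal identity. JS hands the tag function
// the *same* strings array for every invocation of a given call site, so a
// Map keyed on it caches the "parsed" form across renders.
const templateCache = new Map();
function html(strings, ...values) {
  let tpl = templateCache.get(strings);
  if (tpl === undefined) {
    tpl = { source: strings.join("{{slot}}") }; // stand-in for real parsing
    templateCache.set(strings, tpl);
  }
  return { template: tpl, values };
}
```

Rendering the same template with different values then reuses one parsed template and only has to slot in the new values.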

The video lit-html (Chrome Dev Summit 2017) goes into a bit more detail but to appreciate what is being conveyed some deeper than average understanding of a browser’s capability is necessary.

My informal (likely oversimplified and inaccurate) perception:

  • React uses the vDOM as a firewall between itself, written in “it’s just JavaScript”, and the browser.
  • lit-html uses every possible opportunity to use the browser for what it is good at.

The template element is part of the web component spec so of course there is LitElement which uses lit-html.

I remember coming across rumours that Polymer 3.0 primarily exists as a migration target for the existing install base. If the Polymer team continues to exist, it is likely to move lit-html/LitElement along, provided there is any traction. The Polymer landing page is suggestive, at any rate:

The libraries now seem much more modular than Polymer proper (or worse, Angular).
Polymer Blog

Sounds like hyperApp, but that uses the vDOM approach (an h function generates vDOM elements; an app function registers everything).

Actually looking at round 8 seems domc is more likely …


Aside:

search - results | Marko benchmark:
Marko vs Preact vs React vs Vue vs Inferno

Seems the claim is that lit-html/LitElement should be in the neighbourhood of Marko:

  • B Marko (226 ops/sec)
  • A Inferno (210 ops/sec)
  • C React (172 ops/sec)
  • D Preact (169 ops/sec)
  • E Vue (155 ops/sec)

color-picker | Marko benchmark

4 Likes