Server side rendering - do I need react?

Hey everyone. Most of my experience is with Node.js + Express, and on the front end Angular 2 and React. I’m building out a multi-page site similar to Quora, GitHub, Yelp, Foursquare, etc. I have only ever worked with SPAs before, and even though I have experience with angular-universal and server-side rendering with React, I am wondering whether, for this particular use case, I even need React. Please bear with me while I try to explain; I’m hoping some of you might have experience to share (I’m fairly new to web dev!).

My front-end routes are fairly simple:

  • /explore/:location for a nice Google Maps view + results list for things in your area. This is basically my core page.
  • /user/:name for user profiles
  • /groups/:name for groups
  • /events/:name for events

I built out the backend using Node.js + Express + Postgres, and it is deployed on AWS.

I have built a front-end POC with server-side rendered React + Redux. I need server-side rendering here because I use geolocation to serve users results in (or close to) their current location. I am using Redis for caching, so I keep a bunch of rendered HTML pages keyed by location, which can be updated as the data stores update. I’m doing this mainly for indexing reasons, and also because I think it’s cool. It is possible I am doing this totally incorrectly, but like I said, it’s just me and I’m new to this.
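The caching idea above can be sketched roughly like this. This is a minimal illustration, not my actual code: a Map stands in for Redis (with Redis you’d do GET/SET/DEL on the same keys), and all the names (`cacheKey`, `getExplorePage`, `invalidate`) are made up for the example.

```javascript
// Cache rendered HTML per location; invalidate when the data store updates.
// A Map stands in for Redis here purely for illustration.
const cache = new Map();

function cacheKey(location) {
  return `explore:${location.toLowerCase()}`;
}

// `render` is the expensive step (e.g. ReactDOMServer.renderToString + Redux).
function getExplorePage(location, render) {
  const key = cacheKey(location);
  if (cache.has(key)) return cache.get(key);
  const html = render(location);
  cache.set(key, html);
  return html;
}

// Call this whenever the data behind a location changes.
function invalidate(location) {
  cache.delete(cacheKey(location));
}
```

The point is just that the expensive render runs once per location and the cache entry is dropped when its data goes stale.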

The issue is that even with lots of code splitting and a webpack config that makes me feel dizzy (even though I wrote it!), I am really only making use of what React offers on the /explore map page. For the rest, I can probably avoid having to serve a 180 KB gzipped JS bundle that parses out to a pretty big 670 KB, most of which is really only for the /explore page…

What would you suggest here? I am thinking of cutting out React completely but still using Redux: server-side rendered Redux state that my views can consume, with jQuery or something for any DOM manipulation. Does that make sense?
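To make the “server-rendered Redux state consumed by views” idea concrete, here’s the shape I have in mind — a hypothetical sketch, with `__PRELOADED_STATE__` and the helper names being my own illustration rather than anything from a library:

```javascript
// Serialize the server-built Redux state into the page so plain JS (or
// jQuery) can read it on the client without React.

// Escape "<" so user-supplied data can't break out of the <script> tag
// (the classic XSS hazard when inlining JSON into HTML).
function serializeState(state) {
  return JSON.stringify(state).replace(/</g, "\\u003c");
}

function renderStateScript(state) {
  return `<script>window.__PRELOADED_STATE__ = ${serializeState(state)}</script>`;
}
```

The server template would embed `renderStateScript(store.getState())` in the page, and client scripts would read `window.__PRELOADED_STATE__` directly.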


There’s a piece from thoughtbot on the topic:


For me, Phoenix + Turbolinks works really well. For real-time stuff (like notifications and comments) I’m using a simple jQuery script + Phoenix Channels. It’s not fancy, but it works.
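Roughly, the jQuery + Channels setup looks like this — a sketch assuming the standard phoenix.js client, with the topic name, event name, and payload shape all being illustrative rather than from my actual app:

```javascript
// Turn a channel payload into a DOM fragment. Kept pure so it's easy to test;
// in a real app you'd want to escape user-supplied fields before inserting.
function renderNotification({ author, body }) {
  return `<li class="notification"><strong>${author}</strong>: ${body}</li>`;
}

// Browser-only wiring (not executed here), using the standard phoenix.js API:
//   import { Socket } from "phoenix";
//   const socket = new Socket("/socket", { params: { token: window.userToken } });
//   socket.connect();
//   const channel = socket.channel("notifications:lobby", {});
//   channel.join();
//   channel.on("new_notification", (payload) => {
//     $("#notifications").append(renderNotification(payload));
//   });
```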


In my experience, if you are going to write your client-side code in React then you really don’t end up using Phoenix’s templates and views very much. You are much better off simply using Phoenix to provide a back-end API and writing your React front end against that API.

The sticking point I’ve run into involves getting values from the “Elixir world” of .eex templates into the JavaScript world of the encapsulated JavaScript modules where your React components live.
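One common workaround for that gap is to have the .eex template render the values as JSON into the page (e.g. a `<script type="application/json">` tag or a data attribute) and parse them where the component mounts. The element id and template helper below are hypothetical, just to show the JavaScript side of the hand-off:

```javascript
// Parse JSON that the server template embedded in the page, with a fallback
// for missing or malformed payloads.
function readPageData(rawJson, fallback = {}) {
  if (!rawJson) return fallback;
  try {
    return JSON.parse(rawJson);
  } catch (e) {
    return fallback;
  }
}

// Usage in the browser (not run here), assuming the .eex template rendered
// something like <script id="page-data" type="application/json">...</script>:
//   const el = document.getElementById("page-data");
//   const props = readPageData(el && el.textContent);
//   // ...then pass `props` to whatever mounts your React component.
```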

Using something like webpack, it is possible to set up a system that loads individual JavaScript bundles on a per-Phoenix-View basis… but the solutions I’ve seen, or have created, seem very ad hoc and require more discipline, rigor, and understanding on the part of the developer than I personally prefer.

I’ve had success using both Elm and React with Phoenix, but I haven’t really found a good way to mix either of them with Phoenix templates and .eex (apart from perhaps embedding individual components in .eex here and there).



When you’re using Phoenix + Turbolinks, are you using the Turbolinks caching functionality as well, or just the pjax-style page updates?

I ended up using Drab and Unpoly, and it works great. I like that I mostly never have to mess with JS. Really simple, but really powerful.


Just the pjax updates. Phoenix renders fast as hell, so I just need the app to look like a SPA.


Plus, with Unpoly.js you can have it pre-cache some links, have it cache on hover (after a certain delay, even, if you want), have it start loading on mousedown instead of mouseup, etc., etc. It makes the app ‘feel’ fast. 🙂


Ohhh, nice! I’ll definitely look into this! Thanks!
