LiveView with non-HTML frameworks for mobile development

Hi everyone,

I was wondering how easy or hard it would be to make LiveView work with frameworks that are not HTML-based.

For example, imagine that you want to develop a new system that should have web and mobile support.

Today I would say that LiveView is not the best fit for it, unless you are OK with your app simply being an HTML-rendered page, which IMO seems very alien compared to native apps in a mobile environment.

So I was wondering if we could extend LiveView to support multiple types of render functions. What I mean by that is that you could have one render function for HTML and another render function for Flutter.

Of course, this also means that we need a library to render what LiveView sends on the Flutter side.

Also, maybe another idea would be to ditch the render altogether and just have a way to link Flutter state with LiveView state. That way, we could use the LiveView protocol to update the Flutter page state without any boilerplate or custom API, while keeping the frontend “design” code in Flutter, so the app is more “responsive” and works better in offline mode.

What are your thoughts? Also, I used Flutter here because it is the framework I use for mobile dev, but I think the idea applies to any other mobile framework too.


Isn’t this just channels?

This doesn’t sound like LiveView to me at all. The whole point of LiveView is that you are doing server-side rendering. If you aren’t doing server-side rendering, then there are other tools you should use.

Well, sure, I get what you are saying, but at the same time, one big benefit of LiveView is that you have a well-defined and minimal WebSocket API that is transparent to the frontend.

For me, at least, this is a big win. So even if the frontend is not rendered by the backend, as it is right now with HTML, just having this API work transparently with other frameworks would mean that I could extend my LiveView code to work for mobile, instead of having to double my work just to have something for that too.

I didn’t think it all the way through to see whether it would actually be possible, or whether there would be corner cases that would invalidate the whole thing, but that is one reason I created this discussion anyway :slight_smile:

Also, I did mention rendering from the backend too; I just think that for mobile, probably the biggest advantage is the API part, not the rendering part.

It honestly sounds like you just want Channels. Channels are a well-defined, minimal API. LiveView exists explicitly to translate server-side state into HTML.

Can you provide a concrete proposal for how a LiveView template could “transparently” be either HTML or some other wire format used by Flutter?

There could be a way that is halfway between LiveView and Channels. For example, what if you sent over not the rendered diff, as LiveView does, but only a diff of the socket assigns, and somehow used that diff to update the client UI reactively?
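To make that idea concrete, here is a minimal client-side sketch of what “use the assign diff to update the UI reactively” could look like. Everything here is hypothetical: `createAssignStore`, `subscribe`, and `applyAssignDiff` are illustrative names, not part of any real LiveView client API, and the diff is assumed to be a plain map of changed assigns.

```javascript
// Hypothetical sketch: keep client-side state in a store, let widgets
// subscribe to individual assign keys, and apply server diffs so only
// the widgets bound to changed assigns are notified.
function createAssignStore(initial) {
  const state = { ...initial };
  const subscribers = {}; // assign key -> list of callbacks

  return {
    // A widget registers interest in a single assign key.
    subscribe(key, callback) {
      (subscribers[key] = subscribers[key] || []).push(callback);
    },
    // Apply a diff of assigns: only keys present in the diff change,
    // and only their subscribers are called.
    applyAssignDiff(diff) {
      for (const [key, value] of Object.entries(diff)) {
        state[key] = value;
        (subscribers[key] || []).forEach((cb) => cb(value));
      }
    },
    get(key) {
      return state[key];
    },
  };
}
```

A mobile framework would plug its own widget-rebuild mechanism into the `subscribe` callback, which is essentially what the `LiveWidget` idea below amounts to.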

As I said before, I haven’t thought too much about how this should be implemented, so take my suggestion below with a grain of salt…

One way this could possibly work is having a special widget in Flutter called LiveWidget, which would be used to identify which element should be updated by which assign in the LiveView state. For example:

class _Example extends State<Example> {
  @override
  Widget build(BuildContext context) {
    return Container(
      child: LiveWidget(
        id: 'counter',
        child: (value) => Text('$value'),
      ),
    );
  }
}
And this code in Elixir:

defmodule CounterLive do
  use Phoenix.LiveView

  def mount(_params, _session, socket) do
    {:ok, assign(socket, counter: 0)}
  end

  def render(assigns) do
    ~L"""
    <live_widget id="counter"><%= @counter %></live_widget>
    """
  end
end

Then, somehow, during initialization, the counter id would be turned into an index number that both the frontend and the backend agree on, and the frontend would then know what should change after receiving diff messages via the LiveView protocol.
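The id-to-index handshake described above could be sketched like this. To be clear, this is speculative: the `manifest` the server would send on join, and the `createIndexCodec`/`encode`/`decode` names, are all assumptions for illustration, not anything LiveView actually does today.

```javascript
// Hypothetical sketch: on join, the server sends a manifest mapping
// assign names to integer indexes; afterwards, diffs on the wire only
// carry indexes, and the client translates them back to names.
function createIndexCodec(manifest) {
  // manifest: e.g. { counter: 0, username: 1 }
  const byIndex = Object.fromEntries(
    Object.entries(manifest).map(([name, idx]) => [idx, name])
  );
  return {
    // Server side: turn a diff of named assigns into an indexed diff.
    encode(diff) {
      const out = {};
      for (const [name, value] of Object.entries(diff)) {
        out[manifest[name]] = value;
      }
      return out;
    },
    // Client side: turn an indexed diff back into named assigns, which
    // can then be routed to the widget registered under that id.
    decode(indexedDiff) {
      const out = {};
      for (const [idx, value] of Object.entries(indexedDiff)) {
        out[byIndex[idx]] = value;
      }
      return out;
    },
  };
}
```

The point of the indexes is just to keep the wire format small, which is also the spirit of LiveView’s own rendered-diff optimization.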

Another thing: I agree with you that simply using Phoenix Channels solves this, but that is not my point. My point is that using only Phoenix Channels would mean that I still need to handle all the changes in the frontend manually and create my own protocol of state changes too.

Maybe see it this way: forget about the rendering part and just focus on the LiveView diff protocol. Basically, as @derek-zhou said, if we could somehow link a widget to a diff index in the LiveView protocol, we would be able to send the assign diff via the WebSocket, and the frontend would know where to change the current page state accordingly.

So, in the end, maybe the question is more about how hard it would be to support the LiveView protocol in other frameworks (the same way that LiveView injects JavaScript that understands its protocol to make the changes in the browser), and how to “link” the assign diff to real widgets on the frontend side.

Sounds like maybe something like this (disclaimer: I have not tried it, and it has been a while since I watched the video): LiveData - a (perfect) marriage of JavaScript and Elixir - YouTube