Support async operations with liveview streams

I’m copying and pasting this issue from @Terbium-135 here verbatim, because I would be interested in such a feature as well, and because I don’t see any discussion about it on this forum.

The new async functions around assign_async are very nice! They're easy to set up and a good enhancement to Phoenix LiveView's functionality.

In my opinion it is very likely that these functions will be used for database queries, which may or may not return a large amount of data for the initial rendering.

Since the general recommendation is to use streams (which are also a great improvement and simplification), I would like to see a stream_async function supported.

If this is not the proper place to ask for this, feel free to delete/close this issue.

Sincerely

In short, now that we have assign_async, it seems natural (to me) that we should also have stream_async.

Is there any particular reason why stream_async is not implemented at the moment, or is it more a question of bandwidth?


assign_async is implemented on top of start_async and the handle_async callback. So you could use those to build your own asynchronous behavior and add to the stream in the callback.


Most definitely! I guess the question is more along the lines of: Since streams are supported (and encouraged?), many people will probably have a need to implement their own stream_async functionality. Could we make their lives easier by having a standard implementation in LiveView without adding too heavy a burden on the LiveView maintainers?

Or is the idea that streams and async are two semi-orthogonal ways to (among other things) handle the issue of latency, and that they shouldn’t be mixed?

LV streams are not related to latency at all. They're about memory optimization of the server-side process. They exist so you don't need to keep (potentially long) lists of data in memory for the lifecycle of the LV process, while still being able to add/update/delete individual rows when necessary.

LV streams don’t care at all how you load data that you send to the client.
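For illustration, here's a minimal sketch of that row-level lifecycle. The `:songs` stream and the `MyApp.Music` context functions are made-up names; the point is that after the initial render, the server only ever touches individual rows:

```elixir
# Hypothetical LiveView: the full list is rendered once and then dropped
# from server memory; later messages touch single rows only.
def mount(_params, _session, socket) do
  {:ok, stream(socket, :songs, MyApp.Music.list_songs())}
end

def handle_info({:song_created, song}, socket) do
  # Appends one row on the client without holding the list in assigns.
  {:noreply, stream_insert(socket, :songs, song)}
end

def handle_info({:song_deleted, song}, socket) do
  {:noreply, stream_delete(socket, :songs, song)}
end
```

How the `song` structs were loaded (synchronously, in a task, from PubSub) is invisible to the stream itself.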


Fair enough, I’m probably too biased by my use of streams to handle infinite-scrolling.

Anyway, that’s beside the point, forget I ever talked about latency. My original question still stands.

Perhaps it should, but we probably need to let the idea bake a bit more. Feel free to share what you come up with as one of the starting points. :slight_smile:


Sounds reasonable, will do!

I finally got around to trying to implement this using start_async/3 and handle_async/3. It turns out to be rather simple to do, so perhaps it doesn’t warrant a macro at all.

Because streams must be lists, I needed to add an extra assign to control whether or not the stream had been asynchronously loaded. Here’s what I did:

# Assumes: alias Phoenix.LiveView.AsyncResult
@impl true
def mount(%{"id" => id}, _session, socket) do
  {:ok,
   socket
   |> assign(:foos, AsyncResult.loading())
   |> stream(:foos, [])
   |> start_async(:foos, fn ->
     Foo.list_foos(id)
   end)}
end

@impl true
def handle_async(:foos, {:ok, foos}, socket) do
  {:noreply,
   socket
   |> assign(foos: AsyncResult.ok(:ok))
   |> stream(:foos, foos, reset: true)}
end

@impl true
def render(assigns) do
  ~H"""
  <.async_result :let={_foos} assign={@foos}>
    <:loading>
      <div><%= gettext("Loading...") %></div>
    </:loading>
    <:failed :let={_failure}>
      <div><%= gettext("An error occurred.") %></div>
    </:failed>
    <.table id="foos" rows={@streams.foos}>
      ...
    </.table>
  </.async_result>
  """
end
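One detail the snippet above leaves out: for the `<:failed>` slot to ever render, the exit branch of handle_async needs a clause too. A minimal sketch in the same pattern (assuming the same `:foos` assign and `alias Phoenix.LiveView.AsyncResult`):

```elixir
# Companion clause: mark the AsyncResult as failed when the task
# crashes, so the <:failed> slot in render/1 is shown.
@impl true
def handle_async(:foos, {:exit, reason}, socket) do
  {:noreply,
   assign(socket, :foos, AsyncResult.failed(socket.assigns.foos, {:exit, reason}))}
end
```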

Hello,

I published a hex package, live_stream_async, which provides a stream_async/4 macro that you can use like this:

  use LiveStreamAsync

  def mount(%{"location" => location}, _, socket) do
    {:ok,
     socket
     |> stream_async(:hotels, fn -> Hotels.fetch!(location) end)
    }
  end