Feature idea: measure and expose socket latency

(re-post of: Feature idea: measure and expose socket latency · Issue #1890 · phoenixframework/phoenix_live_view · GitHub)

This is related to the following PR: buunguyen/topbar#15

Would LiveView developers be open to the idea of measuring the user’s latency, which could then be used to drive loaders of any kind? With a pure timer, you still have a slice of users who get a “negative experience”; with a 500ms cut-off, the most obvious slice is the users with a 500–700ms latency. Latency-based loaders are better in this sense, since they use an actual signal to provide feedback to the user. For a user with a latency of 550ms, you can show the loader for the full 550ms, instead of for just 50ms. Considering that the LiveView socket stays open, it provides an amazing opportunity to give such feedback, with minimal drawbacks.

There is of course variance involved, but a decent implementation can account for it. It would also relieve developers of having to measure latency manually (which I do in my projects).


Check the LiveBeats source. Latency detection is very easy to add yourself. I thought about adding it to LiveView, but you’ll often want to access the latency calculation on the server as well (as we do in LiveBeats), so it’s better left to user land in my opinion.
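The kind of client-side bookkeeping described here could be sketched roughly as follows. This is illustrative, not the LiveBeats code: the `LatencyTracker` class and the hook wiring are made-up names, and it assumes a server-side `handle_event("ping", ...)` that replies immediately.

```javascript
// Illustrative sketch: a small rolling-average latency tracker.
// Keeps the last `windowSize` round-trip samples and averages them.
class LatencyTracker {
  constructor(windowSize = 5) {
    this.windowSize = windowSize;
    this.samples = [];
  }

  record(ms) {
    this.samples.push(ms);
    // Drop the oldest sample once the window is full.
    if (this.samples.length > this.windowSize) this.samples.shift();
  }

  average() {
    if (this.samples.length === 0) return null;
    return this.samples.reduce((a, b) => a + b, 0) / this.samples.length;
  }
}

// Hypothetical LiveView hook feeding the tracker (names illustrative;
// pushEvent's reply callback is used to time the round trip):
//
// const LatencyHook = {
//   mounted() {
//     this.timer = setInterval(() => {
//       const t0 = performance.now();
//       this.pushEvent("ping", {}, () => {
//         window.latencyTracker.record(performance.now() - t0);
//       });
//     }, 5000);
//   },
//   destroyed() { clearInterval(this.timer); }
// };
```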


Thanks. That’s how I’ve done it so far :slight_smile:. It’s quite rudimentary but works; with LiveView, though, one could even predict page load time ahead of time with a slightly smarter approach. E.g. if you can predict a page load will take ~300ms, you can show a loader immediately, without delaying it. Delaying the loader, in my view, effectively just pushes the problem further back and only improves the experience for a subset of users. Measuring latency with various payloads in LiveView can give you a very good signal for how long an action might take.
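As a concrete sketch of that decision, a loader delay could be derived from the measured latency instead of being fixed. This helper is hypothetical (the function name, threshold, and fallback values are assumptions, not any existing LiveView API):

```javascript
// Hypothetical helper: pick a loader delay from a predicted/measured
// round-trip time. If the action is predicted to be noticeably slow,
// show the loader immediately (delay 0); otherwise keep the classic
// fixed delay. Falls back to the fixed delay when no measurement
// exists yet.
function loaderDelay(predictedMs, threshold = 200, fallbackDelay = 500) {
  if (predictedMs == null) return fallbackDelay; // no signal yet
  return predictedMs >= threshold ? 0 : fallbackDelay;
}
```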

Of course, alternatively there could be some sort of plugin for this, though I’m not sure in which form. Just food for thought, as I happened to spot the PR.


Right, but you can do all this with the linked code above :slight_smile: I agree you can do all kinds of interesting things with average latency calculations over a previous window. You could even use this for buffer/quality calculations for media streaming. To handle the prediction use case you mention, all you need to do is store the average latency in a reachable JavaScript object, then reference that in your progress bar code to determine whether or not to display a loader. I’ve been meaning to write a blog post on this, but the LiveBeats code gets you 95% of the way there in about a dozen lines of code total.
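The "reachable JavaScript object" wiring described above might look like this. The names (`latencyStats`, `recordLatency`) and the 200ms threshold are made up for the example, and the commented `topbar.show(delay)` call assumes a topbar version whose `show` accepts a delay argument, as in recent Phoenix `app.js` defaults:

```javascript
// Illustrative sketch: keep the running average somewhere globally
// reachable so progress-bar code can consult it. In the browser this
// would typically hang off `window`; `globalThis` works in both
// browser and Node.
globalThis.latencyStats = { samples: [], average: null };

function recordLatency(ms, maxSamples = 10) {
  const s = globalThis.latencyStats.samples;
  s.push(ms);
  if (s.length > maxSamples) s.shift(); // bounded sample window
  globalThis.latencyStats.average =
    s.reduce((a, b) => a + b, 0) / s.length;
}

// Then the usual topbar wiring in app.js could take the measurement
// into account instead of using a fixed delay:
//
// window.addEventListener("phx:page-loading-start", () => {
//   const avg = globalThis.latencyStats.average;
//   topbar.show(avg != null && avg > 200 ? 0 : 500);
// });
```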