AI-powered app (with open-source LLMs like Llama) with Elixir, Phoenix, LiveView, and TogetherAI

I wrote a simple tutorial on handling HTTP streams and integrating them well with LiveView.

TLDR:

In the Elixir world, we have two processes: one for the LiveView and another that handles the streaming HTTP call. The LiveView sends the prompt and its pid (process id) to the handler, which in turn spawns a separate process that makes the HTTP call and sends the chunks of LLM output to the LiveView as they arrive. When the last chunk arrives, we notify the LiveView that text generation has finished.
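
The message flow above can be sketched in plain Elixir. This is a minimal, self-contained sketch: `Handler` and `FakeLiveView` are illustrative names, and a hard-coded list of chunks stands in for the real streaming HTTP call to TogetherAI; in the actual app the receiving side would be `handle_info/2` callbacks in a LiveView.

```elixir
defmodule Handler do
  # The LiveView passes the prompt and its own pid; we spawn a
  # separate process that streams chunks back as messages.
  def generate(prompt, lv_pid) do
    spawn(fn ->
      # Stand-in for the streaming HTTP call: each element is a chunk
      # of LLM output, sent to the LiveView as it "arrives".
      for chunk <- ["Hello", " from ", prompt] do
        send(lv_pid, {:chunk, chunk})
      end

      # Last chunk sent: notify the LiveView that generation finished.
      send(lv_pid, :done)
    end)
  end
end

defmodule FakeLiveView do
  # Collect chunks until :done, mimicking handle_info/2 in a LiveView
  # that appends each chunk to the socket assigns.
  def collect(acc \\ []) do
    receive do
      {:chunk, text} -> collect([text | acc])
      :done -> acc |> Enum.reverse() |> Enum.join()
    end
  end
end

Handler.generate("Llama", self())
IO.puts(FakeLiveView.collect())
```

Because the chunks arrive as ordinary process messages, the LiveView stays responsive and can re-render the page after each one.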

More: AI powered app (with LLMs) with Elixir, Phoenix, LiveView, and TogetherAI - DEV Community
