Video Streaming with Elixir (wiki)

FFMPEG can be used to transcode the video.

On Demand

When a user uploads a video, the app renames and copies the file to a path, then calls FFMPEG to transcode it. Assuming we are using HLS, we can use a script like this to transcode the video to multiple resolutions and create a playlist file and segment sequences:
https://github.com/shavit/Diana/blob/master/bin/create_sequence

When a user requests a video, the server responds with an m3u8 playlist file, which contains a list of .ts segment files. The sequence files were created by FFMPEG.
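For reference, a generated media playlist looks roughly like this (segment names and durations here are illustrative, not taken from the script):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
320x180_000.ts
#EXTINF:10.0,
320x180_001.ts
#EXT-X-ENDLIST
```

The player fetches this file first, then requests each .ts segment listed in it.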

Example:

get "/video/:slug" do
  ext = slug
        |> String.split(".")
        |> List.last()

  # Pick the file and its MIME type; .ts segments are MPEG-TS,
  # while .m3u8 files are HLS playlists.
  {video_file, content_type} = cond do
    ext == "m3u8" -> {"tmp/ts/320x180.m3u8", "application/vnd.apple.mpegurl"}
    ext == "ts" -> {"tmp/ts/#{slug}", "video/mp2t"}
    # Default playlist file
    true -> {"tmp/ts/320x180.m3u8", "application/vnd.apple.mpegurl"}
  end

  file_path = Path.join(File.cwd!(), video_file)
  offset = get_offset(conn.req_headers)
  size = get_file_size(file_path)

  conn
  |> put_resp_content_type(content_type)
  |> put_resp_header("accept-ranges", "bytes")
  |> put_resp_header("content-length", "#{size}")
  |> put_resp_header("content-range", "bytes #{offset}-#{size - 1}/#{size}")
  |> send_file(206, file_path, offset, size - offset)
end

The Content-Range header gives the video player the ability to seek within the video.
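The get_offset/1 and get_file_size/1 helpers are not shown above. A minimal sketch of what they might look like (the module name and the Range-header parsing are assumptions, covering only the simple `bytes=start-` form):

```elixir
defmodule RangeHelpers do
  # Hypothetical helper: extract the byte offset from a Range header,
  # e.g. [{"range", "bytes=100-"}] -> 100. Defaults to 0 when absent.
  def get_offset(headers) do
    case List.keyfind(headers, "range", 0) do
      {"range", "bytes=" <> rest} ->
        rest |> String.split("-") |> hd() |> String.to_integer()

      _ ->
        0
    end
  end

  # Hypothetical helper: file size in bytes via File.stat!/1.
  def get_file_size(path) do
    %{size: size} = File.stat!(path)
    size
  end
end
```

A production version would also validate the range against the file size and handle `bytes=start-end` suffixes.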

Live video

When a client broadcasts a stream, the server should transcode the video on the fly into multiple resolutions and file types. This can be done with FFMPEG too.

To accept the live video, the app can listen for UDP packets in a separate process:

{:ok, _socket} = :gen_udp.open(3001, [:binary, {:active, true}])

When a client requests a live stream, the response should block and wait for data. I started to write named pipes, but Elixir couldn't read the fifos. It seems that it should be done using agents.
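One way to get that blocking behaviour without fifos is to park the caller inside a process until data arrives. A rough GenServer sketch (all names here are made up, not taken from the example app):

```elixir
defmodule Streaming.Waiter do
  use GenServer

  def start_link(_opts \\ []) do
    GenServer.start_link(__MODULE__, %{chunks: [], waiting: []}, name: __MODULE__)
  end

  def init(state), do: {:ok, state}

  # Blocks the caller until a chunk is available.
  def await(timeout \\ 5_000) do
    GenServer.call(__MODULE__, :await, timeout)
  end

  # Called by the ingest side when new data arrives.
  def push(chunk) do
    GenServer.cast(__MODULE__, {:push, chunk})
  end

  def handle_call(:await, from, %{chunks: [], waiting: waiting} = state) do
    # No data yet: park the caller instead of replying.
    {:noreply, %{state | waiting: [from | waiting]}}
  end

  def handle_call(:await, _from, %{chunks: [chunk | rest]} = state) do
    {:reply, chunk, %{state | chunks: rest}}
  end

  def handle_cast({:push, chunk}, %{waiting: [from | rest]} = state) do
    # Someone is blocked waiting: hand the chunk straight over.
    GenServer.reply(from, chunk)
    {:noreply, %{state | waiting: rest}}
  end

  def handle_cast({:push, chunk}, %{chunks: chunks} = state) do
    {:noreply, %{state | chunks: chunks ++ [chunk]}}
  end
end
```

A web handler could then call `Streaming.Waiter.await/1` in its response loop, blocking until the encoder pushes data.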

Elixir Media Libs

Open-source Elixir libraries for working with media (currently focused on RTMP). (Dedicated thread is here.)

Github: https://github.com/KallDrexx/elixir-media-libs

Hex Packages:

  • amf0 - Library with functions for serializing and deserializing data in the AMF0 encoding format. It’s not 100% complete on the spec but it has the core types implemented.
  • rtmp_handshake - Library that allows systems to perform an RTMP handshake, both as client and a server. It supports both simple and digest handshake formats.
  • rtmp_session - An abstraction that represents a single peer in an RTMP connection. This is essentially the core brain of an RTMP peer: you give it TCP packets and it spits out events and TCP responses. Right now it is only programmed to work as a server, but the client portion is on my todo list.
  • gen_rtmp_server - Behaviour that makes it easy to create your own RTMP server and provides callbacks for your own custom application logic to respond to RTMP events.

That’s really helpful, thanks!

Shall we split your post into a new thread and make it a wiki? (Can add some other info to it as well then - such as other options.) Pretty sure a lot of people will find it helpful in future!


Yes, I hope to finish a simple working example soon and publish it.


…done - I'll look forward to seeing your example!

Could we shift the whole FFMPEG task to Elixir through this? https://github.com/talklittle/ffmpex


On a semi-related note, I am 90% ready to release my RTMP abstraction libraries, a GenRtmpServer abstraction, and a (very) simple RTMP server implementation, all in Elixir.

Unfortunately, playback video is artifacting which is the only hold up, and I’m trying to determine where the issue lies (I don’t believe it’s Elixir related but instead an issue with … well I’m still trying to figure it out). Once I can get playback working properly I plan to split the umbrella project out, put it on GitHub, and create hex packages for the different pieces.


Ooo, awesome! Looking forward. :)


I tried to use the Elixir FFMPEG library, but it seems that it doesn’t support pipes at the moment.

While it is still in development, I came up with the following script to write an incoming live HLS stream to disk.

#!/bin/bash

# BASH_SOURCE is a bash feature, so the shebang must be bash, not sh.
DIR=$(dirname "${BASH_SOURCE[0]}")
TMP_DIR="${DIR}/../tmp/pool"
OUTPUT_FILE="${TMP_DIR}/live_%00d.ts"
OUTPUT_PLAYLIST="${TMP_DIR}/live.m3u8"

ffmpeg \
  -i pipe:0 \
  -c:a copy \
  -c:v libx264 \
  -bufsize 1024k \
  -strict experimental \
  -map 0 \
  -s 320x240 \
  -f ssegment \
  -segment_list $OUTPUT_PLAYLIST -segment_time 10 \
  -flags +global_header \
  -segment_list_size 10 -segment_list_flags +live \
  -segment_list_type hls \
  $OUTPUT_FILE

Receiving video stream

In order to receive video from a source, we can listen for UDP packets using a GenServer:

defmodule Streaming.Incoming do
  use GenServer

  def start_link(opts \\ []) do
    GenServer.start_link(__MODULE__, :ok, opts)
  end

  def init(:ok) do
    incoming_port = Application.get_env(:streaming, :incoming_port)

    {:ok, _socket} = :gen_udp.open(incoming_port, [:binary,
      {:active, true}, {:buffer, 1024}])
  end

  def handle_info({:udp, _socket, _ip, port, data}, state) do
    IO.puts "---> Received #{byte_size(data)} bytes from #{port}"

    # Write to a bucket
    Streaming.Bucket.add data

    # Or write to file
    Streaming.Encoder.write data

    {:noreply, state}
  end

  def handle_info({_, _socket}, state) do
    {:noreply, state}
  end
end

The Bucket module is the same as the key-value store module on the Elixir website. The Encoder is a module that writes to the FFMPEG script above. This is the part where the data should be written to multiple processes, one per output resolution.
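A minimal version of that Bucket, sketched as an Agent (the module name is assumed to match the `Streaming.Bucket.add` call above). Chunks are prepended for O(1) writes, which is why reads reverse the list:

```elixir
defmodule Streaming.Bucket do
  use Agent

  def start_link(_opts \\ []) do
    Agent.start_link(fn -> [] end, name: __MODULE__)
  end

  # Prepend the chunk; newest data ends up at the head of the list.
  def add(chunk) do
    Agent.update(__MODULE__, fn chunks -> [chunk | chunks] end)
  end

  # Reverse on read so chunks come back in arrival (playback) order.
  def all do
    Agent.get(__MODULE__, &Enum.reverse/1)
  end
end
```

A real implementation would also cap the stored chunks, since a live stream would otherwise grow the list without bound.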

As written, this incoming module stores every stream in the same file with the same configuration, so you will need to change that too.
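One hypothetical way to avoid that collision is to derive the output paths from a stream id, so each stream gets its own playlist and segment pattern (module name and directory layout are made up for illustration):

```elixir
defmodule Streaming.Paths do
  # Hypothetical helper: each stream id gets its own directory under
  # the pool, so concurrent streams do not overwrite each other.
  def for_stream(stream_id, base \\ "tmp/pool") do
    %{
      playlist: Path.join(base, "#{stream_id}/live.m3u8"),
      segments: Path.join(base, "#{stream_id}/live_%00d.ts")
    }
  end
end
```

The FFMPEG script would then receive these paths as arguments instead of hard-coding them.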

Start the process when the app or the stream starts:

def init(_state) do
  # Open a port for the external process
  #   and wait for stdin.
  port = Port.open({:spawn, "bin/read_hls"}, [:binary])

  {:ok, port}
end

Then write to it from Streaming.Incoming:

def encode(data) do
  GenServer.cast(:encoder, {:encode, data})
end

def handle_cast({:encode, data}, port) do
  Port.command(port, data)

  {:noreply, port}
end

Streaming the live data

You can write the response from the FFMPEG process back to the key-value store and stream it to clients. Keep in mind that, unlike streaming files, this will take a lot of memory. The chunks are also stored last-in-first-out, so you will need to reverse the list.

Another option, which I use in the example, is to stream the files that FFMPEG created. When a client requests the live stream, serve the playlist.m3u8 file; it will contain a list of all the *.ts video files.

Testing the stream

Run this command to stream to the server:

    ffmpeg -y \
      -i $MEDIA_FILE \
      -c:a aac -ac 2 -ar 44100 \
      -c:v libx264 \
      -movflags frag_keyframe+empty_moov \
      -strict experimental \
      -crf 18 \
      -b:a 64k -b:v 64k \
      -pix_fmt yuv420p -profile:v baseline -level 1.3 -maxrate 192K -bufsize 1024k \
      -f mpegts -hls_time 9 -hls_list_size 0 \
      -s 1280x720 \
      -threads 12 \
      udp://127.0.0.1:3001

You should see the playlist under the ./tmp folder.


Anybody have experience or know if Amazon’s Elastic Transcoder is an option that works and is cheaper than running one’s own instance on EC2 with FFmpeg?


It seems like it wouldn't be cheaper. With nginx-rtmp (which executes ffmpeg) I can stream all day and pay just the EC2 price; with Elastic Transcoder I would have to pay an additional $0.015 × 60 × 24 = $21.60 per day.


Also worth looking at this New York Times project, which abstracts the transcoding service, allowing you to switch providers in the future and not be locked in:

It looks very very well written…


Our Flussonic ships with open-source bindings to FFMPEG. Feel free to use them.


Membrane is out; you can keep an eye on it.

Thread: Membrane - a new framework for multimedia processing
