How to demux MP4 using the Membrane framework?

I’m playing around with Membrane, trying to stream an MP4 (H264/AAC) file over HLS to my phone. I think the first step is to demux the file into video and audio streams. The code looks like this:

  use Membrane.Pipeline

  def handle_init(path_to_file) do
    children = %{
      source_file: %Membrane.File.Source{location: path_to_file},
      demuxer: Membrane.MPEG.TS.Demuxer,
      video_parser: %Membrane.H264.FFmpeg.Parser{framerate: {24, 1}},
      video_decoder: Membrane.H264.FFmpeg.Decoder,
      audio_decoder: Membrane.AAC.FDK.Decoder,
      audio_converter: %FFmpeg.SWResample.Converter{
        output_caps: %Membrane.Caps.Audio.Raw{channels: 2, format: :s16le, sample_rate: 48_000}
      },
      sink: %Membrane.File.Sink{location: "video_output"},
      portaudio: PortAudio.Sink
    }

    links = [
      link(:source_file) |> to(:demuxer),
      link(:demuxer) |> via_out(Pad.ref(:output, 256)) |> to(:video_parser),
      link(:demuxer) |> via_out(Pad.ref(:output, 257)) |> to(:audio_decoder),
      link(:video_parser) |> to(:video_decoder),
      link(:video_decoder) |> to(:sink),
      link(:audio_decoder) |> to(:audio_converter),
      link(:audio_converter) |> to(:portaudio)
    ]

    spec = %ParentSpec{
      children: children,
      links: links
    }

    {{:ok, spec: spec}, %{}}
  end

However, when I run it, there’s no sound, and video_output is zero bytes.

iex(3)> Pipeline.play(pid)                              
:ok
iex(4)> 
21:01:41.415 [debug] [pipeline@<0.295.0>] Changing playback state from stopped to prepared
 
21:01:41.441 [debug] [pipeline@<0.295.0>] Playback state changed from stopped to prepared
 
21:01:41.441 [debug] [pipeline@<0.295.0>] Changing playback state from prepared to playing

21:01:41.441 [debug] [:video_decoder] Evaluating playback buffer

21:01:41.441 [debug] [:source_file] Evaluating playback buffer

21:01:41.441 [debug] [:video_parser] Evaluating playback buffer

21:01:41.441 [debug] [:audio_converter] Evaluating playback buffer

21:01:41.441 [debug] [:audio_decoder] Evaluating playback buffer
 
21:01:41.441 [debug] [:sink] Evaluating playback buffer

21:01:41.441 [debug] [:demuxer] Evaluating playback buffer
 
21:01:41.488 [debug] [:portaudio] Evaluating playback buffer
 
21:01:41.488 [debug] [pipeline@<0.295.0>] Playback state changed from prepared to playing
 
21:01:41.499 [debug] [:video_parser] Ignoring event %Membrane.Core.Events.EndOfStream{}
 
21:01:41.499 [debug] [:audio_decoder] Ignoring event %Membrane.Core.Events.EndOfStream{}

I need a bit of help here. Am I doing this correctly?

Hi @hlcfan, you’re right, the first step would be to demux the MP4, but you’re using Membrane.MPEG.TS.Demuxer, which demuxes MPEG-TS streams, not MP4. We don’t have an MP4 demuxer in Membrane yet, though it’s on the roadmap.

Thanks @mat-hek for the quick reply. Looking forward to the MP4 demuxer!

After converting the file to TS format, it works, thanks!
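In case it helps anyone else, a stream-copy remux with ffmpeg is one way to do the conversion (a sketch, assuming ffmpeg is on the PATH; the first command only synthesizes a short sample MP4 so the snippet is self-contained, and the file names are placeholders):

```shell
# Synthesize a 1-second H264/AAC sample MP4 (stand-in for a real file).
ffmpeg -y -f lavfi -i testsrc=duration=1:size=128x128:rate=24 \
       -f lavfi -i sine=frequency=440:duration=1 \
       -c:v libx264 -c:a aac input.mp4

# Repackage it into MPEG-TS without re-encoding; ffmpeg applies the
# h264_mp4toannexb bitstream filter automatically for TS output.
ffmpeg -y -i input.mp4 -c copy output.ts
```

Since this is a pure remux (-c copy), it is fast and lossless; only the container changes, not the codecs.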

@mat-hek Do you have an example of how to stream a TS file through HLS? I can’t seem to find one.

I don’t, but once you have H264 and AAC, you should be able to put them into HLS like in this demo.

Hey, I’m doing something similar. Did you ever complete this project?
If so, are you able to share the code with me/us?

I’ve completed the pipeline with the help of @DominikWolek

First, take your MP4 file and demux it into AAC audio and H264 video.
I use an ffmpeg command for this (there might be an ffmpeg Elixir wrapper out there that does this more safely; calling System.cmd directly doesn’t feel great).
For the file name I just take the basename of the original MP4.

    basename = Path.basename(mp4_path, ".mp4")

    System.cmd("ffmpeg", [
      "-i", mp4_path,
      "-an", "-vcodec", "libx264", "videos/demuxed/#{basename}_video.h264",
      "-vn", "-acodec", "aac", "videos/demuxed/#{basename}_audio.aac"
    ])

This should split your MP4 into two files: an .aac audio file and its corresponding .h264 video file.
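On the "calling System.cmd isn’t nice" point: a small improvement is to at least check the exit status that System.cmd/3 returns, so a failed ffmpeg run doesn’t go unnoticed. A sketch (the Demux module and split function are made-up names, and ffmpeg is assumed to be on the PATH):

```elixir
defmodule Demux do
  # Hypothetical helper: wraps the ffmpeg call above and inspects
  # the exit status instead of silently carrying on with missing files.
  def split(mp4_path, out_dir \\ "videos/demuxed") do
    basename = Path.basename(mp4_path, ".mp4")
    video = Path.join(out_dir, "#{basename}_video.h264")
    audio = Path.join(out_dir, "#{basename}_audio.aac")

    args = [
      "-i", mp4_path,
      "-an", "-vcodec", "libx264", video,
      "-vn", "-acodec", "aac", audio
    ]

    # System.cmd/3 returns {output, exit_status}; non-zero means failure.
    case System.cmd("ffmpeg", args, stderr_to_stdout: true) do
      {_output, 0} -> {:ok, %{video: video, audio: audio}}
      {output, status} -> {:error, {status, output}}
    end
  end
end
```

Then Demux.split("videos/input.mp4") returns either {:ok, %{video: path, audio: path}} or an {:error, ...} tuple you can match on.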
Then in your handle_init function you can create your SinkBin like so:

    sink_bin = %Membrane.HTTPAdaptiveStream.SinkBin{
      muxer_segment_duration: 2 |> Membrane.Time.seconds(),
      manifest_module: Membrane.HTTPAdaptiveStream.HLS,
      target_window_duration: :infinity,
      target_segment_duration: 2 |> Membrane.Time.seconds(),
      persist?: false,
      storage: %Membrane.HTTPAdaptiveStream.Storages.FileStorage{
        directory: @output_dir
      }
    }

@output_dir is the path where you want the HLS files saved for streaming.
Once you have your sink, you need to create your sources and filters. A lot of this next bit of code can be found here, inside the sink bin integration test on membrane_http_adaptive_stream_plugin’s GitHub.

    children =
      @audio_video_tracks_sources
      |> Enum.flat_map(fn {source, encoding, track_name} ->
        parser =
          case encoding do
            :H264 ->
              %Membrane.H264.FFmpeg.Parser{
                framerate: {25, 1},
                alignment: :au,
                attach_nalus?: true,
                skip_until_parameters?: false
              }

            :AAC ->
              %Membrane.AAC.Parser{
                out_encapsulation: :none
              }
          end

        [
          {{:source, track_name}, %Membrane.File.Source{location: source}},
          {{:parser, track_name}, parser}
        ]
      end)
      |> then(&[{:sink_bin, sink_bin} | &1])

    links =
      @audio_video_tracks_sources
      |> Enum.map(fn {_source, encoding, track_name} ->
        link({:source, track_name})
        |> to({:parser, track_name})
        |> via_in(Pad.ref(:input, track_name),
          options: [encoding: encoding, track_name: track_name]
        )
        |> to(:sink_bin)
      end)

     {{:ok, spec: %ParentSpec{children: children, links: links}, playback: :playing}, %{}}

@audio_video_tracks_sources holds our sources in this format:

    @audio_video_tracks_sources [
      {"videos/demuxed/#{@basename}_audio.aac", :AAC, :audio},
      {"videos/demuxed/#{@basename}_video.h264", :H264, :video}
    ]

But it can be whatever you want, as long as each entry is a tuple of {path, codec, type}.

Once you’ve got all that, you can start your program with

    iex -S mix

then start the pipeline:

    {:ok, pid} = Module.start_link()

Your program will generate the HLS files into the @output_dir that you specified.
You can check that they’re correct by serving them from a simple Python server:

    python3 -m http.server 8000

and playing the stream with ffplay:

    ffplay http://localhost:8000/FILENAME

with FILENAME being output_dir/index.m3u8.

Feel free to ask any questions or if I’ve left something out!

Updated to include -profile:v baseline:

    System.cmd("ffmpeg", [
      "-i", mp4_path,
      "-an", "-profile:v", "baseline", "-vcodec", "libx264",
      "videos/demuxed/#{basename}_video.h264",
      "-vn", "-acodec", "aac", "videos/demuxed/#{basename}_audio.aac"
    ])