Use FFmpeg GPU acceleration with Bumblebee

In the DockYard article excerpt below, Sean Moriarty explains that Bumblebee uses FFmpeg.

I've seen in various places that FFmpeg can use the native GPU on macOS via hardware-accelerated codecs such as h264_videotoolbox.
Is there a way to compile FFmpeg for macOS, or to parametrise Bumblebee, so as to ensure the GPU is used on macOS?
I understand this is possible on Linux and Windows. How should we take this into account when using a Docker image (say, Debian-based)?
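For reference, here is a sketch of the kind of hardware-accelerated commands in question. The encoder names (h264_videotoolbox, h264_vaapi) are real FFmpeg options, but whether they are available depends on how your FFmpeg binary was built and on the hardware present, so these are illustrative commands rather than guaranteed-to-work invocations:

```shell
# macOS: VideoToolbox hardware encoder.
# Check availability with: ffmpeg -encoders | grep videotoolbox
mac_cmd="ffmpeg -i input.mov -c:v h264_videotoolbox -b:v 4M output.mp4"

# A Debian-based Docker image has no VideoToolbox; a GPU path there would
# typically go through VAAPI (Intel/AMD) or NVENC (NVIDIA) instead, e.g.:
vaapi_cmd="ffmpeg -vaapi_device /dev/dri/renderD128 -i input.mov -vf format=nv12,hwupload -c:v h264_vaapi output.mp4"

echo "$mac_cmd"
echo "$vaapi_cmd"
```

In a container, the host GPU device (e.g. /dev/dri) must also be passed through to the container for any of the Linux hardware paths to work.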

Thanks to Bumblebee (and Paulo Valente and Jonatan Kłosko), you can use Whisper directly from Elixir. You’ll need to start by installing Bumblebee, Nx, and EXLA all from the main branch. Additionally, if you don’t want to design an audio-processing pipeline using Membrane or another multimedia framework, you’ll need to install ffmpeg. Bumblebee uses ffmpeg under the hood to process audio files into tensors.
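For context, the ffmpeg preprocessing described above amounts to decoding the audio into the format Whisper expects: mono, 16 kHz, 32-bit float PCM. A sketch of that kind of invocation (the exact flags are an assumption, not necessarily Bumblebee's literal command):

```shell
# Sketch (assumed flags): decode any audio file into raw mono 16 kHz
# 32-bit little-endian float samples on stdout, ready to be read into a tensor.
sampling_rate=16000
decode_cmd="ffmpeg -i input.mp3 -ac 1 -ar ${sampling_rate} -f f32le -hide_banner -loglevel quiet pipe:1"
echo "$decode_cmd"
```

Note that this step is audio decoding and resampling, which FFmpeg performs on the CPU; hardware codecs such as h264_videotoolbox accelerate video encoding/decoding and wouldn't be exercised here.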


You can get the ffmpeg call from here, customize it or replace it with anything else, and pass the resulting tensor to the serving.
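As a hypothetical example of such a customization, you could insert an audio filter (here loudnorm, for loudness normalization) in front of the usual mono/16 kHz/f32le conversion; the raw floats on stdout can then be turned into a tensor and passed to the serving:

```shell
# Hypothetical customized call: normalize loudness before converting to the
# raw sample format the serving expects (mono, 16 kHz, 32-bit float PCM).
custom_cmd="ffmpeg -i speech.wav -af loudnorm -ac 1 -ar 16000 -f f32le pipe:1"
echo "$custom_cmd"
```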


Thanks @jonatanklosko.
Is it worth considering compiling your own FFmpeg?

It would be nice to have a package with NIF bindings to ffmpeg and precompiled binaries, so that we don’t rely on the global installation, if that’s what you mean. I don’t think that exists currently, and it’s not a priority : )