In the DockYard article excerpt below, Sean Moriarty explains that Bumblebee uses FFmpeg.
I have seen here and there that you can use the native GPU with FFmpeg on OSX, with something like `h264_videotoolbox`.
Is there a way to compile FFmpeg for OSX or parametrise Bumblebee to ensure that the GPU is used on OSX?
I understand this is the case for Linux and Windows. How should we take this into account when using a Docker image (say, Debian-based)?
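For context, here is how one might inspect an existing FFmpeg build to see whether VideoToolbox support was compiled in — a quick check, assuming `ffmpeg` is on the `PATH` (on a Debian-based Docker image the VideoToolbox entries would be absent, since VideoToolbox is a macOS framework):

```shell
# List the hardware acceleration methods compiled into this FFmpeg build
ffmpeg -hide_banner -hwaccels

# Check whether the VideoToolbox H.264 encoder is present (macOS builds only)
ffmpeg -hide_banner -encoders | grep videotoolbox \
  || echo "videotoolbox encoder not available in this build"
```

Note that `h264_videotoolbox` is an *encoder*; if Bumblebee only uses FFmpeg to decode audio into raw samples, a video encoder may not come into play at all.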
> Thanks to Bumblebee (and Paulo Valente and Jonatan Kłosko), you can use Whisper directly from Elixir. You’ll need to start by installing Bumblebee, Nx, and EXLA, all from the main branch. Additionally, if you don’t want to design an audio-processing pipeline using Membrane or another multimedia framework, you’ll need to install ffmpeg. Bumblebee uses ffmpeg under the hood to process audio files into tensors.
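For reference, installing those three libraries from their main branches might look something like this in a script or Livebook (a sketch — the exact `sparse:` paths assume the current layout of the `elixir-nx` monorepos):

```elixir
Mix.install([
  # Bumblebee from the main branch
  {:bumblebee, github: "elixir-nx/bumblebee"},
  # Nx and EXLA live in the same repository, hence the sparse checkouts
  {:nx, github: "elixir-nx/nx", sparse: "nx", override: true},
  {:exla, github: "elixir-nx/exla", sparse: "exla", override: true}
])
```

The `ffmpeg` binary itself is a system dependency, not a Hex package, so it has to be installed separately (e.g. via Homebrew on macOS or `apt` in a Debian-based image).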