Bumblebee and Facebook M2M100 - getting (RuntimeError) could not match the class name "M2M100ForConditionalGeneration"

Hi guys,

I tried to load Facebook's M2M100 model from Hugging Face for text translation:

Bumblebee.load_model({:hf, "facebook/m2m100_418M"})

But I got the following error; any help is welcome :slight_smile:

** (RuntimeError) could not match the class name "M2M100ForConditionalGeneration" to any of the supported models, please specify the :module and :architecture options
    (bumblebee 0.1.2) lib/bumblebee.ex:262: Bumblebee.load_spec/2
    (bumblebee 0.1.2) lib/bumblebee.ex:372: Bumblebee.load_model/2
    iex:3: (file)

Hey @regex.sh :wave:
I don’t know much about Bumblebee, but I think you are getting that error because the model "facebook/m2m100_418M" is not implemented in Bumblebee yet.
You can find all the models and tokenizers implemented in the docs (see the “Models” section in the navigation menu on the left).

Besides that, I recently read a blog post that might be interesting for you as well; it has a section about “Machine Translation”.

In particular, this is the code snippet:

model_name = "facebook/mbart-large-en-ro"

{:ok, model} = Bumblebee.load_model({:hf, model_name},
  module: Bumblebee.Text.Mbart,
  architecture: :for_conditional_generation
)
{:ok, tokenizer} = Bumblebee.load_tokenizer({:hf, model_name})

article = """
Elixir is a dynamic, functional language for building scalable and maintainable applications.
"""

serving = Bumblebee.Text.Generation.generation(model, tokenizer,
  max_new_tokens: 20,
  forced_bos_token_id: 250041
)
Nx.Serving.run(serving, article)

:point_up: If you run the snippet, it should return the sentence “Elixir is a dynamic, …” translated to Romanian.

Maybe this can be a good starting point: you will need to replace model_name with the one you are trying to use, "facebook/m2m100_418M", and set the :module option in the load_tokenizer call too.
But I fear this won’t be enough, because if I understood correctly the MBART tokenizer implemented in Bumblebee does not accept some parameters (e.g. src_lang) that might be needed for the M2M100 model. I might be wrong, though, so please take everything I’m writing with a grain of salt :grimacing:
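For what it’s worth, the adaptation I had in mind would look roughly like this. Keep in mind that Bumblebee.Text.M2m100 and Bumblebee.Text.M2m100Tokenizer are hypothetical module names (M2M100 is not implemented in Bumblebee 0.1.2), so this sketch will not run as-is; it only shows where the :module and :architecture options would go:

```elixir
model_name = "facebook/m2m100_418M"

# NOTE: Bumblebee.Text.M2m100 is a hypothetical module name; Bumblebee 0.1.2
# does not ship an M2M100 implementation, so this will still fail today.
{:ok, model} = Bumblebee.load_model({:hf, model_name},
  module: Bumblebee.Text.M2m100,
  architecture: :for_conditional_generation
)

# The :module option would likewise be needed for the tokenizer, since the
# class name cannot be matched automatically for an unsupported model.
{:ok, tokenizer} = Bumblebee.load_tokenizer({:hf, model_name},
  module: Bumblebee.Text.M2m100Tokenizer
)
```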

To conclude, in this PR the model M2M100 is listed as a variation of Bart and MBart.

Good luck :crossed_fingers:


Hey @regex.sh

I just realized that there is an open PR in Bumblebee to add the M2M100 model.

I don’t know what its state is, though, but I guess it is worth a look.

Cheers :v:


Hi @NickGnd, thank you for the extensive response! :heart:
I will give it a shot and write an update here.


Okay, update as promised

I didn’t get m2m100 to work (there is no fast tokenizer available for it), but I tried t5-base and it worked! (I wanted to use AI to translate text.)
Anyhow, in the end I switched to Google’s Cloud Translation :smiley:
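In case anyone lands here later, here is roughly what the working t5-base setup looked like, following the same pattern as the MBART snippet above. The task prefix and the max_new_tokens value are my own choices for illustration, not something prescribed by Bumblebee, so adjust as needed:

```elixir
model_name = "t5-base"

# t5-base is supported by Bumblebee, so no :module/:architecture
# overrides are needed here.
{:ok, model} = Bumblebee.load_model({:hf, model_name})
{:ok, tokenizer} = Bumblebee.load_tokenizer({:hf, model_name})

serving = Bumblebee.Text.Generation.generation(model, tokenizer,
  max_new_tokens: 40
)

# T5 is a text-to-text model: the target language is selected with a
# task prefix in the input, rather than a forced BOS token as with MBART.
input = "translate English to German: Elixir is a dynamic, functional language."
Nx.Serving.run(serving, input)
```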
