KeyError: 'nllb_moe'

#1
by MR-7 - opened

So I get this error when trying to load the model: KeyError: 'nllb_moe'. It seems that config_dict["model_type"] is set to 'nllb_moe', but when the library looks up CONFIG_MAPPING[config_dict["model_type"]], the key doesn't exist. I'm using transformers==4.27.2 (the latest release) and the Auto classes to load the model, as the model card suggests:

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("facebook/nllb-moe-54b")
model = AutoModelForSeq2SeqLM.from_pretrained("facebook/nllb-moe-54b")

Am I missing something?
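
For reference, a quick diagnostic sketch (assuming the failing lookup is the CONFIG_MAPPING access mentioned above): it checks whether the installed transformers release registers "nllb_moe" at all, which would show whether the version is simply too old for this architecture.

import transformers
from transformers.models.auto.configuration_auto import CONFIG_MAPPING

print(transformers.__version__)
# False on releases that don't ship NLLB-MoE support, which is what makes
# CONFIG_MAPPING[config_dict["model_type"]] raise KeyError: 'nllb_moe'
print("nllb_moe" in CONFIG_MAPPING.keys())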

You are right! Fixing this ASAP.

This should be fixed by #2! (I tried it at home and it works.)
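
A lightweight way to check this on your side without downloading the full 54B checkpoint (a sketch, assuming the config alone exercises the same model_type lookup that raised the error):

from transformers import AutoConfig

# Resolving the config goes through the same CONFIG_MAPPING[model_type]
# lookup, so if this succeeds the full load should get past the KeyError too.
config = AutoConfig.from_pretrained("facebook/nllb-moe-54b")
print(config.model_type)  # expected: "nllb_moe"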

ArthurZ changed discussion status to closed

It doesn't work:

raise KeyError(key)

KeyError: 'nllb-moe'

Where does this directory come from?
"_name_or_path": "/home/arthur_huggingface_co/fairseq/weights/checkpoints/hf-converted-moe-54b",

Please tell me why it doesn't work.
