
KeyError: "beit3_llava"

#4
by nanamma - opened

When I use `AutoModelForCausalLM.from_pretrained()`, I get:

ValueError: The checkpoint you are trying to load has model type `beit3_llava` but Transformers does not recognize this architecture. This could be because of an issue with the checkpoint, or because your version of Transformers is out of date.
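For reference, a minimal sketch of the failing call, assuming the checkpoint has been downloaded locally (the path below is a placeholder):

```python
from transformers import AutoModelForCausalLM

# Fails: the checkpoint's config.json declares model_type "beit3_llava",
# which AutoModelForCausalLM cannot map to any registered architecture.
model = AutoModelForCausalLM.from_pretrained("path/to/beit3_llava-checkpoint")
```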

I've also tried `pip install transformers -U`, but the error persists.

Well, I see now: `beit3_llava` is a custom architecture that is not part of the Transformers library, so upgrading does not help. Importing the model class directly from the accompanying `muffin` package works: `from muffin import Beit3LlavaLlamaForCausalLM`. 🤗
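A minimal sketch of the working approach, assuming the authors' `muffin` package is installed or on your PYTHONPATH; the checkpoint path and dtype choice are placeholders, not confirmed by the thread:

```python
import torch
from muffin import Beit3LlavaLlamaForCausalLM

# Load via the custom model class shipped with the authors' code instead of
# AutoModelForCausalLM, since "beit3_llava" is not registered in Transformers.
model = Beit3LlavaLlamaForCausalLM.from_pretrained(
    "path/to/beit3_llava-checkpoint",  # placeholder local path to the weights
    torch_dtype=torch.float16,         # assumption: half precision for inference
)
```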

nanamma changed discussion status to closed
