ValueError: Unknown quantization type, got exl2 - supported types are: ['awq', 'bitsandbytes_4bit', 'bitsandbytes_8bit', 'gptq', 'aqlm', 'quanto', 'eetq', 'hqq']

#1
by Pierre918 - opened

Hello!
When I run: pipe = pipeline("text-generation", model="blockblockblock/Code-Mistral-7B-bpw3.7")
I get the error in the title. I have also imported: from transformers import pipeline
I am a beginner, so maybe it is just something I didn't install.

Thank you for your help.


Hi, the models are in exl2 format and are meant to be used with Aphrodite or ExLlamaV2.
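For context on why the error appears: transformers reads the `quantization_config` section of the repo's `config.json` to pick a loader, and the `quant_method` written by an exl2 conversion is `exl2`, which is not in transformers' supported list (the list in the error message). A minimal sketch of that check, with the config inlined as a dict rather than read from the real file:

```python
# Sketch of why the pipeline rejects this repo.
# SUPPORTED is copied verbatim from the error message; the config dict
# mimics the "quantization_config" block an exl2 conversion writes.
SUPPORTED = ["awq", "bitsandbytes_4bit", "bitsandbytes_8bit", "gptq",
             "aqlm", "quanto", "eetq", "hqq"]

config = {"quantization_config": {"quant_method": "exl2"}}

quant_method = config["quantization_config"]["quant_method"]
if quant_method not in SUPPORTED:
    # This mirrors the ValueError the pipeline raises.
    print(f"Unknown quantization type, got {quant_method} - "
          f"supported types are: {SUPPORTED}")
```

So nothing is missing from your install; this repo's format simply isn't one transformers can load.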

Hi! Thank you very much for your answer.
So concretely, what do I have to do in my command shell (I'm on Windows) or in the Python file?
I am a complete beginner.

Thank you for your help.

No problem.
In order to use an exl2 quant, you need to use ExLlamaV2.
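Concretely, one possible route is below (a sketch, not a definitive recipe: the repo id comes from your original post, ExLlamaV2 needs a CUDA-capable GPU, and the `examples/chat.py` entry point and its flags are assumptions based on the turboderp/exllamav2 repository, so check its README for the exact invocation):

```shell
# Install ExLlamaV2 (requires an NVIDIA GPU and a matching PyTorch build)
pip install exllamav2

# Download the exl2 weights from the Hub into a local folder
pip install -U "huggingface_hub[cli]"
huggingface-cli download blockblockblock/Code-Mistral-7B-bpw3.7 \
    --local-dir Code-Mistral-7B-bpw3.7

# Run ExLlamaV2's bundled chat example against that folder
# (clone https://github.com/turboderp/exllamav2 first to get examples/chat.py;
# the -mode value is an assumption, see the repo's README)
python examples/chat.py -m Code-Mistral-7B-bpw3.7 -mode llama
```

Alternatively, if you just want the transformers pipeline to work as in your snippet, pick a quant of the same model in one of the supported formats (e.g. a GPTQ or AWQ repo) instead of this exl2 one.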
