Do you have plans to provide a quantized model?

#1
by bingw5 - opened

The 40B model is too large; could you provide a 4-bit quantized version?

OpenGVLab org
•
edited Jul 11

Thanks for your interest. We will offer an AWQ version of the 40B model soon.
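
In the meantime, one possible stopgap (not an official recommendation from the team) is loading the checkpoint in 4-bit on the fly with bitsandbytes through transformers. This is only a sketch: the model ID below is a placeholder for the 40B repository discussed in this thread, and actual memory use depends on your GPU setup.

```python
import torch
from transformers import AutoModel, AutoTokenizer, BitsAndBytesConfig

# Placeholder ID -- substitute the actual 40B repository name.
model_id = "OpenGVLab/<40B-model-repo>"

# NF4 4-bit quantization config (bitsandbytes), computing in bfloat16.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModel.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    trust_remote_code=True,   # the repo ships custom modeling code
    device_map="auto",        # spread layers across available GPUs
)
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
```

On-the-fly NF4 loading trades some accuracy and load time for memory; a published AWQ checkpoint would typically be better calibrated and faster at inference.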

OpenGVLab org
czczup changed discussion status to closed
