Mamba Codestral support?

#118
by Daemontatox - opened

Error: Error converting to fp16:

```
INFO:hf-to-gguf:Loading model: Mamba-Codestral-7B-v0.1
ERROR:hf-to-gguf:Model Mamba2ForCausalLM is not supported
```

The `Mamba2ForCausalLM` architecture is not yet supported by llama.cpp, so its GGUF converter rejects the model.

Please follow PR: https://github.com/ggerganov/llama.cpp/pull/9126
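Until that PR lands, a quick way to see whether a checkpoint will hit this error is to inspect the `architectures` field of its `config.json` before running the converter. A minimal sketch; the `SUPPORTED_ARCHITECTURES` set below is a small illustrative subset, not llama.cpp's actual registry (the real list lives inside `convert_hf_to_gguf.py`):

```python
import json

# Illustrative subset only -- llama.cpp supports many more architectures.
# "Mamba2ForCausalLM" is deliberately absent, matching the error above.
SUPPORTED_ARCHITECTURES = {
    "LlamaForCausalLM",
    "MistralForCausalLM",
    "MambaForCausalLM",
}

def check_convertible(config_path: str) -> bool:
    """Return True if every architecture in config.json is recognized."""
    with open(config_path) as f:
        config = json.load(f)
    architectures = config.get("architectures", [])
    return bool(architectures) and all(
        a in SUPPORTED_ARCHITECTURES for a in architectures
    )
```

For Mamba-Codestral-7B-v0.1, whose `config.json` lists `Mamba2ForCausalLM`, this check returns `False`, mirroring the converter's error.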

Daemontatox changed discussion status to closed
