vicuna-13b-v1.5-PL
ExLlamaV2 8 bpw quants of https://huggingface.co/Aspik101/vicuna-13b-v1.5-PL-lora_unload
Model tree for altomek/vicuna-13b-v1.5-PL-8bpw-EXL2
- Base model: Aspik101/vicuna-13b-v1.5-PL-lora_unload
- Finetuned: this model