
Miquella 120B

This model has been remade with the fixed dequantization of miqu.

This is a merge of pre-trained language models created using mergekit. It is an attempt at re-creating goliath-120b using the new miqu-1-70b model in place of Xwin.

The merge ratios are the same as goliath's, except that Xwin is swapped for miqu.
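For context, goliath-style merges are passthrough interleaves of two 70B models' layer ranges. The sketch below shows what such a mergekit config generally looks like; the layer ranges and model paths here are illustrative assumptions, not the actual recipe used for this model.

```yaml
# Illustrative mergekit passthrough config (NOT the actual miquella recipe).
# Layer ranges and model names below are assumptions for demonstration only.
slices:
  - sources:
      - model: miqu-1-70b        # replaces Xwin from the goliath recipe
        layer_range: [0, 20]
  - sources:
      - model: second-70b-model  # placeholder for the other merged model
        layer_range: [10, 30]
merge_method: passthrough        # layers are stacked, not averaged
dtype: float16
```

A passthrough merge concatenates overlapping slices of the source models' layers, which is how two 70B models yield a roughly 120B (here 118B) parameter result.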

Models Merged

The following models were included in the merge:

[Image: Miquella the Unalloyed, by @eldrtchmoon]

Model size: 118B params · Tensor type: FP16 (Safetensors)
