
C4-ProX-1.7B

ArXiv | Models | Data | Code

C4-ProX-1.7B is a small language model. It was trained on the C4-pro corpus (C4 refined by ProX) for 50B tokens.

Evaluations

ProX models are evaluated on 10 language model benchmarks in a zero-shot setting.

|      | ARC-c | ARC-e | CSQA | HellaS | MMLU | OBQA | PiQA | SIQA | WinoG | SciQ | AVG  |
|------|-------|-------|------|--------|------|------|------|------|-------|------|------|
| raw  | 25.3  | 48.8  | 30.1 | 52.4   | 28.8 | 32.2 | 72.0 | 40.6 | 53.6  | 71.7 | 45.5 |
| ours | 31.1  | 56.0  | 28.4 | 55.2   | 31.1 | 36.2 | 74.0 | 41.0 | 54.1  | 76.8 | 48.4 |
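The AVG column is the unweighted mean of the ten benchmark scores in each row. A minimal sketch reproducing it from the table values above (scores copied verbatim; variable names are illustrative):

```python
# Benchmark scores from the table, in column order:
# ARC-c, ARC-e, CSQA, HellaS, MMLU, OBQA, PiQA, SIQA, WinoG, SciQ
raw = [25.3, 48.8, 30.1, 52.4, 28.8, 32.2, 72.0, 40.6, 53.6, 71.7]
ours = [31.1, 56.0, 28.4, 55.2, 31.1, 36.2, 74.0, 41.0, 54.1, 76.8]

# Unweighted mean over the ten benchmarks, as reported in AVG
raw_avg = sum(raw) / len(raw)    # ~45.5
ours_avg = sum(ours) / len(ours)  # ~48.4
print(f"raw: {raw_avg:.1f}, ours: {ours_avg:.1f}")
```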

Citation

@misc{TBD
}