---
license: llama2
datasets:
- gair-prox/open-web-math-pro
language:
- en
base_model:
- codellama/CodeLlama-7b-hf
---



# CodeLlama-7B-ProXMath

<p align="center">
  <img src="prox-teaser.png">
</p>

[ArXiv](http://arxiv.org/abs/xxxx) | [Data: OpenWebMath-Pro](https://huggingface.co/datasets/gair-prox/open-web-math-pro) | [Code](https://github.com/GAIR-NLP/program-every-example)

**CodeLlama-7B-ProXMath** is a math-adapted language model, continually pre-trained from [CodeLlama-7b](https://huggingface.co/codellama/CodeLlama-7b-hf) on [OpenWebMath-Pro](https://huggingface.co/datasets/gair-prox/open-web-math-pro) (a version of OpenWebMath refined by ProX) for **10B** tokens.
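The model can be used like any other causal language model in `transformers`. The sketch below is illustrative: the repo id `gair-prox/CodeLlama-7B-ProXMath` and the `build_prompt` format are assumptions (the card does not specify either), so check the model hub page before use.

```python
def build_prompt(problem: str) -> str:
    # Hypothetical plain Q/A prompt format; the card does not prescribe one.
    return f"Question: {problem}\nAnswer:"


def generate_answer(problem: str,
                    model_id: str = "gair-prox/CodeLlama-7B-ProXMath",  # assumed repo id
                    max_new_tokens: int = 64) -> str:
    # Imports kept inside the function so the sketch can be read without
    # downloading the 7B checkpoint.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id, torch_dtype=torch.bfloat16, device_map="auto"
    )
    inputs = tokenizer(build_prompt(problem), return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```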

## Evaluations

ProX models are evaluated on 9 common math reasoning benchmarks.

| Model                 |   ASDiv  |   GSM8K  |  MathQA  |   MAWPS  | Minerva MATH | MMLU-STEM | SAT-Math |   SVAMP  |  TabMWP  |  Average |
|-----------------------|:--------:|:--------:|:--------:|:--------:|:------------:|:---------:|:--------:|:--------:|:--------:|:--------:|
| CodeLlama-7B          |   50.7   |   11.8   |   14.3   |   62.6   |      5.0     |    20.4   |   21.9   |   44.2   |   30.6   |   29.1   |
| CodeLlama-7B-ProXMath | **67.9** | **35.6** | **38.9** | **82.7** |   **17.6**   |  **42.6** | **62.5** | **55.8** | **41.3** | **49.4** |
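The last column is the unweighted mean of the nine per-benchmark scores, which can be verified directly:

```python
# Per-benchmark scores copied from the table above, in column order.
scores = {
    "CodeLlama-7B":          [50.7, 11.8, 14.3, 62.6,  5.0, 20.4, 21.9, 44.2, 30.6],
    "CodeLlama-7B-ProXMath": [67.9, 35.6, 38.9, 82.7, 17.6, 42.6, 62.5, 55.8, 41.3],
}

# Unweighted mean, rounded to one decimal as in the table.
averages = {name: round(sum(s) / len(s), 1) for name, s in scores.items()}
print(averages)  # {'CodeLlama-7B': 29.1, 'CodeLlama-7B-ProXMath': 49.4}
```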

### Citation
```
@misc{TBD
}
```