---
license: apache-2.0
language:
- en
pipeline_tag: text-generation
datasets:
- appvoid/no-prompt-15k
---
![palmer](https://huggingface.co/appvoid/palmer-003-turbo/resolve/main/image.png)
# palmer
### a better base model 
This model will be continuously improved over time. Releases are named using the palmer-003-turbo-yearmonth format.

### evaluation 🧪
Note that this is a zero-shot setting, as opposed to the Open LLM Leaderboard's few-shot evals.
```
Model            | ARC_C  | HellaSwag | PIQA   | Winogrande | Average
palmer-001       | 0.2807 | 0.5524    | 0.7106 | 0.5896     | 0.5333
palmer-003-turbo | 0.3106 | 0.5806    | 0.7247 | 0.5951     | 0.5527
p-003-turbo-2401 | 0.3114 | 0.5805    | 0.7258 | 0.5959     | 0.5534 (this)
palmer-002       | 0.3242 | 0.5956    | 0.7345 | 0.5888     | 0.5607
```
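
The Average column is the arithmetic mean of the four benchmark scores. As a quick sanity check, here is that calculation for the p-003-turbo-2401 row, with the values copied from the table above:

```python
# Recompute the reported Average as the mean of the four zero-shot scores
# for the p-003-turbo-2401 row of the evaluation table.
scores = {
    "ARC_C": 0.3114,
    "HellaSwag": 0.5805,
    "PIQA": 0.7258,
    "Winogrande": 0.5959,
}
average = sum(scores.values()) / len(scores)
print(round(average, 4))  # 0.5534
```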

This model performs on par with the TinyLlama base model while being half its size.

### prompt 📝
```
no prompt 🚀
```

### Note
As of 1/4/2024, it is still not possible to convert this model to GGUF; [see more here](https://github.com/ggerganov/llama.cpp/issues/4199#issuecomment-1825833475).

<a href="https://ko-fi.com/appvoid" target="_blank"><img src="https://cdn.buymeacoffee.com/buttons/v2/default-yellow.png" alt="Buy Me A Coffee" style="height: 48px !important;width: 180px !important; filter: invert(70%);" ></a>