---
library_name: transformers
license: apache-2.0
pipeline_tag: text-generation
---

![image/png](https://cdn-uploads.huggingface.co/production/uploads/65f22e4076fedc4fd11e978f/MoTedec_ZL8GM2MmGyAPs.png)


![image/png](https://cdn-uploads.huggingface.co/production/uploads/65f22e4076fedc4fd11e978f/LJ6jXeMTrPobAzgpfklhI.png)



# T3Q-ko-solar-dpo-v6.0

## This model is a DPO fine-tuned version of T3Q-ko-solar-dpo-v5.0.

## Model Developers: Chihoon Lee (chihoonlee10), T3Q
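
The model can be loaded with the 🤗 Transformers library, as declared in the card metadata. The snippet below is a minimal usage sketch; the prompt, dtype, and generation settings are illustrative assumptions rather than settings confirmed by the authors.

```python
# Minimal text-generation sketch with 🤗 Transformers.
# Prompt, dtype, and generation parameters are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "chihoonlee10/T3Q-ko-solar-dpo-v6.0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumed dtype; adjust for your hardware
    device_map="auto",
)

prompt = "한국의 수도는 어디인가요?"  # example Korean prompt (assumption)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```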

hf (pretrained=chihoonlee10/T3Q-ko-solar-dpo-v6.0), limit: None, provide_description: False, num_fewshot: 0, batch_size: None
|      Task      |Version| Metric |Value |   |Stderr|
|----------------|------:|--------|-----:|---|-----:|
|kobest_boolq    |      0|acc     |0.5028|±  |0.0133|
|                |       |macro_f1|0.3396|±  |0.0067|
|kobest_copa     |      0|acc     |0.8020|±  |0.0126|
|                |       |macro_f1|0.8018|±  |0.0126|
|kobest_hellaswag|      0|acc     |0.5340|±  |0.0223|
|                |       |acc_norm|0.5720|±  |0.0221|
|                |       |macro_f1|0.5322|±  |0.0224|
|kobest_sentineg |      0|acc     |0.7985|±  |0.0202|
|                |       |macro_f1|0.7956|±  |0.0205|
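
For reference, the zero-shot KoBEST scores above follow the configuration shown in the header (num_fewshot: 0, batch_size: None). The sketch below shows how a comparable run could be launched with EleutherAI's lm-evaluation-harness, assuming a v0.4+ release; the exact harness version used for this card is not stated.

```python
# Sketch of re-running the KoBEST evaluation with EleutherAI's
# lm-evaluation-harness (assumes a v0.4+ release is installed).
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=chihoonlee10/T3Q-ko-solar-dpo-v6.0",
    tasks=["kobest_boolq", "kobest_copa", "kobest_hellaswag", "kobest_sentineg"],
    num_fewshot=0,  # matches num_fewshot: 0 in the header above
)

# Print per-task metrics (acc, macro_f1, etc.)
for task, metrics in results["results"].items():
    print(task, metrics)
```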