---
base_model:
- davidkim205/nox-solar-10.7b-v2
- chihoonlee10/T3Q-ko-solar-dpo-v6.0
library_name: transformers
tags:
- mergekit
- merge

---
# model_storage

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the SLERP merge method.
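
SLERP interpolates between the two models' weight tensors along the great-circle arc between them rather than along a straight line, which better preserves the magnitude of the weights. A minimal, self-contained sketch of the underlying math (a hypothetical `slerp` helper for illustration, not mergekit's actual implementation):

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flat weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t moves along the arc.
    """
    norm0 = math.sqrt(sum(x * x for x in v0))
    norm1 = math.sqrt(sum(x * x for x in v1))
    # Cosine of the angle between the two vectors, clamped for safety.
    dot = sum(a * b for a, b in zip(v0, v1)) / (norm0 * norm1)
    dot = max(-1.0, min(1.0, dot))
    omega = math.acos(dot)
    if abs(omega) < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]
```

At `t = 0` the result is the first model's weights, at `t = 1` the second's; the per-layer `t` schedule in the configuration below controls how the blend shifts across the network.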

### Models Merged

The following models were included in the merge:
* [davidkim205/nox-solar-10.7b-v2](https://huggingface.co/davidkim205/nox-solar-10.7b-v2)
* [chihoonlee10/T3Q-ko-solar-dpo-v6.0](https://huggingface.co/chihoonlee10/T3Q-ko-solar-dpo-v6.0)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model:
  model:
    path: chihoonlee10/T3Q-ko-solar-dpo-v6.0
dtype: float16
merge_method: slerp
parameters:
  t:
  - filter: self_attn
    value: [0.0, 0.5, 0.3, 0.7, 1.0]
  - filter: mlp
    value: [1.0, 0.5, 0.7, 0.3, 0.0]
  - value: 0.5
slices:
- sources:
  - layer_range: [0, 47]
    model:
      model:
        path: chihoonlee10/T3Q-ko-solar-dpo-v6.0
  - layer_range: [0, 47]
    model:
      model:
        path: davidkim205/nox-solar-10.7b-v2
```
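
The `t` entries above are gradients: mergekit spreads the listed anchor values across the layers in the slice, so each layer gets its own interpolation factor (e.g. the `self_attn` schedule starts at the first model and ends at the second). A sketch of that scheduling (hypothetical `layer_t` helper, assuming simple linear interpolation between evenly spaced anchors):

```python
def layer_t(gradient, layer_idx, num_layers):
    """Map a gradient list of anchor values to a per-layer t factor."""
    if num_layers == 1:
        return gradient[0]
    # Position of this layer along the gradient, in [0, len(gradient) - 1].
    pos = layer_idx / (num_layers - 1) * (len(gradient) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(gradient) - 1)
    frac = pos - lo
    # Linear blend between the two surrounding anchor values.
    return gradient[lo] * (1 - frac) + gradient[hi] * frac
```

With the `self_attn` gradient `[0.0, 0.5, 0.3, 0.7, 1.0]`, the first layer uses `t = 0.0` (pure base model) and the last uses `t = 1.0` (pure second model), with the middle anchors shaping the transition in between.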
## Evaluation Results

| Model | Writing | Comprehension | Grammar |
| --- | --- | --- | --- |
| HyperClovaX | 8.50 | 9.50 | **8.50** |
| solar-1-mini-chat | 8.50 | 7.00 | 5.21 |
| allganize/Llama-3-Alpha-Ko-8B-Instruct | 8.50 | 8.35 | 4.92 |
| Synatra-kiqu-7B | 4.42 | 5.71 | 4.50 |
| **Ocelot-ko-10.8B** | **8.57** | 7.00 | 6.57 |