chlee10 committed on
Commit
0f913fb
1 Parent(s): c83dd54

Update README.md

Files changed (1): README.md (+5 −5)
README.md CHANGED
```diff
@@ -5,8 +5,8 @@ license: apache-2.0
 ## T3Q-MSlerp-7Bx2
 
 T3Q-MSlerp-7Bx2 is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
-* [zhengr/MixTAO-7Bx2-MoE-v8.1](https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-v8.1)
-* [yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B](https://huggingface.co/yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B)
+* [zhengr/MixTAO-7Bx2-MoE-Instruct-v7.0](https://huggingface.co/zhengr/MixTAO-7Bx2-MoE-Instruct-v7.0)
+* [yunconglong/13B_MATH_DPO](https://huggingface.co/yunconglong/13B_MATH_DPO)
 
 
 **Model Developers** Chihoon Lee(chlee10), T3Q
@@ -16,13 +16,13 @@ T3Q-MSlerp-7Bx2 is a merge of the following models using [mergekit](https://gith
 
 slices:
   - sources:
-      - model: zhengr/MixTAO-7Bx2-MoE-v8.1
+      - model: zhengr/MixTAO-7Bx2-MoE-Instruct-v7.0
         layer_range: [0, 32]
-      - model: yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
+      - model: yunconglong/13B_MATH_DPO
         layer_range: [0, 32]
 
 merge_method: slerp
-base_model: zhengr/MixTAO-7Bx2-MoE-v8.1
+base_model: zhengr/MixTAO-7Bx2-MoE-Instruct-v7.0
 
 parameters:
   t:
```
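
The config's `merge_method: slerp` blends the two models' weights along the arc between them rather than along a straight line, which better preserves weight norms. A minimal NumPy sketch of the spherical-linear-interpolation formula (an illustration only, not mergekit's actual implementation; the vectors stand in for flattened weight tensors):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flat weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the (normalized) directions of v0 and v1.
    """
    # Angle between the two vectors, from their normalized dot product
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    omega = np.arccos(dot)
    if omega < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return (1.0 - t) * v0 + t * v1
    so = np.sin(omega)
    return np.sin((1.0 - t) * omega) / so * v0 + np.sin(t * omega) / so * v1
```

The `t:` parameter in the config above sets this interpolation factor; mergekit applies it per layer slice, which is why the config pairs each model with a `layer_range`.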