Muennighoff committed on
Commit 97ef89b
1 Parent(s): 717413c

Add MTEB metadata
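
The entries added below follow the Hugging Face Hub `model-index` schema and report MTEB benchmark scores. As a hedged illustration only (not necessarily the exact command behind this commit, which may have used MTEB's model-card helper script), results of this shape can be produced with the `mteb` package and then copied into the README front matter:

```python
# Minimal sketch, assuming the mteb + sentence-transformers APIs of this era.
# The task name and output folder are illustrative, not taken from this commit.
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
evaluation = MTEB(tasks=["Banking77Classification"])  # one of the tasks reported below
evaluation.run(model, output_folder="results/all-MiniLM-L6-v2")  # writes per-task JSON scores
```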

Files changed (1)
  1. README.md +2744 -1
README.md CHANGED
@@ -4,6 +4,7 @@ tags:
  - sentence-transformers
  - feature-extraction
  - sentence-similarity
+ - mteb
  language: en
  license: apache-2.0
  datasets:
@@ -28,7 +29,2749 @@ datasets:
  - embedding-data/SPECTER
  - embedding-data/PAQ_pairs
  - embedding-data/WikiAnswers
-
+ model-index:
33
+ - name: all-MiniLM-L6-v2
34
+ results:
35
+ - task:
36
+ type: Classification
37
+ dataset:
38
+ type: mteb/amazon_counterfactual
39
+ name: MTEB AmazonCounterfactualClassification (en)
40
+ config: en
41
+ split: test
42
+ metrics:
43
+ - type: accuracy
44
+ value: 64.14925373134331
45
+ - type: ap
46
+ value: 27.237875815186907
47
+ - type: f1
48
+ value: 58.03105716318715
49
+ - task:
50
+ type: Classification
51
+ dataset:
52
+ type: mteb/amazon_polarity
53
+ name: MTEB AmazonPolarityClassification
54
+ config: default
55
+ split: test
56
+ metrics:
57
+ - type: accuracy
58
+ value: 62.582975
59
+ - type: ap
60
+ value: 58.26562634146188
61
+ - type: f1
62
+ value: 62.304673961004156
63
+ - task:
64
+ type: Classification
65
+ dataset:
66
+ type: mteb/amazon_reviews_multi
67
+ name: MTEB AmazonReviewsClassification (en)
68
+ config: en
69
+ split: test
70
+ metrics:
71
+ - type: accuracy
72
+ value: 31.785999999999998
73
+ - type: f1
74
+ value: 31.40726949960717
75
+ - task:
76
+ type: Retrieval
77
+ dataset:
78
+ type: arguana
79
+ name: MTEB ArguAna
80
+ config: default
81
+ split: test
82
+ metrics:
83
+ - type: map_at_1
84
+ value: 25.605
85
+ - type: map_at_10
86
+ value: 41.165
87
+ - type: map_at_100
88
+ value: 42.230000000000004
89
+ - type: map_at_1000
90
+ value: 42.241
91
+ - type: map_at_3
92
+ value: 35.965
93
+ - type: map_at_5
94
+ value: 38.981
95
+ - type: ndcg_at_1
96
+ value: 25.605
97
+ - type: ndcg_at_10
98
+ value: 50.166999999999994
99
+ - type: ndcg_at_100
100
+ value: 54.534000000000006
101
+ - type: ndcg_at_1000
102
+ value: 54.772
103
+ - type: ndcg_at_3
104
+ value: 39.434000000000005
105
+ - type: ndcg_at_5
106
+ value: 44.876
107
+ - type: precision_at_1
108
+ value: 25.605
109
+ - type: precision_at_10
110
+ value: 7.908999999999999
111
+ - type: precision_at_100
112
+ value: 0.9769999999999999
113
+ - type: precision_at_1000
114
+ value: 0.1
115
+ - type: precision_at_3
116
+ value: 16.500999999999998
117
+ - type: precision_at_5
118
+ value: 12.546
119
+ - type: recall_at_1
120
+ value: 25.605
121
+ - type: recall_at_10
122
+ value: 79.09
123
+ - type: recall_at_100
124
+ value: 97.724
125
+ - type: recall_at_1000
126
+ value: 99.502
127
+ - type: recall_at_3
128
+ value: 49.502
129
+ - type: recall_at_5
130
+ value: 62.731
131
+ - task:
132
+ type: Clustering
133
+ dataset:
134
+ type: mteb/arxiv-clustering-p2p
135
+ name: MTEB ArxivClusteringP2P
136
+ config: default
137
+ split: test
138
+ metrics:
139
+ - type: v_measure
140
+ value: 46.54595079050156
141
+ - task:
142
+ type: Clustering
143
+ dataset:
144
+ type: mteb/arxiv-clustering-s2s
145
+ name: MTEB ArxivClusteringS2S
146
+ config: default
147
+ split: test
148
+ metrics:
149
+ - type: v_measure
150
+ value: 37.85709823840442
151
+ - task:
152
+ type: Reranking
153
+ dataset:
154
+ type: mteb/askubuntudupquestions-reranking
155
+ name: MTEB AskUbuntuDupQuestions
156
+ config: default
157
+ split: test
158
+ metrics:
159
+ - type: map
160
+ value: 63.47681681237331
161
+ - type: mrr
162
+ value: 77.08657608934617
163
+ - task:
164
+ type: STS
165
+ dataset:
166
+ type: mteb/biosses-sts
167
+ name: MTEB BIOSSES
168
+ config: default
169
+ split: test
170
+ metrics:
171
+ - type: cos_sim_pearson
172
+ value: 84.41897516342782
173
+ - type: cos_sim_spearman
174
+ value: 81.64041444909368
175
+ - type: euclidean_pearson
176
+ value: 82.67500318274435
177
+ - type: euclidean_spearman
178
+ value: 81.64041444909368
179
+ - type: manhattan_pearson
180
+ value: 82.35165974372227
181
+ - type: manhattan_spearman
182
+ value: 81.50968857884978
183
+ - task:
184
+ type: Classification
185
+ dataset:
186
+ type: mteb/banking77
187
+ name: MTEB Banking77Classification
188
+ config: default
189
+ split: test
190
+ metrics:
191
+ - type: accuracy
192
+ value: 79.75000000000001
193
+ - type: f1
194
+ value: 78.92604185699534
195
+ - task:
196
+ type: Clustering
197
+ dataset:
198
+ type: mteb/biorxiv-clustering-p2p
199
+ name: MTEB BiorxivClusteringP2P
200
+ config: default
201
+ split: test
202
+ metrics:
203
+ - type: v_measure
204
+ value: 38.48301914135123
205
+ - task:
206
+ type: Clustering
207
+ dataset:
208
+ type: mteb/biorxiv-clustering-s2s
209
+ name: MTEB BiorxivClusteringS2S
210
+ config: default
211
+ split: test
212
+ metrics:
213
+ - type: v_measure
214
+ value: 33.170209943399804
215
+ - task:
216
+ type: Retrieval
217
+ dataset:
218
+ type: BeIR/cqadupstack
219
+ name: MTEB CQADupstackAndroidRetrieval
220
+ config: default
221
+ split: test
222
+ metrics:
223
+ - type: map_at_1
224
+ value: 34.660000000000004
225
+ - type: map_at_10
226
+ value: 46.938
227
+ - type: map_at_100
228
+ value: 48.435
229
+ - type: map_at_1000
230
+ value: 48.555
231
+ - type: map_at_3
232
+ value: 43.034
233
+ - type: map_at_5
234
+ value: 45.055
235
+ - type: ndcg_at_1
236
+ value: 42.775
237
+ - type: ndcg_at_10
238
+ value: 53.82900000000001
239
+ - type: ndcg_at_100
240
+ value: 58.74700000000001
241
+ - type: ndcg_at_1000
242
+ value: 60.309000000000005
243
+ - type: ndcg_at_3
244
+ value: 48.487
245
+ - type: ndcg_at_5
246
+ value: 50.722
247
+ - type: precision_at_1
248
+ value: 42.775
249
+ - type: precision_at_10
250
+ value: 10.629
251
+ - type: precision_at_100
252
+ value: 1.652
253
+ - type: precision_at_1000
254
+ value: 0.209
255
+ - type: precision_at_3
256
+ value: 23.366999999999997
257
+ - type: precision_at_5
258
+ value: 16.967
259
+ - type: recall_at_1
260
+ value: 34.660000000000004
261
+ - type: recall_at_10
262
+ value: 66.465
263
+ - type: recall_at_100
264
+ value: 87.559
265
+ - type: recall_at_1000
266
+ value: 97.18299999999999
267
+ - type: recall_at_3
268
+ value: 51.01
269
+ - type: recall_at_5
270
+ value: 57.412
271
+ - task:
272
+ type: Retrieval
273
+ dataset:
274
+ type: BeIR/cqadupstack
275
+ name: MTEB CQADupstackEnglishRetrieval
276
+ config: default
277
+ split: test
278
+ metrics:
279
+ - type: map_at_1
280
+ value: 31.180999999999997
281
+ - type: map_at_10
282
+ value: 41.802
283
+ - type: map_at_100
284
+ value: 43.294
285
+ - type: map_at_1000
286
+ value: 43.438
287
+ - type: map_at_3
288
+ value: 38.668
289
+ - type: map_at_5
290
+ value: 40.559
291
+ - type: ndcg_at_1
292
+ value: 39.489999999999995
293
+ - type: ndcg_at_10
294
+ value: 47.776
295
+ - type: ndcg_at_100
296
+ value: 52.705
297
+ - type: ndcg_at_1000
298
+ value: 54.830999999999996
299
+ - type: ndcg_at_3
300
+ value: 43.649
301
+ - type: ndcg_at_5
302
+ value: 45.885
303
+ - type: precision_at_1
304
+ value: 39.489999999999995
305
+ - type: precision_at_10
306
+ value: 9.121
307
+ - type: precision_at_100
308
+ value: 1.504
309
+ - type: precision_at_1000
310
+ value: 0.2
311
+ - type: precision_at_3
312
+ value: 21.38
313
+ - type: precision_at_5
314
+ value: 15.35
315
+ - type: recall_at_1
316
+ value: 31.180999999999997
317
+ - type: recall_at_10
318
+ value: 57.714
319
+ - type: recall_at_100
320
+ value: 78.342
321
+ - type: recall_at_1000
322
+ value: 91.586
323
+ - type: recall_at_3
324
+ value: 45.255
325
+ - type: recall_at_5
326
+ value: 51.459999999999994
327
+ - task:
328
+ type: Retrieval
329
+ dataset:
330
+ type: BeIR/cqadupstack
331
+ name: MTEB CQADupstackGamingRetrieval
332
+ config: default
333
+ split: test
334
+ metrics:
335
+ - type: map_at_1
336
+ value: 38.732
337
+ - type: map_at_10
338
+ value: 51.03
339
+ - type: map_at_100
340
+ value: 52.078
341
+ - type: map_at_1000
342
+ value: 52.132
343
+ - type: map_at_3
344
+ value: 47.735
345
+ - type: map_at_5
346
+ value: 49.562
347
+ - type: ndcg_at_1
348
+ value: 44.074999999999996
349
+ - type: ndcg_at_10
350
+ value: 56.923
351
+ - type: ndcg_at_100
352
+ value: 61.004999999999995
353
+ - type: ndcg_at_1000
354
+ value: 62.12800000000001
355
+ - type: ndcg_at_3
356
+ value: 51.381
357
+ - type: ndcg_at_5
358
+ value: 54.027
359
+ - type: precision_at_1
360
+ value: 44.074999999999996
361
+ - type: precision_at_10
362
+ value: 9.21
363
+ - type: precision_at_100
364
+ value: 1.221
365
+ - type: precision_at_1000
366
+ value: 0.136
367
+ - type: precision_at_3
368
+ value: 23.009
369
+ - type: precision_at_5
370
+ value: 15.748999999999999
371
+ - type: recall_at_1
372
+ value: 38.732
373
+ - type: recall_at_10
374
+ value: 71.154
375
+ - type: recall_at_100
376
+ value: 88.676
377
+ - type: recall_at_1000
378
+ value: 96.718
379
+ - type: recall_at_3
380
+ value: 56.288000000000004
381
+ - type: recall_at_5
382
+ value: 62.792
383
+ - task:
384
+ type: Retrieval
385
+ dataset:
386
+ type: BeIR/cqadupstack
387
+ name: MTEB CQADupstackGisRetrieval
388
+ config: default
389
+ split: test
390
+ metrics:
391
+ - type: map_at_1
392
+ value: 26.837
393
+ - type: map_at_10
394
+ value: 35.959
395
+ - type: map_at_100
396
+ value: 37.172
397
+ - type: map_at_1000
398
+ value: 37.241
399
+ - type: map_at_3
400
+ value: 33.027
401
+ - type: map_at_5
402
+ value: 34.699000000000005
403
+ - type: ndcg_at_1
404
+ value: 29.378999999999998
405
+ - type: ndcg_at_10
406
+ value: 41.31
407
+ - type: ndcg_at_100
408
+ value: 47.058
409
+ - type: ndcg_at_1000
410
+ value: 48.777
411
+ - type: ndcg_at_3
412
+ value: 35.564
413
+ - type: ndcg_at_5
414
+ value: 38.384
415
+ - type: precision_at_1
416
+ value: 29.378999999999998
417
+ - type: precision_at_10
418
+ value: 6.361999999999999
419
+ - type: precision_at_100
420
+ value: 0.98
421
+ - type: precision_at_1000
422
+ value: 0.117
423
+ - type: precision_at_3
424
+ value: 15.028
425
+ - type: precision_at_5
426
+ value: 10.667
427
+ - type: recall_at_1
428
+ value: 26.837
429
+ - type: recall_at_10
430
+ value: 55.667
431
+ - type: recall_at_100
432
+ value: 81.843
433
+ - type: recall_at_1000
434
+ value: 94.707
435
+ - type: recall_at_3
436
+ value: 40.049
437
+ - type: recall_at_5
438
+ value: 46.92
439
+ - task:
440
+ type: Retrieval
441
+ dataset:
442
+ type: BeIR/cqadupstack
443
+ name: MTEB CQADupstackMathematicaRetrieval
444
+ config: default
445
+ split: test
446
+ metrics:
447
+ - type: map_at_1
448
+ value: 15.142
449
+ - type: map_at_10
450
+ value: 23.727999999999998
451
+ - type: map_at_100
452
+ value: 25.137999999999998
453
+ - type: map_at_1000
454
+ value: 25.256
455
+ - type: map_at_3
456
+ value: 20.673
457
+ - type: map_at_5
458
+ value: 22.325999999999997
459
+ - type: ndcg_at_1
460
+ value: 18.407999999999998
461
+ - type: ndcg_at_10
462
+ value: 29.286
463
+ - type: ndcg_at_100
464
+ value: 35.753
465
+ - type: ndcg_at_1000
466
+ value: 38.541
467
+ - type: ndcg_at_3
468
+ value: 23.599
469
+ - type: ndcg_at_5
470
+ value: 26.262
471
+ - type: precision_at_1
472
+ value: 18.407999999999998
473
+ - type: precision_at_10
474
+ value: 5.697
475
+ - type: precision_at_100
476
+ value: 1.034
477
+ - type: precision_at_1000
478
+ value: 0.14100000000000001
479
+ - type: precision_at_3
480
+ value: 11.567
481
+ - type: precision_at_5
482
+ value: 8.781
483
+ - type: recall_at_1
484
+ value: 15.142
485
+ - type: recall_at_10
486
+ value: 42.476
487
+ - type: recall_at_100
488
+ value: 70.22699999999999
489
+ - type: recall_at_1000
490
+ value: 90.02799999999999
491
+ - type: recall_at_3
492
+ value: 27.056
493
+ - type: recall_at_5
494
+ value: 33.663
495
+ - task:
496
+ type: Retrieval
497
+ dataset:
498
+ type: BeIR/cqadupstack
499
+ name: MTEB CQADupstackPhysicsRetrieval
500
+ config: default
501
+ split: test
502
+ metrics:
503
+ - type: map_at_1
504
+ value: 29.142000000000003
505
+ - type: map_at_10
506
+ value: 40.735
507
+ - type: map_at_100
508
+ value: 42.155
509
+ - type: map_at_1000
510
+ value: 42.27
511
+ - type: map_at_3
512
+ value: 37.491
513
+ - type: map_at_5
514
+ value: 39.475
515
+ - type: ndcg_at_1
516
+ value: 35.515
517
+ - type: ndcg_at_10
518
+ value: 46.982
519
+ - type: ndcg_at_100
520
+ value: 52.913
521
+ - type: ndcg_at_1000
522
+ value: 54.759
523
+ - type: ndcg_at_3
524
+ value: 42.164
525
+ - type: ndcg_at_5
526
+ value: 44.674
527
+ - type: precision_at_1
528
+ value: 35.515
529
+ - type: precision_at_10
530
+ value: 8.624
531
+ - type: precision_at_100
532
+ value: 1.377
533
+ - type: precision_at_1000
534
+ value: 0.173
535
+ - type: precision_at_3
536
+ value: 20.468
537
+ - type: precision_at_5
538
+ value: 14.649000000000001
539
+ - type: recall_at_1
540
+ value: 29.142000000000003
541
+ - type: recall_at_10
542
+ value: 59.693
543
+ - type: recall_at_100
544
+ value: 84.84899999999999
545
+ - type: recall_at_1000
546
+ value: 96.451
547
+ - type: recall_at_3
548
+ value: 46.086
549
+ - type: recall_at_5
550
+ value: 52.556000000000004
551
+ - task:
552
+ type: Retrieval
553
+ dataset:
554
+ type: BeIR/cqadupstack
555
+ name: MTEB CQADupstackProgrammersRetrieval
556
+ config: default
557
+ split: test
558
+ metrics:
559
+ - type: map_at_1
560
+ value: 22.081999999999997
561
+ - type: map_at_10
562
+ value: 32.74
563
+ - type: map_at_100
564
+ value: 34.108
565
+ - type: map_at_1000
566
+ value: 34.233000000000004
567
+ - type: map_at_3
568
+ value: 29.282999999999998
569
+ - type: map_at_5
570
+ value: 31.127
571
+ - type: ndcg_at_1
572
+ value: 26.712000000000003
573
+ - type: ndcg_at_10
574
+ value: 38.968
575
+ - type: ndcg_at_100
576
+ value: 44.64
577
+ - type: ndcg_at_1000
578
+ value: 47.193000000000005
579
+ - type: ndcg_at_3
580
+ value: 33.311
581
+ - type: ndcg_at_5
582
+ value: 35.76
583
+ - type: precision_at_1
584
+ value: 26.712000000000003
585
+ - type: precision_at_10
586
+ value: 7.534000000000001
587
+ - type: precision_at_100
588
+ value: 1.2149999999999999
589
+ - type: precision_at_1000
590
+ value: 0.163
591
+ - type: precision_at_3
592
+ value: 16.476
593
+ - type: precision_at_5
594
+ value: 12.009
595
+ - type: recall_at_1
596
+ value: 22.081999999999997
597
+ - type: recall_at_10
598
+ value: 52.859
599
+ - type: recall_at_100
600
+ value: 76.812
601
+ - type: recall_at_1000
602
+ value: 94.248
603
+ - type: recall_at_3
604
+ value: 36.964999999999996
605
+ - type: recall_at_5
606
+ value: 43.338
607
+ - task:
608
+ type: Retrieval
609
+ dataset:
610
+ type: BeIR/cqadupstack
611
+ name: MTEB CQADupstackRetrieval
612
+ config: default
613
+ split: test
614
+ metrics:
615
+ - type: map_at_1
616
+ value: 25.825750000000003
617
+ - type: map_at_10
618
+ value: 35.614666666666665
619
+ - type: map_at_100
620
+ value: 36.952416666666664
621
+ - type: map_at_1000
622
+ value: 37.07433333333334
623
+ - type: map_at_3
624
+ value: 32.519916666666674
625
+ - type: map_at_5
626
+ value: 34.22966666666667
627
+ - type: ndcg_at_1
628
+ value: 30.616416666666662
629
+ - type: ndcg_at_10
630
+ value: 41.32475
631
+ - type: ndcg_at_100
632
+ value: 46.907
633
+ - type: ndcg_at_1000
634
+ value: 49.12475
635
+ - type: ndcg_at_3
636
+ value: 36.1415
637
+ - type: ndcg_at_5
638
+ value: 38.54916666666666
639
+ - type: precision_at_1
640
+ value: 30.616416666666662
641
+ - type: precision_at_10
642
+ value: 7.427166666666666
643
+ - type: precision_at_100
644
+ value: 1.2174166666666666
645
+ - type: precision_at_1000
646
+ value: 0.16066666666666665
647
+ - type: precision_at_3
648
+ value: 16.849083333333333
649
+ - type: precision_at_5
650
+ value: 12.1105
651
+ - type: recall_at_1
652
+ value: 25.825750000000003
653
+ - type: recall_at_10
654
+ value: 53.95283333333333
655
+ - type: recall_at_100
656
+ value: 78.408
657
+ - type: recall_at_1000
658
+ value: 93.60841666666666
659
+ - type: recall_at_3
660
+ value: 39.51116666666667
661
+ - type: recall_at_5
662
+ value: 45.67041666666667
663
+ - task:
664
+ type: Retrieval
665
+ dataset:
666
+ type: BeIR/cqadupstack
667
+ name: MTEB CQADupstackStatsRetrieval
668
+ config: default
669
+ split: test
670
+ metrics:
671
+ - type: map_at_1
672
+ value: 23.147000000000002
673
+ - type: map_at_10
674
+ value: 30.867
675
+ - type: map_at_100
676
+ value: 31.961000000000002
677
+ - type: map_at_1000
678
+ value: 32.074999999999996
679
+ - type: map_at_3
680
+ value: 28.598000000000003
681
+ - type: map_at_5
682
+ value: 29.715000000000003
683
+ - type: ndcg_at_1
684
+ value: 26.074
685
+ - type: ndcg_at_10
686
+ value: 35.379
687
+ - type: ndcg_at_100
688
+ value: 40.668
689
+ - type: ndcg_at_1000
690
+ value: 43.271
691
+ - type: ndcg_at_3
692
+ value: 31.291000000000004
693
+ - type: ndcg_at_5
694
+ value: 32.828
695
+ - type: precision_at_1
696
+ value: 26.074
697
+ - type: precision_at_10
698
+ value: 5.782
699
+ - type: precision_at_100
700
+ value: 0.9159999999999999
701
+ - type: precision_at_1000
702
+ value: 0.121
703
+ - type: precision_at_3
704
+ value: 13.905999999999999
705
+ - type: precision_at_5
706
+ value: 9.508999999999999
707
+ - type: recall_at_1
708
+ value: 23.147000000000002
709
+ - type: recall_at_10
710
+ value: 46.308
711
+ - type: recall_at_100
712
+ value: 70.529
713
+ - type: recall_at_1000
714
+ value: 89.53
715
+ - type: recall_at_3
716
+ value: 34.504000000000005
717
+ - type: recall_at_5
718
+ value: 38.472
719
+ - task:
720
+ type: Retrieval
721
+ dataset:
722
+ type: BeIR/cqadupstack
723
+ name: MTEB CQADupstackTexRetrieval
724
+ config: default
725
+ split: test
726
+ metrics:
727
+ - type: map_at_1
728
+ value: 17.573
729
+ - type: map_at_10
730
+ value: 25.480999999999998
731
+ - type: map_at_100
732
+ value: 26.740000000000002
733
+ - type: map_at_1000
734
+ value: 26.881
735
+ - type: map_at_3
736
+ value: 22.962
737
+ - type: map_at_5
738
+ value: 24.366
739
+ - type: ndcg_at_1
740
+ value: 21.783
741
+ - type: ndcg_at_10
742
+ value: 30.519000000000002
743
+ - type: ndcg_at_100
744
+ value: 36.449
745
+ - type: ndcg_at_1000
746
+ value: 39.476
747
+ - type: ndcg_at_3
748
+ value: 26.104
749
+ - type: ndcg_at_5
750
+ value: 28.142
751
+ - type: precision_at_1
752
+ value: 21.783
753
+ - type: precision_at_10
754
+ value: 5.716
755
+ - type: precision_at_100
756
+ value: 1.036
757
+ - type: precision_at_1000
758
+ value: 0.149
759
+ - type: precision_at_3
760
+ value: 12.629000000000001
761
+ - type: precision_at_5
762
+ value: 9.188
763
+ - type: recall_at_1
764
+ value: 17.573
765
+ - type: recall_at_10
766
+ value: 41.565999999999995
767
+ - type: recall_at_100
768
+ value: 68.31099999999999
769
+ - type: recall_at_1000
770
+ value: 89.66
771
+ - type: recall_at_3
772
+ value: 28.998
773
+ - type: recall_at_5
774
+ value: 34.409
775
+ - task:
776
+ type: Retrieval
777
+ dataset:
778
+ type: BeIR/cqadupstack
779
+ name: MTEB CQADupstackUnixRetrieval
780
+ config: default
781
+ split: test
782
+ metrics:
783
+ - type: map_at_1
784
+ value: 25.393
785
+ - type: map_at_10
786
+ value: 35.408
787
+ - type: map_at_100
788
+ value: 36.765
789
+ - type: map_at_1000
790
+ value: 36.870000000000005
791
+ - type: map_at_3
792
+ value: 31.858999999999998
793
+ - type: map_at_5
794
+ value: 34.088
795
+ - type: ndcg_at_1
796
+ value: 30.409999999999997
797
+ - type: ndcg_at_10
798
+ value: 41.31
799
+ - type: ndcg_at_100
800
+ value: 47.317
801
+ - type: ndcg_at_1000
802
+ value: 49.451
803
+ - type: ndcg_at_3
804
+ value: 35.156
805
+ - type: ndcg_at_5
806
+ value: 38.550000000000004
807
+ - type: precision_at_1
808
+ value: 30.409999999999997
809
+ - type: precision_at_10
810
+ value: 7.285
811
+ - type: precision_at_100
812
+ value: 1.16
813
+ - type: precision_at_1000
814
+ value: 0.145
815
+ - type: precision_at_3
816
+ value: 16.2
817
+ - type: precision_at_5
818
+ value: 12.015
819
+ - type: recall_at_1
820
+ value: 25.393
821
+ - type: recall_at_10
822
+ value: 54.955
823
+ - type: recall_at_100
824
+ value: 81.074
825
+ - type: recall_at_1000
826
+ value: 95.517
827
+ - type: recall_at_3
828
+ value: 38.646
829
+ - type: recall_at_5
830
+ value: 47.155
831
+ - task:
832
+ type: Retrieval
833
+ dataset:
834
+ type: BeIR/cqadupstack
835
+ name: MTEB CQADupstackWebmastersRetrieval
836
+ config: default
837
+ split: test
838
+ metrics:
839
+ - type: map_at_1
840
+ value: 25.219
841
+ - type: map_at_10
842
+ value: 34.317
843
+ - type: map_at_100
844
+ value: 36.099
845
+ - type: map_at_1000
846
+ value: 36.339
847
+ - type: map_at_3
848
+ value: 31.118000000000002
849
+ - type: map_at_5
850
+ value: 32.759
851
+ - type: ndcg_at_1
852
+ value: 30.04
853
+ - type: ndcg_at_10
854
+ value: 40.467
855
+ - type: ndcg_at_100
856
+ value: 46.918
857
+ - type: ndcg_at_1000
858
+ value: 49.263
859
+ - type: ndcg_at_3
860
+ value: 34.976
861
+ - type: ndcg_at_5
862
+ value: 37.345
863
+ - type: precision_at_1
864
+ value: 30.04
865
+ - type: precision_at_10
866
+ value: 7.786999999999999
867
+ - type: precision_at_100
868
+ value: 1.638
869
+ - type: precision_at_1000
870
+ value: 0.249
871
+ - type: precision_at_3
872
+ value: 16.206
873
+ - type: precision_at_5
874
+ value: 11.976
875
+ - type: recall_at_1
876
+ value: 25.219
877
+ - type: recall_at_10
878
+ value: 52.443
879
+ - type: recall_at_100
880
+ value: 80.523
881
+ - type: recall_at_1000
882
+ value: 95.025
883
+ - type: recall_at_3
884
+ value: 37.216
885
+ - type: recall_at_5
886
+ value: 43.086999999999996
887
+ - task:
888
+ type: Retrieval
889
+ dataset:
890
+ type: BeIR/cqadupstack
891
+ name: MTEB CQADupstackWordpressRetrieval
892
+ config: default
893
+ split: test
894
+ metrics:
895
+ - type: map_at_1
896
+ value: 20.801
897
+ - type: map_at_10
898
+ value: 28.371000000000002
899
+ - type: map_at_100
900
+ value: 29.483999999999998
901
+ - type: map_at_1000
902
+ value: 29.602
903
+ - type: map_at_3
904
+ value: 25.790999999999997
905
+ - type: map_at_5
906
+ value: 27.025
907
+ - type: ndcg_at_1
908
+ value: 22.736
909
+ - type: ndcg_at_10
910
+ value: 33.147999999999996
911
+ - type: ndcg_at_100
912
+ value: 38.711
913
+ - type: ndcg_at_1000
914
+ value: 41.498000000000005
915
+ - type: ndcg_at_3
916
+ value: 28.016000000000002
917
+ - type: ndcg_at_5
918
+ value: 30.011
919
+ - type: precision_at_1
920
+ value: 22.736
921
+ - type: precision_at_10
922
+ value: 5.379
923
+ - type: precision_at_100
924
+ value: 0.876
925
+ - type: precision_at_1000
926
+ value: 0.125
927
+ - type: precision_at_3
928
+ value: 11.953
929
+ - type: precision_at_5
930
+ value: 8.466
931
+ - type: recall_at_1
932
+ value: 20.801
933
+ - type: recall_at_10
934
+ value: 46.134
935
+ - type: recall_at_100
936
+ value: 72.151
937
+ - type: recall_at_1000
938
+ value: 92.648
939
+ - type: recall_at_3
940
+ value: 32.061
941
+ - type: recall_at_5
942
+ value: 36.781000000000006
943
+ - task:
944
+ type: Retrieval
945
+ dataset:
946
+ type: climate-fever
947
+ name: MTEB ClimateFEVER
948
+ config: default
949
+ split: test
950
+ metrics:
951
+ - type: map_at_1
952
+ value: 7.9159999999999995
953
+ - type: map_at_10
954
+ value: 13.769
955
+ - type: map_at_100
956
+ value: 15.447
957
+ - type: map_at_1000
958
+ value: 15.634
959
+ - type: map_at_3
960
+ value: 11.234
961
+ - type: map_at_5
962
+ value: 12.581999999999999
963
+ - type: ndcg_at_1
964
+ value: 17.72
965
+ - type: ndcg_at_10
966
+ value: 20.272000000000002
967
+ - type: ndcg_at_100
968
+ value: 27.748
969
+ - type: ndcg_at_1000
970
+ value: 31.457
971
+ - type: ndcg_at_3
972
+ value: 15.742
973
+ - type: ndcg_at_5
974
+ value: 17.494
975
+ - type: precision_at_1
976
+ value: 17.72
977
+ - type: precision_at_10
978
+ value: 6.554
979
+ - type: precision_at_100
980
+ value: 1.438
981
+ - type: precision_at_1000
982
+ value: 0.212
983
+ - type: precision_at_3
984
+ value: 11.705
985
+ - type: precision_at_5
986
+ value: 9.511
987
+ - type: recall_at_1
988
+ value: 7.9159999999999995
989
+ - type: recall_at_10
990
+ value: 25.389
991
+ - type: recall_at_100
992
+ value: 52.042
993
+ - type: recall_at_1000
994
+ value: 73.166
995
+ - type: recall_at_3
996
+ value: 14.585999999999999
997
+ - type: recall_at_5
998
+ value: 19.145
999
+ - task:
1000
+ type: Retrieval
1001
+ dataset:
1002
+ type: dbpedia-entity
1003
+ name: MTEB DBPedia
1004
+ config: default
1005
+ split: test
1006
+ metrics:
1007
+ - type: map_at_1
1008
+ value: 7.172000000000001
1009
+ - type: map_at_10
1010
+ value: 14.507
1011
+ - type: map_at_100
1012
+ value: 20.094
1013
+ - type: map_at_1000
1014
+ value: 21.357
1015
+ - type: map_at_3
1016
+ value: 10.45
1017
+ - type: map_at_5
1018
+ value: 12.135
1019
+ - type: ndcg_at_1
1020
+ value: 42.375
1021
+ - type: ndcg_at_10
1022
+ value: 32.33
1023
+ - type: ndcg_at_100
1024
+ value: 36.370000000000005
1025
+ - type: ndcg_at_1000
1026
+ value: 43.596000000000004
1027
+ - type: ndcg_at_3
1028
+ value: 35.006
1029
+ - type: ndcg_at_5
1030
+ value: 33.35
1031
+ - type: precision_at_1
1032
+ value: 54.50000000000001
1033
+ - type: precision_at_10
1034
+ value: 26.424999999999997
1035
+ - type: precision_at_100
1036
+ value: 8.24
1037
+ - type: precision_at_1000
1038
+ value: 1.765
1039
+ - type: precision_at_3
1040
+ value: 38.667
1041
+ - type: precision_at_5
1042
+ value: 33.0
1043
+ - type: recall_at_1
1044
+ value: 7.172000000000001
1045
+ - type: recall_at_10
1046
+ value: 19.922
1047
+ - type: recall_at_100
1048
+ value: 43.273
1049
+ - type: recall_at_1000
1050
+ value: 67.157
1051
+ - type: recall_at_3
1052
+ value: 11.521
1053
+ - type: recall_at_5
1054
+ value: 14.667
1055
+ - task:
1056
+ type: Classification
1057
+ dataset:
1058
+ type: mteb/emotion
1059
+ name: MTEB EmotionClassification
1060
+ config: default
1061
+ split: test
1062
+ metrics:
1063
+ - type: accuracy
1064
+ value: 38.43
1065
+ - type: f1
1066
+ value: 35.26220518237799
1067
+ - task:
1068
+ type: Retrieval
1069
+ dataset:
1070
+ type: fever
1071
+ name: MTEB FEVER
1072
+ config: default
1073
+ split: test
1074
+ metrics:
1075
+ - type: map_at_1
1076
+ value: 34.076
1077
+ - type: map_at_10
1078
+ value: 45.482
1079
+ - type: map_at_100
1080
+ value: 46.269
1081
+ - type: map_at_1000
1082
+ value: 46.309
1083
+ - type: map_at_3
1084
+ value: 42.614000000000004
1085
+ - type: map_at_5
1086
+ value: 44.314
1087
+ - type: ndcg_at_1
1088
+ value: 36.529
1089
+ - type: ndcg_at_10
1090
+ value: 51.934000000000005
1091
+ - type: ndcg_at_100
1092
+ value: 55.525000000000006
1093
+ - type: ndcg_at_1000
1094
+ value: 56.568
1095
+ - type: ndcg_at_3
1096
+ value: 46.169
1097
+ - type: ndcg_at_5
1098
+ value: 49.163000000000004
1099
+ - type: precision_at_1
1100
+ value: 36.529
1101
+ - type: precision_at_10
1102
+ value: 7.5649999999999995
1103
+ - type: precision_at_100
1104
+ value: 0.947
1105
+ - type: precision_at_1000
1106
+ value: 0.105
1107
+ - type: precision_at_3
1108
+ value: 19.326999999999998
1109
+ - type: precision_at_5
1110
+ value: 13.239999999999998
1111
+ - type: recall_at_1
1112
+ value: 34.076
1113
+ - type: recall_at_10
1114
+ value: 69.009
1115
+ - type: recall_at_100
1116
+ value: 85.047
1117
+ - type: recall_at_1000
1118
+ value: 92.962
1119
+ - type: recall_at_3
1120
+ value: 53.446000000000005
1121
+ - type: recall_at_5
1122
+ value: 60.622
1123
+ - task:
1124
+ type: Retrieval
1125
+ dataset:
1126
+ type: fiqa
1127
+ name: MTEB FiQA2018
1128
+ config: default
1129
+ split: test
1130
+ metrics:
1131
+ - type: map_at_1
1132
+ value: 17.14
1133
+ - type: map_at_10
1134
+ value: 29.141000000000002
1135
+ - type: map_at_100
1136
+ value: 30.956
1137
+ - type: map_at_1000
1138
+ value: 31.159
1139
+ - type: map_at_3
1140
+ value: 25.188
1141
+ - type: map_at_5
1142
+ value: 27.506999999999998
1143
+ - type: ndcg_at_1
1144
+ value: 34.721999999999994
1145
+ - type: ndcg_at_10
1146
+ value: 36.867
1147
+ - type: ndcg_at_100
1148
+ value: 43.931
1149
+ - type: ndcg_at_1000
1150
+ value: 47.276
1151
+ - type: ndcg_at_3
1152
+ value: 33.18
1153
+ - type: ndcg_at_5
1154
+ value: 34.504000000000005
1155
+ - type: precision_at_1
1156
+ value: 34.721999999999994
1157
+ - type: precision_at_10
1158
+ value: 10.448
1159
+ - type: precision_at_100
1160
+ value: 1.778
1161
+ - type: precision_at_1000
1162
+ value: 0.23600000000000002
1163
+ - type: precision_at_3
1164
+ value: 22.377
1165
+ - type: precision_at_5
1166
+ value: 16.759
1167
+ - type: recall_at_1
1168
+ value: 17.14
1169
+ - type: recall_at_10
1170
+ value: 44.131
1171
+ - type: recall_at_100
1172
+ value: 70.60600000000001
1173
+ - type: recall_at_1000
1174
+ value: 90.672
1175
+ - type: recall_at_3
1176
+ value: 30.536
1177
+ - type: recall_at_5
1178
+ value: 36.706
1179
+ - task:
1180
+ type: Retrieval
1181
+ dataset:
1182
+ type: hotpotqa
1183
+ name: MTEB HotpotQA
1184
+ config: default
1185
+ split: test
1186
+ metrics:
1187
+ - type: map_at_1
1188
+ value: 27.717999999999996
1189
+ - type: map_at_10
1190
+ value: 37.63
1191
+ - type: map_at_100
1192
+ value: 38.534
1193
+ - type: map_at_1000
1194
+ value: 38.619
1195
+ - type: map_at_3
1196
+ value: 35.197
1197
+ - type: map_at_5
1198
+ value: 36.592999999999996
1199
+ - type: ndcg_at_1
1200
+ value: 55.43599999999999
1201
+ - type: ndcg_at_10
1202
+ value: 46.513
1203
+ - type: ndcg_at_100
1204
+ value: 50.21
1205
+ - type: ndcg_at_1000
1206
+ value: 52.049
1207
+ - type: ndcg_at_3
1208
+ value: 42.333999999999996
1209
+ - type: ndcg_at_5
1210
+ value: 44.45
1211
+ - type: precision_at_1
1212
+ value: 55.43599999999999
1213
+ - type: precision_at_10
1214
+ value: 9.741
1215
+ - type: precision_at_100
1216
+ value: 1.2670000000000001
1217
+ - type: precision_at_1000
1218
+ value: 0.151
1219
+ - type: precision_at_3
1220
+ value: 26.194
1221
+ - type: precision_at_5
1222
+ value: 17.396
1223
+ - type: recall_at_1
1224
+ value: 27.717999999999996
1225
+ - type: recall_at_10
1226
+ value: 48.704
1227
+ - type: recall_at_100
1228
+ value: 63.363
1229
+ - type: recall_at_1000
1230
+ value: 75.564
1231
+ - type: recall_at_3
1232
+ value: 39.291
1233
+ - type: recall_at_5
1234
+ value: 43.491
1235
+ - task:
1236
+ type: Classification
1237
+ dataset:
1238
+ type: mteb/imdb
1239
+ name: MTEB ImdbClassification
1240
+ config: default
1241
+ split: test
1242
+ metrics:
1243
+ - type: accuracy
1244
+ value: 60.6612
1245
+ - type: ap
1246
+ value: 56.73559487964456
1247
+ - type: f1
1248
+ value: 60.39970244353384
1249
+ - task:
1250
+ type: Retrieval
1251
+ dataset:
1252
+ type: msmarco
1253
+ name: MTEB MSMARCO
1254
+ config: default
1255
+ split: dev
1256
+ metrics:
1257
+ - type: map_at_1
1258
+ value: 18.715
1259
+ - type: map_at_10
1260
+ value: 30.014999999999997
1261
+ - type: map_at_100
1262
+ value: 31.208999999999996
1263
+ - type: map_at_1000
1264
+ value: 31.269999999999996
1265
+ - type: map_at_3
1266
+ value: 26.299
1267
+ - type: map_at_5
1268
+ value: 28.408
1269
+ - type: ndcg_at_1
1270
+ value: 19.255
1271
+ - type: ndcg_at_10
1272
+ value: 36.542
1273
+ - type: ndcg_at_100
1274
+ value: 42.471
1275
+ - type: ndcg_at_1000
1276
+ value: 44.022
1277
+ - type: ndcg_at_3
1278
+ value: 28.921000000000003
1279
+ - type: ndcg_at_5
1280
+ value: 32.676
1281
+ - type: precision_at_1
1282
+ value: 19.255
1283
+ - type: precision_at_10
1284
+ value: 5.91
1285
+ - type: precision_at_100
1286
+ value: 0.8920000000000001
1287
+ - type: precision_at_1000
1288
+ value: 0.10200000000000001
1289
+ - type: precision_at_3
1290
+ value: 12.388
1291
+ - type: precision_at_5
1292
+ value: 9.33
1293
+ - type: recall_at_1
1294
+ value: 18.715
1295
+ - type: recall_at_10
1296
+ value: 56.76
1297
+ - type: recall_at_100
1298
+ value: 84.481
1299
+ - type: recall_at_1000
1300
+ value: 96.44
1301
+ - type: recall_at_3
1302
+ value: 35.942
1303
+ - type: recall_at_5
1304
+ value: 44.926
1305
+ - task:
1306
+ type: Classification
1307
+ dataset:
1308
+ type: mteb/mtop_domain
1309
+ name: MTEB MTOPDomainClassification (en)
1310
+ config: en
1311
+ split: test
1312
+ metrics:
1313
+ - type: accuracy
1314
+ value: 91.56178750569997
1315
+ - type: f1
1316
+ value: 91.02309252160694
1317
+ - task:
1318
+ type: Classification
1319
+ dataset:
1320
+ type: mteb/mtop_intent
1321
+ name: MTEB MTOPIntentClassification (en)
1322
+ config: en
1323
+ split: test
1324
+ metrics:
1325
+ - type: accuracy
1326
+ value: 62.18194254445966
1327
+ - type: f1
1328
+ value: 43.090624769020444
1329
+ - task:
1330
+ type: Classification
1331
+ dataset:
1332
+ type: mteb/amazon_massive_intent
1333
+ name: MTEB MassiveIntentClassification (en)
1334
+ config: en
1335
+ split: test
1336
+ metrics:
1337
+ - type: accuracy
1338
+ value: 67.404169468729
1339
+ - type: f1
1340
+ value: 64.82901615433794
1341
+ - task:
1342
+ type: Classification
1343
+ dataset:
1344
+ type: mteb/amazon_massive_scenario
1345
+ name: MTEB MassiveScenarioClassification (en)
1346
+ config: en
1347
+ split: test
1348
+ metrics:
1349
+ - type: accuracy
1350
+ value: 75.75655682582381
1351
+ - type: f1
1352
+ value: 74.93126114560368
1353
+ - task:
1354
+ type: Clustering
1355
+ dataset:
1356
+ type: mteb/medrxiv-clustering-p2p
1357
+ name: MTEB MedrxivClusteringP2P
1358
+ config: default
1359
+ split: test
1360
+ metrics:
1361
+ - type: v_measure
1362
+ value: 34.40873490143895
1363
+ - task:
1364
+ type: Clustering
1365
+ dataset:
1366
+ type: mteb/medrxiv-clustering-s2s
1367
+ name: MTEB MedrxivClusteringS2S
1368
+ config: default
1369
+ split: test
1370
+ metrics:
1371
+ - type: v_measure
1372
+ value: 32.292207500530914
1373
+ - task:
1374
+ type: Reranking
1375
+ dataset:
1376
+ type: mteb/mind_small
1377
+ name: MTEB MindSmallReranking
1378
+ config: default
1379
+ split: test
1380
+ metrics:
1381
+ - type: map
1382
+ value: 30.798042020200267
1383
+ - type: mrr
1384
+ value: 31.803264263405513
1385
+ - task:
1386
+ type: Retrieval
1387
+ dataset:
1388
+ type: nfcorpus
1389
+ name: MTEB NFCorpus
1390
+ config: default
1391
+ split: test
1392
+ metrics:
1393
+ - type: map_at_1
1394
+ value: 4.3229999999999995
1395
+ - type: map_at_10
1396
+ value: 11.048
1397
+ - type: map_at_100
1398
+ value: 14.244000000000002
1399
+ - type: map_at_1000
1400
+ value: 15.684000000000001
1401
+ - type: map_at_3
1402
+ value: 7.7219999999999995
1403
+ - type: map_at_5
1404
+ value: 9.231
1405
+ - type: ndcg_at_1
1406
+ value: 39.474
1407
+ - type: ndcg_at_10
1408
+ value: 31.594
1409
+ - type: ndcg_at_100
1410
+ value: 29.455
1411
+ - type: ndcg_at_1000
1412
+ value: 38.283
1413
+ - type: ndcg_at_3
1414
+ value: 36.355
1415
+ - type: ndcg_at_5
1416
+ value: 34.164
1417
+ - type: precision_at_1
1418
+ value: 41.486000000000004
1419
+ - type: precision_at_10
1420
+ value: 24.334
1421
+ - type: precision_at_100
1422
+ value: 7.981000000000001
1423
+ - type: precision_at_1000
1424
+ value: 2.096
1425
+ - type: precision_at_3
1426
+ value: 34.881
1427
+ - type: precision_at_5
1428
+ value: 30.279
1429
+ - type: recall_at_1
1430
+ value: 4.3229999999999995
1431
+ - type: recall_at_10
1432
+ value: 15.498999999999999
1433
+ - type: recall_at_100
1434
+ value: 31.151
1435
+ - type: recall_at_1000
1436
+ value: 63.211
1437
+ - type: recall_at_3
1438
+ value: 9.053
1439
+ - type: recall_at_5
1440
+ value: 11.959
1441
+ - task:
1442
+ type: Retrieval
1443
+ dataset:
1444
+ type: nq
1445
+ name: MTEB NQ
1446
+ config: default
1447
+ split: test
1448
+ metrics:
1449
+ - type: map_at_1
1450
+ value: 22.644000000000002
1451
+ - type: map_at_10
1452
+ value: 36.335
1453
+ - type: map_at_100
1454
+ value: 37.687
1455
+ - type: map_at_1000
1456
+ value: 37.733
1457
+ - type: map_at_3
1458
+ value: 31.928
1459
+ - type: map_at_5
1460
+ value: 34.586
1461
+ - type: ndcg_at_1
1462
+ value: 25.607999999999997
1463
+ - type: ndcg_at_10
1464
+ value: 43.869
1465
+ - type: ndcg_at_100
1466
+ value: 49.730000000000004
1467
+ - type: ndcg_at_1000
1468
+ value: 50.749
1469
+ - type: ndcg_at_3
1470
+ value: 35.418
1471
+ - type: ndcg_at_5
1472
+ value: 39.961999999999996
1473
+ - type: precision_at_1
1474
+ value: 25.607999999999997
1475
+ - type: precision_at_10
1476
+ value: 7.697
1477
+ - type: precision_at_100
1478
+ value: 1.093
1479
+ - type: precision_at_1000
1480
+ value: 0.11900000000000001
1481
+ - type: precision_at_3
1482
+ value: 16.522000000000002
1483
+ - type: precision_at_5
1484
+ value: 12.486
1485
+ - type: recall_at_1
1486
+ value: 22.644000000000002
1487
+ - type: recall_at_10
1488
+ value: 64.711
1489
+ - type: recall_at_100
1490
+ value: 90.32900000000001
1491
+ - type: recall_at_1000
1492
+ value: 97.82
1493
+ - type: recall_at_3
1494
+ value: 42.754999999999995
1495
+ - type: recall_at_5
1496
+ value: 53.37
1497
+ - task:
1498
+ type: Retrieval
1499
+ dataset:
1500
+ type: quora
1501
+ name: MTEB QuoraRetrieval
1502
+ config: default
1503
+ split: test
1504
+ metrics:
1505
+ - type: map_at_1
1506
+ value: 69.76
1507
+ - type: map_at_10
1508
+ value: 83.64200000000001
1509
+ - type: map_at_100
1510
+ value: 84.312
1511
+ - type: map_at_1000
1512
+ value: 84.329
1513
+ - type: map_at_3
1514
+ value: 80.537
1515
+ - type: map_at_5
1516
+ value: 82.494
1517
+ - type: ndcg_at_1
1518
+ value: 80.41
1519
+ - type: ndcg_at_10
1520
+ value: 87.556
1521
+ - type: ndcg_at_100
1522
+ value: 88.847
1523
+ - type: ndcg_at_1000
1524
+ value: 88.959
1525
+ - type: ndcg_at_3
1526
+ value: 84.466
1527
+ - type: ndcg_at_5
1528
+ value: 86.193
1529
+ - type: precision_at_1
1530
+ value: 80.41
1531
+ - type: precision_at_10
1532
+ value: 13.374
1533
+ - type: precision_at_100
1534
+ value: 1.529
1535
+ - type: precision_at_1000
1536
+ value: 0.157
1537
+ - type: precision_at_3
1538
+ value: 36.953
1539
+ - type: precision_at_5
1540
+ value: 24.401999999999997
1541
+ - type: recall_at_1
1542
+ value: 69.76
1543
+ - type: recall_at_10
1544
+ value: 95.029
1545
+ - type: recall_at_100
1546
+ value: 99.44
1547
+ - type: recall_at_1000
1548
+ value: 99.979
1549
+ - type: recall_at_3
1550
+ value: 86.215
1551
+ - type: recall_at_5
1552
+ value: 91.03999999999999
1553
+ - task:
1554
+ type: Clustering
1555
+ dataset:
1556
+ type: mteb/reddit-clustering
1557
+ name: MTEB RedditClustering
1558
+ config: default
1559
+ split: test
1560
+ metrics:
1561
+ - type: v_measure
1562
+ value: 50.66969274980475
1563
+ - task:
1564
+ type: Clustering
1565
+ dataset:
1566
+ type: mteb/reddit-clustering-p2p
1567
+ name: MTEB RedditClusteringP2P
1568
+ config: default
1569
+ split: test
1570
+ metrics:
1571
+ - type: v_measure
1572
+ value: 54.15176409632201
1573
+ - task:
1574
+ type: Retrieval
1575
+ dataset:
1576
+ type: scidocs
1577
+ name: MTEB SCIDOCS
1578
+ config: default
1579
+ split: test
1580
+ metrics:
1581
+ - type: map_at_1
1582
+ value: 4.853
1583
+ - type: map_at_10
1584
+ value: 12.937999999999999
1585
+ - type: map_at_100
1586
+ value: 15.588
1587
+ - type: map_at_1000
1588
+ value: 15.939
1589
+ - type: map_at_3
1590
+ value: 9.135
1591
+ - type: map_at_5
1592
+ value: 11.004
1593
+ - type: ndcg_at_1
1594
+ value: 24.0
1595
+ - type: ndcg_at_10
1596
+ value: 21.641
1597
+ - type: ndcg_at_100
1598
+ value: 31.212
1599
+ - type: ndcg_at_1000
1600
+ value: 36.854
1601
+ - type: ndcg_at_3
1602
+ value: 20.284
1603
+ - type: ndcg_at_5
1604
+ value: 17.737
1605
+ - type: precision_at_1
1606
+ value: 24.0
1607
+ - type: precision_at_10
1608
+ value: 11.4
1609
+ - type: precision_at_100
1610
+ value: 2.516
1611
+ - type: precision_at_1000
1612
+ value: 0.387
1613
+ - type: precision_at_3
1614
+ value: 19.167
1615
+ - type: precision_at_5
1616
+ value: 15.72
1617
+ - type: recall_at_1
1618
+ value: 4.853
1619
+ - type: recall_at_10
1620
+ value: 23.087
1621
+ - type: recall_at_100
1622
+ value: 51.012
1623
+ - type: recall_at_1000
1624
+ value: 78.49000000000001
1625
+ - type: recall_at_3
1626
+ value: 11.658
1627
+ - type: recall_at_5
1628
+ value: 15.923000000000002
1629
+ - task:
1630
+ type: STS
1631
+ dataset:
1632
+ type: mteb/sickr-sts
1633
+ name: MTEB SICK-R
1634
+ config: default
1635
+ split: test
1636
+ metrics:
1637
+ - type: cos_sim_pearson
1638
+ value: 83.91595834747078
1639
+ - type: cos_sim_spearman
1640
+ value: 77.58245130495686
1641
+ - type: euclidean_pearson
1642
+ value: 80.77605511224702
1643
+ - type: euclidean_spearman
1644
+ value: 77.58244681255565
1645
+ - type: manhattan_pearson
1646
+ value: 80.70675261518134
1647
+ - type: manhattan_spearman
1648
+ value: 77.48238642250558
1649
+ - task:
1650
+ type: STS
1651
+ dataset:
1652
+ type: mteb/sts12-sts
1653
+ name: MTEB STS12
1654
+ config: default
1655
+ split: test
1656
+ metrics:
1657
+ - type: cos_sim_pearson
1658
+ value: 81.35998585185463
1659
+ - type: cos_sim_spearman
1660
+ value: 72.36900735029991
1661
+ - type: euclidean_pearson
1662
+ value: 77.44425972881783
1663
+ - type: euclidean_spearman
1664
+ value: 72.36900735029991
1665
+ - type: manhattan_pearson
1666
+ value: 77.48268272405316
1667
+ - type: manhattan_spearman
1668
+ value: 72.36650357806357
1669
+ - task:
1670
+ type: STS
1671
+ dataset:
1672
+ type: mteb/sts13-sts
1673
+ name: MTEB STS13
1674
+ config: default
1675
+ split: test
1676
+ metrics:
1677
+ - type: cos_sim_pearson
1678
+ value: 80.15192226911441
1679
+ - type: cos_sim_spearman
1680
+ value: 80.60316722220763
1681
+ - type: euclidean_pearson
1682
+ value: 79.9515074804673
1683
+ - type: euclidean_spearman
1684
+ value: 80.60316715056034
1685
+ - type: manhattan_pearson
1686
+ value: 80.01037050043855
1687
+ - type: manhattan_spearman
1688
+ value: 80.70244228209006
1689
+ - task:
1690
+ type: STS
1691
+ dataset:
1692
+ type: mteb/sts14-sts
1693
+ name: MTEB STS14
1694
+ config: default
1695
+ split: test
1696
+ metrics:
1697
+ - type: cos_sim_pearson
1698
+ value: 80.80137749134273
1699
+ - type: cos_sim_spearman
1700
+ value: 75.58912800301661
1701
+ - type: euclidean_pearson
1702
+ value: 78.89739732785547
1703
+ - type: euclidean_spearman
1704
+ value: 75.58912800301661
1705
+ - type: manhattan_pearson
1706
+ value: 78.88130916509184
1707
+ - type: manhattan_spearman
1708
+ value: 75.56512617108156
1709
+ - task:
1710
+ type: STS
1711
+ dataset:
1712
+ type: mteb/sts15-sts
1713
+ name: MTEB STS15
1714
+ config: default
1715
+ split: test
1716
+ metrics:
1717
+ - type: cos_sim_pearson
1718
+ value: 84.73605558012511
1719
+ - type: cos_sim_spearman
1720
+ value: 85.38966051883823
1721
+ - type: euclidean_pearson
1722
+ value: 84.65792305262497
1723
+ - type: euclidean_spearman
1724
+ value: 85.38965068015148
1725
+ - type: manhattan_pearson
1726
+ value: 84.6284531553976
1727
+ - type: manhattan_spearman
1728
+ value: 85.36525580485275
1729
+ - task:
1730
+ type: STS
1731
+ dataset:
1732
+ type: mteb/sts16-sts
1733
+ name: MTEB STS16
1734
+ config: default
1735
+ split: test
1736
+ metrics:
1737
+ - type: cos_sim_pearson
1738
+ value: 77.93667023468089
1739
+ - type: cos_sim_spearman
1740
+ value: 78.98945343973261
1741
+ - type: euclidean_pearson
1742
+ value: 78.55627105899589
1743
+ - type: euclidean_spearman
1744
+ value: 78.98945343973261
1745
+ - type: manhattan_pearson
1746
+ value: 78.47171138630095
1747
+ - type: manhattan_spearman
1748
+ value: 78.90029153062082
1749
+ - task:
1750
+ type: STS
1751
+ dataset:
1752
+ type: mteb/sts17-crosslingual-sts
1753
+ name: MTEB STS17 (ko-ko)
1754
+ config: ko-ko
1755
+ split: test
1756
+ metrics:
1757
+ - type: cos_sim_pearson
1758
+ value: 38.02556869388448
1759
+ - type: cos_sim_spearman
1760
+ value: 43.39452386216687
1761
+ - type: euclidean_pearson
1762
+ value: 42.85346056221848
1763
+ - type: euclidean_spearman
1764
+ value: 43.39454482701475
1765
+ - type: manhattan_pearson
1766
+ value: 42.80255086270408
1767
+ - type: manhattan_spearman
1768
+ value: 43.35745739810561
1769
+ - task:
1770
+ type: STS
1771
+ dataset:
1772
+ type: mteb/sts17-crosslingual-sts
1773
+ name: MTEB STS17 (ar-ar)
1774
+ config: ar-ar
1775
+ split: test
1776
+ metrics:
1777
+ - type: cos_sim_pearson
1778
+ value: 50.19733275252325
1779
+ - type: cos_sim_spearman
1780
+ value: 50.892912699226166
1781
+ - type: euclidean_pearson
1782
+ value: 53.38352259940662
1783
+ - type: euclidean_spearman
1784
+ value: 50.892912699226166
1785
+ - type: manhattan_pearson
1786
+ value: 53.48429031763742
1787
+ - type: manhattan_spearman
1788
+ value: 50.961509277559394
1789
+ - task:
1790
+ type: STS
1791
+ dataset:
1792
+ type: mteb/sts17-crosslingual-sts
1793
+ name: MTEB STS17 (en-ar)
1794
+ config: en-ar
1795
+ split: test
1796
+ metrics:
1797
+ - type: cos_sim_pearson
1798
+ value: -5.346248828225636
1799
+ - type: cos_sim_spearman
1800
+ value: -4.276245759627542
1801
+ - type: euclidean_pearson
1802
+ value: -5.34997238478067
1803
+ - type: euclidean_spearman
1804
+ value: -4.276245759627542
1805
+ - type: manhattan_pearson
1806
+ value: -1.599674226848396
1807
+ - type: manhattan_spearman
1808
+ value: -0.6972996366546237
1809
+ - task:
1810
+ type: STS
1811
+ dataset:
1812
+ type: mteb/sts17-crosslingual-sts
1813
+ name: MTEB STS17 (en-de)
1814
+ config: en-de
1815
+ split: test
1816
+ metrics:
1817
+ - type: cos_sim_pearson
1818
+ value: 37.0025013483991
1819
+ - type: cos_sim_spearman
1820
+ value: 35.81883942216964
1821
+ - type: euclidean_pearson
1822
+ value: 36.69612954510884
1823
+ - type: euclidean_spearman
1824
+ value: 35.81883942216964
1825
+ - type: manhattan_pearson
1826
+ value: 35.141229073611555
1827
+ - type: manhattan_spearman
1828
+ value: 32.04594883372404
1829
+ - task:
1830
+ type: STS
1831
+ dataset:
1832
+ type: mteb/sts17-crosslingual-sts
1833
+ name: MTEB STS17 (en-en)
1834
+ config: en-en
1835
+ split: test
1836
+ metrics:
1837
+ - type: cos_sim_pearson
1838
+ value: 88.02366672243191
1839
+ - type: cos_sim_spearman
1840
+ value: 87.58779089494524
1841
+ - type: euclidean_pearson
1842
+ value: 87.99011173645361
1843
+ - type: euclidean_spearman
1844
+ value: 87.58779089494524
1845
+ - type: manhattan_pearson
1846
+ value: 87.71266341564564
1847
+ - type: manhattan_spearman
1848
+ value: 87.24437101621581
1849
+ - task:
1850
+ type: STS
1851
+ dataset:
1852
+ type: mteb/sts17-crosslingual-sts
1853
+ name: MTEB STS17 (en-tr)
1854
+ config: en-tr
1855
+ split: test
1856
+ metrics:
1857
+ - type: cos_sim_pearson
1858
+ value: 6.928208810824121
1859
+ - type: cos_sim_spearman
1860
+ value: 4.496540073637865
1861
+ - type: euclidean_pearson
1862
+ value: 7.258004484570359
1863
+ - type: euclidean_spearman
1864
+ value: 4.496540073637865
1865
+ - type: manhattan_pearson
1866
+ value: 4.294687250993676
1867
+ - type: manhattan_spearman
1868
+ value: 2.517822531443102
1869
+ - task:
1870
+ type: STS
1871
+ dataset:
1872
+ type: mteb/sts17-crosslingual-sts
1873
+ name: MTEB STS17 (es-en)
1874
+ config: es-en
1875
+ split: test
1876
+ metrics:
1877
+ - type: cos_sim_pearson
1878
+ value: 17.49363358339176
1879
+ - type: cos_sim_spearman
1880
+ value: 16.31316318682868
1881
+ - type: euclidean_pearson
1882
+ value: 17.834234153786475
1883
+ - type: euclidean_spearman
1884
+ value: 16.31316318682868
1885
+ - type: manhattan_pearson
1886
+ value: 16.928139101229352
1887
+ - type: manhattan_spearman
1888
+ value: 15.00071366769135
1889
+ - task:
1890
+ type: STS
1891
+ dataset:
1892
+ type: mteb/sts17-crosslingual-sts
1893
+ name: MTEB STS17 (es-es)
1894
+ config: es-es
1895
+ split: test
1896
+ metrics:
1897
+ - type: cos_sim_pearson
1898
+ value: 77.04145671005833
1899
+ - type: cos_sim_spearman
1900
+ value: 76.11599994398748
1901
+ - type: euclidean_pearson
1902
+ value: 78.21801117699432
1903
+ - type: euclidean_spearman
1904
+ value: 76.11599994398748
1905
+ - type: manhattan_pearson
1906
+ value: 77.87062358292948
1907
+ - type: manhattan_spearman
1908
+ value: 75.64561332109221
1909
+ - task:
1910
+ type: STS
1911
+ dataset:
1912
+ type: mteb/sts17-crosslingual-sts
+ name: MTEB STS17 (fr-en)
+ config: fr-en
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 37.9961687967439
+ - type: cos_sim_spearman
+ value: 37.09338306656542
+ - type: euclidean_pearson
+ value: 37.81002317191932
+ - type: euclidean_spearman
+ value: 37.09338306656542
+ - type: manhattan_pearson
+ value: 37.58237523973875
+ - type: manhattan_spearman
+ value: 36.52020936925911
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts17-crosslingual-sts
+ name: MTEB STS17 (it-en)
+ config: it-en
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 26.739991134614716
+ - type: cos_sim_spearman
+ value: 24.4457755448559
+ - type: euclidean_pearson
+ value: 26.804935356831862
+ - type: euclidean_spearman
+ value: 24.442532087041023
+ - type: manhattan_pearson
+ value: 27.571123840765026
+ - type: manhattan_spearman
+ value: 25.554721155049045
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts17-crosslingual-sts
+ name: MTEB STS17 (nl-en)
+ config: nl-en
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 32.71761762628939
+ - type: cos_sim_spearman
+ value: 28.99879893370601
+ - type: euclidean_pearson
+ value: 32.92831060810701
+ - type: euclidean_spearman
+ value: 28.99879893370601
+ - type: manhattan_pearson
+ value: 33.30410551798337
+ - type: manhattan_spearman
+ value: 29.442853829506593
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (en)
+ config: en
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 67.09882753030891
+ - type: cos_sim_spearman
+ value: 67.21465212910987
+ - type: euclidean_pearson
+ value: 68.21374069918403
+ - type: euclidean_spearman
+ value: 67.21465212910987
+ - type: manhattan_pearson
+ value: 68.41388868877884
+ - type: manhattan_spearman
+ value: 67.83615682571867
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (de)
+ config: de
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 26.596033966146116
+ - type: cos_sim_spearman
+ value: 31.044353994772354
+ - type: euclidean_pearson
+ value: 21.51728902500591
+ - type: euclidean_spearman
+ value: 31.044353994772354
+ - type: manhattan_pearson
+ value: 21.718468273577894
+ - type: manhattan_spearman
+ value: 31.197915595597696
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (es)
+ config: es
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 44.33815143022264
+ - type: cos_sim_spearman
+ value: 54.77772552456677
+ - type: euclidean_pearson
+ value: 48.483578263920634
+ - type: euclidean_spearman
+ value: 54.77772552456677
+ - type: manhattan_pearson
+ value: 49.29424073081744
+ - type: manhattan_spearman
+ value: 55.259696552690954
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (pl)
+ config: pl
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 8.000336595206134
+ - type: cos_sim_spearman
+ value: 26.768906191975933
+ - type: euclidean_pearson
+ value: 1.4181188576056134
+ - type: euclidean_spearman
+ value: 26.768906191975933
+ - type: manhattan_pearson
+ value: 1.588769366202155
+ - type: manhattan_spearman
+ value: 26.76300987426348
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (tr)
+ config: tr
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 20.597902459466386
+ - type: cos_sim_spearman
+ value: 33.694510807738595
+ - type: euclidean_pearson
+ value: 26.964862787540962
+ - type: euclidean_spearman
+ value: 33.694510807738595
+ - type: manhattan_pearson
+ value: 27.530294926210807
+ - type: manhattan_spearman
+ value: 33.74254435313719
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (ar)
+ config: ar
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 5.006610360999117
+ - type: cos_sim_spearman
+ value: 22.63866797712348
+ - type: euclidean_pearson
+ value: 13.082283087945362
+ - type: euclidean_spearman
+ value: 22.63866797712348
+ - type: manhattan_pearson
+ value: 13.260328120447722
+ - type: manhattan_spearman
+ value: 22.340169287120716
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (ru)
+ config: ru
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 0.03100716792233671
+ - type: cos_sim_spearman
+ value: 14.721380413194854
+ - type: euclidean_pearson
+ value: 4.871526064730011
+ - type: euclidean_spearman
+ value: 14.721380413194854
+ - type: manhattan_pearson
+ value: 5.7576102223040735
+ - type: manhattan_spearman
+ value: 15.08182690716095
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (zh)
+ config: zh
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 23.127885111414432
+ - type: cos_sim_spearman
+ value: 44.92964024177277
+ - type: euclidean_pearson
+ value: 31.061639313469925
+ - type: euclidean_spearman
+ value: 44.92964024177277
+ - type: manhattan_pearson
+ value: 31.77656358573927
+ - type: manhattan_spearman
+ value: 44.964763982886375
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (fr)
+ config: fr
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 70.64344773137496
+ - type: cos_sim_spearman
+ value: 77.00398643056744
+ - type: euclidean_pearson
+ value: 71.58320199923101
+ - type: euclidean_spearman
+ value: 77.00398643056744
+ - type: manhattan_pearson
+ value: 71.64373853764818
+ - type: manhattan_spearman
+ value: 76.71158725879226
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (de-en)
+ config: de-en
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 47.54531236654512
+ - type: cos_sim_spearman
+ value: 44.038685024247606
+ - type: euclidean_pearson
+ value: 48.46975590869453
+ - type: euclidean_spearman
+ value: 44.038685024247606
+ - type: manhattan_pearson
+ value: 48.10217367438755
+ - type: manhattan_spearman
+ value: 44.4428504653391
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (es-en)
+ config: es-en
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 49.93601240112664
+ - type: cos_sim_spearman
+ value: 53.41895837272506
+ - type: euclidean_pearson
+ value: 50.16469746986203
+ - type: euclidean_spearman
+ value: 53.41895837272506
+ - type: manhattan_pearson
+ value: 49.86265183075983
+ - type: manhattan_spearman
+ value: 53.10065931046005
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (it)
+ config: it
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 57.4312835830767
+ - type: cos_sim_spearman
+ value: 60.39610834515271
+ - type: euclidean_pearson
+ value: 57.81507077373551
+ - type: euclidean_spearman
+ value: 60.39610834515271
+ - type: manhattan_pearson
+ value: 57.83823485037898
+ - type: manhattan_spearman
+ value: 60.374938260317535
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (pl-en)
+ config: pl-en
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 35.08730015173829
+ - type: cos_sim_spearman
+ value: 32.79791295777814
+ - type: euclidean_pearson
+ value: 34.54132550386404
+ - type: euclidean_spearman
+ value: 32.79791295777814
+ - type: manhattan_pearson
+ value: 36.273935331272256
+ - type: manhattan_spearman
+ value: 35.88704294252439
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (zh-en)
+ config: zh-en
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 37.41111741585122
+ - type: cos_sim_spearman
+ value: 41.64399741744448
+ - type: euclidean_pearson
+ value: 36.83160927711053
+ - type: euclidean_spearman
+ value: 41.64399741744448
+ - type: manhattan_pearson
+ value: 35.71015224548175
+ - type: manhattan_spearman
+ value: 41.460551673456045
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (es-it)
+ config: es-it
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 42.568537775842245
+ - type: cos_sim_spearman
+ value: 44.2699366594503
+ - type: euclidean_pearson
+ value: 43.569828137034264
+ - type: euclidean_spearman
+ value: 44.2699366594503
+ - type: manhattan_pearson
+ value: 43.954212787242284
+ - type: manhattan_spearman
+ value: 44.32159550471527
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (de-fr)
+ config: de-fr
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 26.472844763068938
+ - type: cos_sim_spearman
+ value: 30.067587482078228
+ - type: euclidean_pearson
+ value: 26.87230792075073
+ - type: euclidean_spearman
+ value: 30.067587482078228
+ - type: manhattan_pearson
+ value: 25.808959063835424
+ - type: manhattan_spearman
+ value: 27.996294873002153
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (de-pl)
+ config: de-pl
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 7.026566971631159
+ - type: cos_sim_spearman
+ value: 4.9270565599404135
+ - type: euclidean_pearson
+ value: 6.729027056926625
+ - type: euclidean_spearman
+ value: 4.9270565599404135
+ - type: manhattan_pearson
+ value: 9.01762174854638
+ - type: manhattan_spearman
+ value: 7.359790736410993
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (fr-pl)
+ config: fr-pl
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 54.305559003968206
+ - type: cos_sim_spearman
+ value: 50.709255283710995
+ - type: euclidean_pearson
+ value: 53.00660084455784
+ - type: euclidean_spearman
+ value: 50.709255283710995
+ - type: manhattan_pearson
+ value: 52.33784187543789
+ - type: manhattan_spearman
+ value: 50.709255283710995
+ - task:
+ type: STS
+ dataset:
+ type: mteb/stsbenchmark-sts
+ name: MTEB STSBenchmark
+ config: default
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 82.7406424090513
+ - type: cos_sim_spearman
+ value: 82.03246731235654
+ - type: euclidean_pearson
+ value: 82.55616747173353
+ - type: euclidean_spearman
+ value: 82.03246731235654
+ - type: manhattan_pearson
+ value: 82.49144455072748
+ - type: manhattan_spearman
+ value: 81.94552526855261
+ - task:
+ type: Reranking
+ dataset:
+ type: mteb/scidocs-reranking
+ name: MTEB SciDocsRR
+ config: default
+ split: test
+ metrics:
+ - type: map
+ value: 87.11941318470207
+ - type: mrr
+ value: 96.39370705547176
+ - task:
+ type: Retrieval
+ dataset:
+ type: scifact
+ name: MTEB SciFact
+ config: default
+ split: test
+ metrics:
+ - type: map_at_1
+ value: 48.233
+ - type: map_at_10
+ value: 59.592999999999996
+ - type: map_at_100
+ value: 60.307
+ - type: map_at_1000
+ value: 60.343
+ - type: map_at_3
+ value: 56.564
+ - type: map_at_5
+ value: 58.826
+ - type: ndcg_at_1
+ value: 50.333000000000006
+ - type: ndcg_at_10
+ value: 64.508
+ - type: ndcg_at_100
+ value: 67.66499999999999
+ - type: ndcg_at_1000
+ value: 68.552
+ - type: ndcg_at_3
+ value: 59.673
+ - type: ndcg_at_5
+ value: 62.928
+ - type: precision_at_1
+ value: 50.333000000000006
+ - type: precision_at_10
+ value: 8.833
+ - type: precision_at_100
+ value: 1.053
+ - type: precision_at_1000
+ value: 0.11199999999999999
+ - type: precision_at_3
+ value: 23.778
+ - type: precision_at_5
+ value: 16.400000000000002
+ - type: recall_at_1
+ value: 48.233
+ - type: recall_at_10
+ value: 78.333
+ - type: recall_at_100
+ value: 92.5
+ - type: recall_at_1000
+ value: 99.333
+ - type: recall_at_3
+ value: 66.033
+ - type: recall_at_5
+ value: 73.79400000000001
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/sprintduplicatequestions-pairclassification
+ name: MTEB SprintDuplicateQuestions
+ config: default
+ split: test
+ metrics:
+ - type: cos_sim_accuracy
+ value: 99.78514851485149
+ - type: cos_sim_ap
+ value: 94.55063045792446
+ - type: cos_sim_f1
+ value: 89.01265822784809
+ - type: cos_sim_precision
+ value: 90.15384615384615
+ - type: cos_sim_recall
+ value: 87.9
+ - type: dot_accuracy
+ value: 99.78514851485149
+ - type: dot_ap
+ value: 94.55063045792447
+ - type: dot_f1
+ value: 89.01265822784809
+ - type: dot_precision
+ value: 90.15384615384615
+ - type: dot_recall
+ value: 87.9
+ - type: euclidean_accuracy
+ value: 99.78514851485149
+ - type: euclidean_ap
+ value: 94.55063045792447
+ - type: euclidean_f1
+ value: 89.01265822784809
+ - type: euclidean_precision
+ value: 90.15384615384615
+ - type: euclidean_recall
+ value: 87.9
+ - type: manhattan_accuracy
+ value: 99.78415841584159
+ - type: manhattan_ap
+ value: 94.54002074215008
+ - type: manhattan_f1
+ value: 88.98989898989899
+ - type: manhattan_precision
+ value: 89.89795918367346
+ - type: manhattan_recall
+ value: 88.1
+ - type: max_accuracy
+ value: 99.78514851485149
+ - type: max_ap
+ value: 94.55063045792447
+ - type: max_f1
+ value: 89.01265822784809
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/stackexchange-clustering
+ name: MTEB StackExchangeClustering
+ config: default
+ split: test
+ metrics:
+ - type: v_measure
+ value: 53.361421662036015
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/stackexchange-clustering-p2p
+ name: MTEB StackExchangeClusteringP2P
+ config: default
+ split: test
+ metrics:
+ - type: v_measure
+ value: 38.001825627800976
+ - task:
+ type: Reranking
+ dataset:
+ type: mteb/stackoverflowdupquestions-reranking
+ name: MTEB StackOverflowDupQuestions
+ config: default
+ split: test
+ metrics:
+ - type: map
+ value: 50.762134384316084
+ - type: mrr
+ value: 51.39383594346829
+ - task:
+ type: Summarization
+ dataset:
+ type: mteb/summeval
+ name: MTEB SummEval
+ config: default
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 30.508420334813536
+ - type: cos_sim_spearman
+ value: 30.808757671244493
+ - type: dot_pearson
+ value: 30.508418240633862
+ - type: dot_spearman
+ value: 30.808757671244493
+ - task:
+ type: Retrieval
+ dataset:
+ type: trec-covid
+ name: MTEB TRECCOVID
+ config: default
+ split: test
+ metrics:
+ - type: map_at_1
+ value: 0.169
+ - type: map_at_10
+ value: 1.054
+ - type: map_at_100
+ value: 5.308
+ - type: map_at_1000
+ value: 13.313
+ - type: map_at_3
+ value: 0.40800000000000003
+ - type: map_at_5
+ value: 0.627
+ - type: ndcg_at_1
+ value: 56.00000000000001
+ - type: ndcg_at_10
+ value: 47.246
+ - type: ndcg_at_100
+ value: 35.172
+ - type: ndcg_at_1000
+ value: 34.031
+ - type: ndcg_at_3
+ value: 51.939
+ - type: ndcg_at_5
+ value: 50.568999999999996
+ - type: precision_at_1
+ value: 62.0
+ - type: precision_at_10
+ value: 50.4
+ - type: precision_at_100
+ value: 36.14
+ - type: precision_at_1000
+ value: 15.45
+ - type: precision_at_3
+ value: 56.00000000000001
+ - type: precision_at_5
+ value: 55.2
+ - type: recall_at_1
+ value: 0.169
+ - type: recall_at_10
+ value: 1.284
+ - type: recall_at_100
+ value: 8.552
+ - type: recall_at_1000
+ value: 32.81
+ - type: recall_at_3
+ value: 0.44
+ - type: recall_at_5
+ value: 0.709
+ - task:
+ type: Retrieval
+ dataset:
+ type: webis-touche2020
+ name: MTEB Touche2020
+ config: default
+ split: test
+ metrics:
+ - type: map_at_1
+ value: 1.49
+ - type: map_at_10
+ value: 6.39
+ - type: map_at_100
+ value: 11.424
+ - type: map_at_1000
+ value: 12.847
+ - type: map_at_3
+ value: 3.055
+ - type: map_at_5
+ value: 3.966
+ - type: ndcg_at_1
+ value: 17.347
+ - type: ndcg_at_10
+ value: 16.904
+ - type: ndcg_at_100
+ value: 29.187
+ - type: ndcg_at_1000
+ value: 40.994
+ - type: ndcg_at_3
+ value: 15.669
+ - type: ndcg_at_5
+ value: 16.034000000000002
+ - type: precision_at_1
+ value: 18.367
+ - type: precision_at_10
+ value: 16.326999999999998
+ - type: precision_at_100
+ value: 6.673
+ - type: precision_at_1000
+ value: 1.439
+ - type: precision_at_3
+ value: 17.687
+ - type: precision_at_5
+ value: 17.143
+ - type: recall_at_1
+ value: 1.49
+ - type: recall_at_10
+ value: 12.499
+ - type: recall_at_100
+ value: 41.711
+ - type: recall_at_1000
+ value: 78.286
+ - type: recall_at_3
+ value: 4.055000000000001
+ - type: recall_at_5
+ value: 6.5040000000000004
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/toxic_conversations_50k
+ name: MTEB ToxicConversationsClassification
+ config: default
+ split: test
+ metrics:
+ - type: accuracy
+ value: 66.9918
+ - type: ap
+ value: 12.24755801720171
+ - type: f1
+ value: 51.31653313211933
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/tweet_sentiment_extraction
+ name: MTEB TweetSentimentExtractionClassification
+ config: default
+ split: test
+ metrics:
+ - type: accuracy
+ value: 55.410299943406905
+ - type: f1
+ value: 55.71547395803944
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/twentynewsgroups-clustering
+ name: MTEB TwentyNewsgroupsClustering
+ config: default
+ split: test
+ metrics:
+ - type: v_measure
+ value: 46.860271427647774
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/twittersemeval2015-pairclassification
+ name: MTEB TwitterSemEval2015
+ config: default
+ split: test
+ metrics:
+ - type: cos_sim_accuracy
+ value: 84.1151576563152
+ - type: cos_sim_ap
+ value: 67.85802440228593
+ - type: cos_sim_f1
+ value: 64.08006919560113
+ - type: cos_sim_precision
+ value: 60.260283523123405
+ - type: cos_sim_recall
+ value: 68.41688654353561
+ - type: dot_accuracy
+ value: 84.1151576563152
+ - type: dot_ap
+ value: 67.85802503410727
+ - type: dot_f1
+ value: 64.08006919560113
+ - type: dot_precision
+ value: 60.260283523123405
+ - type: dot_recall
+ value: 68.41688654353561
+ - type: euclidean_accuracy
+ value: 84.1151576563152
+ - type: euclidean_ap
+ value: 67.85802845168082
+ - type: euclidean_f1
+ value: 64.08006919560113
+ - type: euclidean_precision
+ value: 60.260283523123405
+ - type: euclidean_recall
+ value: 68.41688654353561
+ - type: manhattan_accuracy
+ value: 83.96614412588663
+ - type: manhattan_ap
+ value: 67.66935451307549
+ - type: manhattan_f1
+ value: 63.82363570654138
+ - type: manhattan_precision
+ value: 58.72312125914432
+ - type: manhattan_recall
+ value: 69.89445910290237
+ - type: max_accuracy
+ value: 84.1151576563152
+ - type: max_ap
+ value: 67.85802845168082
+ - type: max_f1
+ value: 64.08006919560113
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/twitterurlcorpus-pairclassification
+ name: MTEB TwitterURLCorpus
+ config: default
+ split: test
+ metrics:
+ - type: cos_sim_accuracy
+ value: 88.2504754142896
+ - type: cos_sim_ap
+ value: 84.70165951451109
+ - type: cos_sim_f1
+ value: 76.57057281916886
+ - type: cos_sim_precision
+ value: 74.5226643346451
+ - type: cos_sim_recall
+ value: 78.73421619956883
+ - type: dot_accuracy
+ value: 88.2504754142896
+ - type: dot_ap
+ value: 84.7016596919848
+ - type: dot_f1
+ value: 76.57057281916886
+ - type: dot_precision
+ value: 74.5226643346451
+ - type: dot_recall
+ value: 78.73421619956883
+ - type: euclidean_accuracy
+ value: 88.2504754142896
+ - type: euclidean_ap
+ value: 84.70166029488888
+ - type: euclidean_f1
+ value: 76.57057281916886
+ - type: euclidean_precision
+ value: 74.5226643346451
+ - type: euclidean_recall
+ value: 78.73421619956883
+ - type: manhattan_accuracy
+ value: 88.27376101214732
+ - type: manhattan_ap
+ value: 84.63518812822186
+ - type: manhattan_f1
+ value: 76.55138674594514
+ - type: manhattan_precision
+ value: 74.86934118513065
+ - type: manhattan_recall
+ value: 78.31074838312288
+ - type: max_accuracy
+ value: 88.27376101214732
+ - type: max_ap
+ value: 84.70166029488888
+ - type: max_f1
+ value: 76.57057281916886
 ---
 
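The metadata above records MTEB scores for cosine, dot-product, Euclidean, and Manhattan similarity on each task. As a minimal sketch of how scores like these are typically produced, assuming the `mteb` and `sentence-transformers` packages and using a placeholder model identifier (`MODEL_NAME` below is not this repository's actual name):

```python
# Illustrative sketch only: evaluate a SentenceTransformer checkpoint on one MTEB task.
# MODEL_NAME is a hypothetical placeholder -- substitute the model id you want to evaluate.
from mteb import MTEB
from sentence_transformers import SentenceTransformer

MODEL_NAME = "sentence-transformers/your-model-name"  # placeholder, not this repo
model = SentenceTransformer(MODEL_NAME)

# Run a single STS task; per-metric results (e.g. cos_sim_pearson, euclidean_spearman)
# are written as JSON files under the chosen output folder.
evaluation = MTEB(tasks=["STSBenchmark"])
evaluation.run(model, output_folder=f"results/{MODEL_NAME.split('/')[-1]}")
```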