atasoglu committed on
Commit dc6b7d8
1 Parent(s): 1523f08

Update README.md

Files changed (1)
  1. README.md +17 -4
README.md CHANGED
@@ -6,14 +6,22 @@ tags:
 - feature-extraction
 - sentence-similarity
 - transformers
-
+license: mit
+datasets:
+- nli_tr
+- emrecan/stsb-mt-turkish
+language:
+- tr
+base_model: dbmdz/distilbert-base-turkish-cased
 ---
 
 # atasoglu/distilbert-base-turkish-cased-nli-stsb-tr
 
 This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 768 dimensional dense vector space and can be used for tasks like clustering or semantic search.
 
-<!--- Describe your model here -->
+This model was adapted from [dbmdz/distilbert-base-turkish-cased](https://huggingface.co/dbmdz/distilbert-base-turkish-cased) and fine-tuned on these datasets:
+- [nli_tr](https://huggingface.co/datasets/nli_tr)
+- [emrecan/stsb-mt-turkish](https://huggingface.co/datasets/emrecan/stsb-mt-turkish)
 
 ## Usage (Sentence-Transformers)
 
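The Usage section itself is untouched by this commit, so the diff shows only its heading here and the `print(sentence_embeddings)` context line in the next hunk. For reference, a minimal sketch of the standard sentence-transformers call for this model; the example sentences are placeholders rather than text taken from the README:

```python
from sentence_transformers import SentenceTransformer

# Placeholder Turkish sentences; any input strings work.
sentences = ["Bu örnek bir cümle.", "Her cümle bir vektöre dönüştürülür."]

model = SentenceTransformer("atasoglu/distilbert-base-turkish-cased-nli-stsb-tr")
sentence_embeddings = model.encode(sentences)

print(sentence_embeddings.shape)  # one 768-dimensional vector per sentence, e.g. (2, 768)
```

`encode` returns one 768-dimensional vector per input, matching the dimensionality stated in the model description.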
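Likewise, the "adapted from" line added in the hunk above means the masked-language-model checkpoint was wrapped as a sentence encoder before fine-tuning. The training code is not part of this commit; the following is only a sketch of how such an adaptation is commonly assembled in sentence-transformers, with mean pooling and the 128-token sequence length being assumptions:

```python
from sentence_transformers import SentenceTransformer, models

# Wrap the base checkpoint as a word-embedding module (max_seq_length is an assumption).
word_embedding = models.Transformer("dbmdz/distilbert-base-turkish-cased", max_seq_length=128)

# Pool token embeddings into one 768-dimensional sentence vector;
# "mean" pooling is assumed, since the pooling configuration is not shown in this diff.
pooling = models.Pooling(word_embedding.get_word_embedding_dimension(), pooling_mode="mean")

model = SentenceTransformer(modules=[word_embedding, pooling])
# Fine-tuning on nli_tr and emrecan/stsb-mt-turkish would follow; that code is not shown in this commit.
```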
@@ -76,9 +84,14 @@ print(sentence_embeddings)
 
 ## Evaluation Results
 
-<!--- Describe how your model was evaluated -->
+Achieved results on the [STS-b](https://huggingface.co/datasets/emrecan/stsb-mt-turkish) test split are given below:
 
-For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=atasoglu/distilbert-base-turkish-cased-nli-stsb-tr)
+```txt
+Cosine-Similarity :     Pearson: 0.8167 Spearman: 0.8158
+Manhattan-Distance:     Pearson: 0.7540 Spearman: 0.7463
+Euclidean-Distance:     Pearson: 0.7545 Spearman: 0.7470
+Dot-Product-Similarity: Pearson: 0.6543 Spearman: 0.6571
+```
 
 
 ## Training
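The commit adds the figures above but not the evaluation script behind them. Below is a sketch of how comparable numbers could be computed with sentence-transformers' `EmbeddingSimilarityEvaluator` on the emrecan/stsb-mt-turkish test split; the column names `sentence1`, `sentence2`, and `score` are assumptions based on the original STS-b layout, not something this commit confirms:

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("atasoglu/distilbert-base-turkish-cased-nli-stsb-tr")

# Test split of the machine-translated Turkish STS-b.
test = load_dataset("emrecan/stsb-mt-turkish", split="test")

evaluator = EmbeddingSimilarityEvaluator(
    sentences1=test["sentence1"],   # assumed column name
    sentences2=test["sentence2"],   # assumed column name
    scores=[s / 5.0 for s in test["score"]],  # Pearson/Spearman are scale-invariant
    name="stsb-tr-test",
)
evaluator(model)  # computes Pearson/Spearman for cosine, Manhattan, Euclidean and dot-product similarity
```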