eevvgg committed
Commit 31635d8
1 parent: db4871b

up readme 4

Files changed (1): README.md (+15 -13)

README.md CHANGED
@@ -48,19 +48,6 @@ Fine-tuned on a 2k sample of manually annotated Reddit (EN) and Twitter (PL) dat
  - **License:** [More Information Needed]
  - **Finetuned from model:** [cardiffnlp/twitter-xlm-roberta-base-sentiment](https://huggingface.co/cardiffnlp/twitter-xlm-roberta-base-sentiment)
 
- ## Model Sources
-
- - **Repository:** [Colab notebook](https://colab.research.google.com/drive/1Rqgjp2tlReZ-hOZz63jw9cIwcZmcL9lR?usp=sharing)
- - **Paper:** TBA
- - **BibTex citation:**
- ```
- @misc{SentimenTwGK2023,
- author={Gajewska, Ewelina and Konat, Barbara},
- title={SentimenTw XLM-RoBERTa-base Model for Multilingual Sentiment Classification on Social Media},
- year={2023},
- howpublished = {\url{https://huggingface.co/eevvgg/sentimenTw-political}},
- }
- ```
 
  # Uses
 
@@ -84,6 +71,21 @@ labels = [i['label'] for i in result] # ['neutral', 'positive']
 
  ```
 
+
+ ## Model Sources
+
+ - **Repository:** [Colab notebook](https://colab.research.google.com/drive/1Rqgjp2tlReZ-hOZz63jw9cIwcZmcL9lR?usp=sharing)
+ - **Paper:** TBA
+ - **BibTex citation:**
+ ```
+ @misc{SentimenTwGK2023,
+ author={Gajewska, Ewelina and Konat, Barbara},
+ title={SentimenTw XLM-RoBERTa-base Model for Multilingual Sentiment Classification on Social Media},
+ year={2023},
+ howpublished = {\url{https://huggingface.co/eevvgg/sentimenTw-political}},
+ }
+ ```
+
  # Training Details
 
  - Trained for 3 epochs, mini-batch size of 8.
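
The second hunk header above carries the card's inference snippet (`labels = [i['label'] for i in result] # ['neutral', 'positive']`) as diff context. A minimal, self-contained sketch of that extraction step; the helper name `extract_labels`, the `sample_result` dicts, and the score values are illustrative stand-ins, not part of the card:

```python
# Sketch of the label-extraction step from the card's usage example.
# `sample_result` mirrors the output shape of a transformers
# text-classification pipeline (a list of {"label", "score"} dicts);
# the fine-tuned checkpoint itself is not downloaded here.

def extract_labels(result):
    """Return only the predicted label strings from pipeline-style output."""
    return [item["label"] for item in result]

# Hypothetical output for two inputs, matching the card's inline comment:
sample_result = [
    {"label": "neutral", "score": 0.91},
    {"label": "positive", "score": 0.87},
]

labels = extract_labels(sample_result)
print(labels)  # ['neutral', 'positive']
```

In the card itself, `result` would come from running the fine-tuned checkpoint (e.g. via a `transformers` text-classification pipeline); only the list-comprehension step is reproduced here.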