qfrodicio committed
Commit 52b00b8
1 Parent(s): f2c03bb

Training complete

README.md CHANGED
@@ -20,11 +20,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.9438
-- Precision: 0.7910
-- Recall: 0.7910
-- F1: 0.7910
-- Accuracy: 0.7817
+- Loss: 1.2008
+- Precision: 0.7823
+- Recall: 0.7823
+- F1: 0.7823
+- Accuracy: 0.7745
 
 ## Model description
 
@@ -43,7 +43,7 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate: 6.042200829392303e-05
+- learning_rate: 9.96098704459956e-05
 - train_batch_size: 64
 - eval_batch_size: 64
 - seed: 42
@@ -55,16 +55,16 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
 |:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
-| 2.4003 | 1.0 | 26 | 1.4948 | 0.6512 | 0.6512 | 0.6512 | 0.6206 |
-| 1.2197 | 2.0 | 52 | 1.0008 | 0.7501 | 0.7501 | 0.7501 | 0.7356 |
-| 0.81 | 3.0 | 78 | 0.8907 | 0.7696 | 0.7696 | 0.7696 | 0.7555 |
-| 0.5619 | 4.0 | 104 | 0.9091 | 0.7628 | 0.7628 | 0.7628 | 0.7495 |
-| 0.3944 | 5.0 | 130 | 0.8791 | 0.7853 | 0.7853 | 0.7853 | 0.7749 |
-| 0.2907 | 6.0 | 156 | 0.8973 | 0.7845 | 0.7845 | 0.7845 | 0.7733 |
-| 0.2214 | 7.0 | 182 | 0.9209 | 0.7874 | 0.7874 | 0.7874 | 0.7779 |
-| 0.1722 | 8.0 | 208 | 0.9446 | 0.7878 | 0.7878 | 0.7878 | 0.7787 |
-| 0.1454 | 9.0 | 234 | 0.9406 | 0.7882 | 0.7882 | 0.7882 | 0.7789 |
-| 0.128 | 10.0 | 260 | 0.9438 | 0.7910 | 0.7910 | 0.7910 | 0.7817 |
+| 0.4985 | 1.0 | 26 | 1.0570 | 0.7288 | 0.7288 | 0.7288 | 0.7177 |
+| 0.2387 | 2.0 | 52 | 1.0720 | 0.7775 | 0.7775 | 0.7775 | 0.7672 |
+| 0.1348 | 3.0 | 78 | 1.0681 | 0.7775 | 0.7775 | 0.7775 | 0.7682 |
+| 0.0811 | 4.0 | 104 | 1.1143 | 0.7884 | 0.7884 | 0.7884 | 0.7781 |
+| 0.0565 | 5.0 | 130 | 1.1128 | 0.7867 | 0.7867 | 0.7867 | 0.7783 |
+| 0.0419 | 6.0 | 156 | 1.1051 | 0.7966 | 0.7966 | 0.7966 | 0.7880 |
+| 0.0257 | 7.0 | 182 | 1.1567 | 0.7839 | 0.7839 | 0.7839 | 0.7737 |
+| 0.0217 | 8.0 | 208 | 1.1675 | 0.7859 | 0.7859 | 0.7859 | 0.7757 |
+| 0.0145 | 9.0 | 234 | 1.1976 | 0.7847 | 0.7847 | 0.7847 | 0.7769 |
+| 0.0115 | 10.0 | 260 | 1.2008 | 0.7823 | 0.7823 | 0.7823 | 0.7745 |
 
 
 ### Framework versions
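
For readers who want to see how the hyperparameters in the updated card map onto a training setup, here is a minimal sketch using `transformers.TrainingArguments`. Only the learning rate, batch sizes, seed, and the 10-epoch schedule visible in the results table come from this commit; the output directory, evaluation strategy, and logging directory are illustrative assumptions, not values recorded in the diff.

```python
from transformers import TrainingArguments

# Minimal sketch only: reproduces the hyperparameters visible in the updated
# README. Anything not listed there (output_dir, scheduler, optimizer,
# evaluation/logging settings) is an assumption and may differ from the real run.
training_args = TrainingArguments(
    output_dir="distilbert-base-cased-finetuned",  # hypothetical name
    learning_rate=9.96098704459956e-05,            # new value in this commit
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    num_train_epochs=10,                           # inferred from the results table
    evaluation_strategy="epoch",                   # assumed; card reports per-epoch metrics
    logging_dir="runs",                            # assumed from the runs/ path below
)
```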
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:7ef9100ad55bba7f19ff0bfcc834c9c38daee1c891bfe9c4b3495f4afb569ccf
+oid sha256:2945765a668a2fb8dbc3945d3cf882dc5dfc875ce07f82042434825921f0025e
 size 260905184
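
The change above only rewrites the Git LFS pointer: the `oid` is the SHA-256 of the weights blob itself, and the size happens to be identical before and after. As a rough illustration, a locally pulled copy (assuming `git lfs pull` has already fetched it) can be checked against the new pointer like this:

```python
import hashlib

def lfs_oid(path: str, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest Git LFS records as the pointer's oid."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

# oid from the updated pointer in this commit
expected = "2945765a668a2fb8dbc3945d3cf882dc5dfc875ce07f82042434825921f0025e"
assert lfs_oid("model.safetensors") == expected
```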
runs/May04_10-55-01_19b9be41010d/events.out.tfevents.1714820104.19b9be41010d.571.2 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:3403700c1ae86f1f47d1e5ab9a70de1fdffe9f69c94d135abe1413a50f5db6cf
-size 10647
+oid sha256:c8e4d4fb36bb1229fcd477378b71bd587bf323f4bcbd51c0cf39c11036c38b56
+size 13733
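
The grown `events.out.tfevents` file holds the scalar logs for the new run. A small sketch for inspecting it with TensorBoard's `EventAccumulator` follows; the tag names are assumptions (the `Trainer` typically logs under names like `train/loss` and `eval/f1`), not something recorded in this commit.

```python
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

# Load the event file committed under runs/ and list the scalar tags it contains.
acc = EventAccumulator(
    "runs/May04_10-55-01_19b9be41010d/events.out.tfevents.1714820104.19b9be41010d.571.2"
)
acc.Reload()
print(acc.Tags()["scalars"])  # e.g. ["train/loss", "eval/f1", ...] (assumed tag names)

# Print each logged evaluation F1 value with its step, if that tag is present.
if "eval/f1" in acc.Tags()["scalars"]:
    for event in acc.Scalars("eval/f1"):
        print(event.step, event.value)
```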