
bankstatementmodelver8

This model is a fine-tuned version of deepset/roberta-base-squad2 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0
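Like its base model deepset/roberta-base-squad2, this is an extractive question-answering model: given a question and a context passage, it scores every context token as a possible answer start and answer end, and the predicted answer is the highest-scoring valid span. A minimal sketch of that span-decoding step (the logits below are made up for illustration, not real model output):

```python
def best_span(start_logits, end_logits, max_answer_len=30):
    """Pick the (start, end) pair maximizing start_logits[s] + end_logits[e],
    subject to s <= e < s + max_answer_len."""
    best, best_score = (0, 0), float("-inf")
    for s, s_score in enumerate(start_logits):
        for e in range(s, min(s + max_answer_len, len(end_logits))):
            score = s_score + end_logits[e]
            if score > best_score:
                best_score, best = score, (s, e)
    return best, best_score

# Hypothetical logits over a 6-token context: token 2 looks like the answer
# start, token 3 like the answer end.
start = [0.1, 0.2, 5.0, 0.3, 0.1, 0.0]
end   = [0.0, 0.1, 0.4, 4.8, 0.2, 0.1]
span, score = best_span(start, end)
print(span)  # (2, 3)
```

The real pipeline additionally handles tokenization, the SQuAD 2.0 "no answer" option, and mapping token indices back to character offsets; this sketch shows only the core span selection.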

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 11
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 150
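The card lists a linear scheduler but no warmup steps. Assuming zero warmup, the learning rate decays linearly from 2e-05 to 0 over the 981 × 150 = 147,150 optimizer steps logged below. A small sketch of that schedule:

```python
# Linear decay from the initial learning rate to 0 over all training steps,
# assuming zero warmup (the card does not list a warmup value).
BASE_LR = 2e-5
TOTAL_STEPS = 981 * 150  # steps per epoch x num_epochs = 147150

def lr_at(step):
    return BASE_LR * max(0.0, (TOTAL_STEPS - step) / TOTAL_STEPS)

print(lr_at(0))            # 2e-05
print(lr_at(TOTAL_STEPS))  # 0.0
```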

Training results

| Training Loss | Epoch | Step   | Validation Loss |
|:-------------:|:-----:|:------:|:---------------:|
| 0.1067        | 1.0   | 981    | 0.0322          |
| 0.0357        | 2.0   | 1962   | 0.0228          |
| 0.0239        | 3.0   | 2943   | 0.0172          |
| 0.0253        | 4.0   | 3924   | 0.0158          |
| 0.0206        | 5.0   | 4905   | 0.0127          |
| 0.0168        | 6.0   | 5886   | 0.0160          |
| 0.0158        | 7.0   | 6867   | 0.0154          |
| 0.0169        | 8.0   | 7848   | 0.0134          |
| 0.0162        | 9.0   | 8829   | 0.0081          |
| 0.0162        | 10.0  | 9810   | 0.0101          |
| 0.0126        | 11.0  | 10791  | 0.0082          |
| 0.0128        | 12.0  | 11772  | 0.0080          |
| 0.0130        | 13.0  | 12753  | 0.0119          |
| 0.0117        | 14.0  | 13734  | 0.0105          |
| 0.0117        | 15.0  | 14715  | 0.0106          |
| 0.0112        | 16.0  | 15696  | 0.0100          |
| 0.0103        | 17.0  | 16677  | 0.0078          |
| 0.0075        | 18.0  | 17658  | 0.0060          |
| 0.0057        | 19.0  | 18639  | 0.0088          |
| 0.0077        | 20.0  | 19620  | 0.0076          |
| 0.0060        | 21.0  | 20601  | 0.0149          |
| 0.0065        | 22.0  | 21582  | 0.0062          |
| 0.0093        | 23.0  | 22563  | 0.0081          |
| 0.0045        | 24.0  | 23544  | 0.0054          |
| 0.0050        | 25.0  | 24525  | 0.0048          |
| 0.0068        | 26.0  | 25506  | 0.0122          |
| 0.0063        | 27.0  | 26487  | 0.0038          |
| 0.0043        | 28.0  | 27468  | 0.0063          |
| 0.0055        | 29.0  | 28449  | 0.0096          |
| 0.0034        | 30.0  | 29430  | 0.0045          |
| 0.0033        | 31.0  | 30411  | 0.0025          |
| 0.0027        | 32.0  | 31392  | 0.0047          |
| 0.0020        | 33.0  | 32373  | 0.0053          |
| 0.0055        | 34.0  | 33354  | 0.0026          |
| 0.0044        | 35.0  | 34335  | 0.0010          |
| 0.0047        | 36.0  | 35316  | 0.0008          |
| 0.0019        | 37.0  | 36297  | 0.0011          |
| 0.0006        | 38.0  | 37278  | 0.0030          |
| 0.0015        | 39.0  | 38259  | 0.0010          |
| 0.0005        | 40.0  | 39240  | 0.0008          |
| 0.0018        | 41.0  | 40221  | 0.0001          |
| 0.0026        | 42.0  | 41202  | 0.0017          |
| 0.0000        | 43.0  | 42183  | 0.0002          |
| 0.0020        | 44.0  | 43164  | 0.0009          |
| 0.0012        | 45.0  | 44145  | 0.0000          |
| 0.0018        | 46.0  | 45126  | 0.0110          |
| 0.0006        | 47.0  | 46107  | 0.0018          |
| 0.0016        | 48.0  | 47088  | 0.0000          |
| 0.0017        | 49.0  | 48069  | 0.0000          |
| 0.0014        | 50.0  | 49050  | 0.0000          |
| 0.0001        | 51.0  | 50031  | 0.0000          |
| 0.0018        | 52.0  | 51012  | 0.0020          |
| 0.0001        | 53.0  | 51993  | 0.0001          |
| 0.0009        | 54.0  | 52974  | 0.0040          |
| 0.0021        | 55.0  | 53955  | 0.0000          |
| 0.0018        | 56.0  | 54936  | 0.0000          |
| 0.0005        | 57.0  | 55917  | 0.0000          |
| 0.0000        | 58.0  | 56898  | 0.0000          |
| 0.0014        | 59.0  | 57879  | 0.0000          |
| 0.0008        | 60.0  | 58860  | 0.0000          |
| 0.0002        | 61.0  | 59841  | 0.0000          |
| 0.0018        | 62.0  | 60822  | 0.0000          |
| 0.0016        | 63.0  | 61803  | 0.0003          |
| 0.0000        | 64.0  | 62784  | 0.0000          |
| 0.0001        | 65.0  | 63765  | 0.0000          |
| 0.0014        | 66.0  | 64746  | 0.0004          |
| 0.0006        | 67.0  | 65727  | 0.0000          |
| 0.0000        | 68.0  | 66708  | 0.0000          |
| 0.0000        | 69.0  | 67689  | 0.0000          |
| 0.0002        | 70.0  | 68670  | 0.0000          |
| 0.0001        | 71.0  | 69651  | 0.0000          |
| 0.0000        | 72.0  | 70632  | 0.0000          |
| 0.0005        | 73.0  | 71613  | 0.0000          |
| 0.0009        | 74.0  | 72594  | 0.0000          |
| 0.0007        | 75.0  | 73575  | 0.0000          |
| 0.0000        | 76.0  | 74556  | 0.0005          |
| 0.0000        | 77.0  | 75537  | 0.0000          |
| 0.0000        | 78.0  | 76518  | 0.0000          |
| 0.0004        | 79.0  | 77499  | 0.0000          |
| 0.0001        | 80.0  | 78480  | 0.0000          |
| 0.0000        | 81.0  | 79461  | 0.0000          |
| 0.0013        | 82.0  | 80442  | 0.0000          |
| 0.0000        | 83.0  | 81423  | 0.0000          |
| 0.0000        | 84.0  | 82404  | 0.0000          |
| 0.0000        | 85.0  | 83385  | 0.0000          |
| 0.0001        | 86.0  | 84366  | 0.0000          |
| 0.0010        | 87.0  | 85347  | 0.0000          |
| 0.0000        | 88.0  | 86328  | 0.0000          |
| 0.0001        | 89.0  | 87309  | 0.0000          |
| 0.0004        | 90.0  | 88290  | 0.0000          |
| 0.0000        | 91.0  | 89271  | 0.0000          |
| 0.0000        | 92.0  | 90252  | 0.0000          |
| 0.0000        | 93.0  | 91233  | 0.0000          |
| 0.0010        | 94.0  | 92214  | 0.0000          |
| 0.0000        | 95.0  | 93195  | 0.0000          |
| 0.0000        | 96.0  | 94176  | 0.0000          |
| 0.0000        | 97.0  | 95157  | 0.0000          |
| 0.0007        | 98.0  | 96138  | 0.0000          |
| 0.0000        | 99.0  | 97119  | 0.0000          |
| 0.0000        | 100.0 | 98100  | 0.0000          |
| 0.0000        | 101.0 | 99081  | 0.0000          |
| 0.0000        | 102.0 | 100062 | 0.0000          |
| 0.0000        | 103.0 | 101043 | 0.0000          |
| 0.0000        | 104.0 | 102024 | 0.0000          |
| 0.0000        | 105.0 | 103005 | 0.0000          |
| 0.0000        | 106.0 | 103986 | 0.0000          |
| 0.0000        | 107.0 | 104967 | 0.0000          |
| 0.0000        | 108.0 | 105948 | 0.0000          |
| 0.0006        | 109.0 | 106929 | 0.0000          |
| 0.0000        | 110.0 | 107910 | 0.0000          |
| 0.0000        | 111.0 | 108891 | 0.0000          |
| 0.0000        | 112.0 | 109872 | 0.0000          |
| 0.0000        | 113.0 | 110853 | 0.0000          |
| 0.0000        | 114.0 | 111834 | 0.0000          |
| 0.0000        | 115.0 | 112815 | 0.0000          |
| 0.0000        | 116.0 | 113796 | 0.0000          |
| 0.0000        | 117.0 | 114777 | 0.0000          |
| 0.0000        | 118.0 | 115758 | 0.0000          |
| 0.0000        | 119.0 | 116739 | 0.0000          |
| 0.0000        | 120.0 | 117720 | 0.0000          |
| 0.0000        | 121.0 | 118701 | 0.0000          |
| 0.0000        | 122.0 | 119682 | 0.0000          |
| 0.0000        | 123.0 | 120663 | 0.0000          |
| 0.0013        | 124.0 | 121644 | 0.0000          |
| 0.0000        | 125.0 | 122625 | 0.0000          |
| 0.0000        | 126.0 | 123606 | 0.0000          |
| 0.0000        | 127.0 | 124587 | 0.0000          |
| 0.0000        | 128.0 | 125568 | 0.0000          |
| 0.0000        | 129.0 | 126549 | 0.0000          |
| 0.0000        | 130.0 | 127530 | 0.0000          |
| 0.0000        | 131.0 | 128511 | 0.0000          |
| 0.0000        | 132.0 | 129492 | 0.0000          |
| 0.0000        | 133.0 | 130473 | 0.0000          |
| 0.0000        | 134.0 | 131454 | 0.0000          |
| 0.0000        | 135.0 | 132435 | 0.0000          |
| 0.0000        | 136.0 | 133416 | 0.0000          |
| 0.0000        | 137.0 | 134397 | 0.0000          |
| 0.0000        | 138.0 | 135378 | 0.0000          |
| 0.0000        | 139.0 | 136359 | 0.0000          |
| 0.0000        | 140.0 | 137340 | 0.0000          |
| 0.0000        | 141.0 | 138321 | 0.0000          |
| 0.0000        | 142.0 | 139302 | 0.0000          |
| 0.0000        | 143.0 | 140283 | 0.0000          |
| 0.0000        | 144.0 | 141264 | 0.0000          |
| 0.0000        | 145.0 | 142245 | 0.0000          |
| 0.0000        | 146.0 | 143226 | 0.0000          |
| 0.0000        | 147.0 | 144207 | 0.0000          |
| 0.0000        | 148.0 | 145188 | 0.0000          |
| 0.0000        | 149.0 | 146169 | 0.0000          |
| 0.0000        | 150.0 | 147150 | 0.0000          |
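The logged step counts also give a rough handle on the otherwise undocumented training data: 981 optimizer steps per epoch at train_batch_size 16 bounds the number of training examples, assuming no gradient accumulation and that the final, possibly partial batch is kept:

```python
# Back-of-the-envelope training-set size from the logged steps per epoch.
steps_per_epoch = 981
train_batch_size = 16
upper = steps_per_epoch * train_batch_size            # 15696
lower = (steps_per_epoch - 1) * train_batch_size + 1  # 15681
print(lower, upper)  # the dataset holds between 15681 and 15696 examples
```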

Framework versions

  • Transformers 4.33.2
  • Pytorch 2.0.1+cu118
  • Tokenizers 0.13.3
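To reproduce this environment, the versions above can be pinned in a requirements file (the `+cu118` build of PyTorch is normally installed from the PyTorch CUDA wheel index rather than plain PyPI):

```text
transformers==4.33.2
torch==2.0.1
tokenizers==0.13.3
```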
The serverless Inference API is not available for this model because the repository is disabled.

Model tree for Souvik123/bankstatementmodelver8

  • Finetuned from: deepset/roberta-base-squad2

Spaces using Souvik123/bankstatementmodelver8: 1