
nutrition-extractor

This model is a fine-tuned version of microsoft/layoutlmv3-large on v5 of the nutrient extraction dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 0.0773
  • Precision: 0.9403
  • Recall: 0.9464
  • F1: 0.9433
  • Accuracy: 0.9888
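
For quick experimentation, a minimal inference sketch is given below. It is hedged: it assumes the repository ships LayoutLMv3 processor files (otherwise load the processor from microsoft/layoutlmv3-large), relies on the processor's built-in Tesseract OCR (`apply_ocr=True`, which needs pytesseract installed), and uses a placeholder image path; the production pipeline may instead feed pre-computed OCR words and boxes.

```python
# Hedged usage sketch: tag OCR tokens in a nutrition-table photo.
# "nutrition_table.jpg" is a placeholder path; pytesseract is required
# for apply_ocr=True.
import torch
from PIL import Image
from transformers import AutoModelForTokenClassification, AutoProcessor

model_id = "openfoodfacts/nutrition-extractor"
processor = AutoProcessor.from_pretrained(model_id, apply_ocr=True)
model = AutoModelForTokenClassification.from_pretrained(model_id)

image = Image.open("nutrition_table.jpg").convert("RGB")
encoding = processor(image, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**encoding).logits  # shape: (1, seq_len, num_labels)

predicted_ids = logits.argmax(-1).squeeze(0).tolist()
labels = [model.config.id2label[i] for i in predicted_ids]

# Subword and special tokens are tagged too; real post-processing would
# aggregate predictions back to word level.
tokens = processor.tokenizer.convert_ids_to_tokens(encoding["input_ids"][0].tolist())
print(list(zip(tokens, labels)))
```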

Model description

LayoutLMv3 is a multimodal Transformer that combines text, layout (token bounding boxes), and image features, which suits it to reading structured documents such as nutrition tables. This checkpoint fine-tunes it for token classification on the nutrient extraction dataset: OCR tokens from food-packaging photos are tagged with nutrient labels so that nutrient values can be extracted automatically.

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after the list):

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 8
  • total_train_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • training_steps: 3000
  • mixed_precision_training: Native AMP
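
The list above maps directly onto transformers TrainingArguments; a hedged reconstruction is sketched below. Anything not in the list (output_dir, evaluation cadence, the Trainer and dataset wiring) is an assumption or omitted.

```python
# Hedged reconstruction of the listed hyperparameters.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="nutrition-extractor",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=8,     # 4 * 8 = total train batch size of 32
    seed=42,
    adam_beta1=0.9,                    # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    max_steps=3000,                    # training_steps: 3000
    fp16=True,                         # "Native AMP" mixed precision
)
```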

Training results

Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy
1.9661 0.1896 15 1.1509 0.0 0.0 0.0 0.8105
0.9772 0.3791 30 0.8344 0.0452 0.0079 0.0134 0.8130
0.8483 0.5687 45 0.6955 0.1203 0.1271 0.1236 0.8443
0.6646 0.7583 60 0.6043 0.1366 0.1890 0.1586 0.8586
0.6051 0.9479 75 0.5304 0.2540 0.3138 0.2808 0.8792
0.5079 1.1374 90 0.4677 0.3022 0.3683 0.3320 0.8869
0.4924 1.3270 105 0.4053 0.3902 0.4566 0.4208 0.9011
0.4113 1.5166 120 0.3613 0.4406 0.5037 0.4700 0.9081
0.3866 1.7062 135 0.3171 0.4975 0.5582 0.5261 0.9216
0.3464 1.8957 150 0.2863 0.5246 0.6109 0.5645 0.9303
0.3171 2.0853 165 0.2557 0.5986 0.6918 0.6418 0.9458
0.2938 2.2749 180 0.2342 0.6400 0.7149 0.6754 0.9492
0.2678 2.4645 195 0.2214 0.6571 0.7518 0.7013 0.9536
0.2362 2.6540 210 0.2066 0.6873 0.7588 0.7213 0.9566
0.2175 2.8436 225 0.1944 0.7137 0.7810 0.7458 0.9593
0.2128 3.0332 240 0.1820 0.7432 0.8184 0.7790 0.9646
0.1968 3.2227 255 0.1728 0.7592 0.8202 0.7885 0.9662
0.185 3.4123 270 0.1656 0.7667 0.8276 0.7960 0.9675
0.1702 3.6019 285 0.1640 0.7512 0.8161 0.7823 0.9655
0.1686 3.7915 300 0.1541 0.7831 0.8341 0.8078 0.9675
0.1671 3.9810 315 0.1527 0.7696 0.8383 0.8025 0.9679
0.1532 4.1706 330 0.1421 0.8114 0.8567 0.8334 0.9710
0.1505 4.3602 345 0.1301 0.8185 0.8646 0.8409 0.9727
0.126 4.5498 360 0.1274 0.8330 0.8688 0.8505 0.9746
0.128 4.7393 375 0.1166 0.8395 0.8771 0.8579 0.9757
0.1313 4.9289 390 0.1158 0.8583 0.8872 0.8725 0.9767
0.1112 5.1185 405 0.1140 0.8447 0.8794 0.8617 0.9752
0.0968 5.3081 420 0.1002 0.8646 0.8970 0.8805 0.9787
0.1129 5.4976 435 0.0993 0.8605 0.8919 0.8759 0.9786
0.1109 5.6872 450 0.0974 0.8517 0.8812 0.8662 0.9779
0.0868 5.8768 465 0.0957 0.8741 0.9043 0.8889 0.9801
0.1022 6.0664 480 0.0922 0.8778 0.9099 0.8936 0.9808
0.0814 6.2559 495 0.0890 0.8821 0.9025 0.8922 0.9806
0.0843 6.4455 510 0.0946 0.8847 0.9039 0.8942 0.9809
0.0803 6.6351 525 0.0846 0.8951 0.9145 0.9047 0.9826
0.0936 6.8246 540 0.0856 0.8893 0.9168 0.9028 0.9823
0.08 7.0142 555 0.0806 0.8990 0.9168 0.9078 0.9831
0.0672 7.2038 570 0.0839 0.8859 0.9117 0.8987 0.9822
0.0675 7.3934 585 0.0836 0.8903 0.9150 0.9025 0.9827
0.0757 7.5829 600 0.0837 0.8927 0.9191 0.9057 0.9830
0.0719 7.7725 615 0.0803 0.8998 0.9214 0.9105 0.9833
0.0671 7.9621 630 0.0831 0.8985 0.9201 0.9091 0.9831
0.0661 8.1517 645 0.0760 0.9061 0.9228 0.9144 0.9840
0.051 8.3412 660 0.0780 0.9121 0.9302 0.9211 0.9848
0.0524 8.5308 675 0.0805 0.9071 0.9247 0.9158 0.9851
0.067 8.7204 690 0.0768 0.9050 0.9242 0.9145 0.9846
0.0647 8.9100 705 0.0802 0.9135 0.9274 0.9204 0.9844
0.0581 9.0995 720 0.0721 0.9064 0.9214 0.9138 0.9846
0.0461 9.2891 735 0.0762 0.9044 0.9307 0.9173 0.9850
0.0496 9.4787 750 0.0748 0.9120 0.9288 0.9203 0.9851
0.0495 9.6682 765 0.0799 0.9126 0.9311 0.9218 0.9854
0.0523 9.8578 780 0.0772 0.9103 0.9335 0.9217 0.9856
0.0517 10.0474 795 0.0809 0.9120 0.9238 0.9178 0.9847
0.0411 10.2370 810 0.0792 0.9118 0.9316 0.9216 0.9860
0.0435 10.4265 825 0.0735 0.9157 0.9339 0.9247 0.9855
0.0393 10.6161 840 0.0729 0.9095 0.9293 0.9193 0.9853
0.0445 10.8057 855 0.0761 0.9069 0.9316 0.9191 0.9856
0.0432 10.9953 870 0.0739 0.9056 0.9307 0.9180 0.9851
0.0387 11.1848 885 0.0699 0.9146 0.9348 0.9246 0.9869
0.042 11.3744 900 0.0711 0.9187 0.9298 0.9242 0.9863
0.0374 11.5640 915 0.0697 0.9210 0.9325 0.9268 0.9864
0.0328 11.7536 930 0.0746 0.9111 0.9325 0.9217 0.9861
0.0403 11.9431 945 0.0749 0.9131 0.9325 0.9227 0.9857
0.0327 12.1327 960 0.0731 0.9191 0.9395 0.9292 0.9868
0.0284 12.3223 975 0.0738 0.9172 0.9372 0.9271 0.9862
0.0314 12.5118 990 0.0711 0.9253 0.9330 0.9291 0.9867
0.0342 12.7014 1005 0.0775 0.9254 0.9339 0.9296 0.9862
0.038 12.8910 1020 0.0725 0.9271 0.9348 0.9310 0.9864
0.0262 13.0806 1035 0.0756 0.9313 0.9330 0.9321 0.9869
0.0282 13.2701 1050 0.0734 0.9215 0.9385 0.9299 0.9869
0.0251 13.4597 1065 0.0701 0.9254 0.9395 0.9324 0.9872
0.0304 13.6493 1080 0.0712 0.9296 0.9395 0.9345 0.9873
0.0327 13.8389 1095 0.0746 0.9209 0.9362 0.9285 0.9867
0.0318 14.0284 1110 0.0721 0.9265 0.9325 0.9295 0.9870
0.0293 14.2180 1125 0.0739 0.9290 0.9376 0.9333 0.9869
0.0251 14.4076 1140 0.0753 0.9281 0.9372 0.9326 0.9869
0.0269 14.5972 1155 0.0769 0.9205 0.9362 0.9283 0.9862
0.0263 14.7867 1170 0.0745 0.9209 0.9358 0.9283 0.9866
0.0239 14.9763 1185 0.0754 0.9298 0.9372 0.9335 0.9876
0.0214 15.1659 1200 0.0756 0.9298 0.9367 0.9332 0.9873
0.0218 15.3555 1215 0.0759 0.9294 0.9372 0.9333 0.9875
0.0212 15.5450 1230 0.0744 0.9275 0.9344 0.9309 0.9871
0.0243 15.7346 1245 0.0716 0.9224 0.9395 0.9309 0.9875
0.0268 15.9242 1260 0.0730 0.9220 0.9399 0.9309 0.9869
0.0229 16.1137 1275 0.0755 0.9170 0.9395 0.9281 0.9867
0.0201 16.3033 1290 0.0714 0.9300 0.9390 0.9345 0.9879
0.0224 16.4929 1305 0.0693 0.9297 0.9413 0.9355 0.9878
0.0243 16.6825 1320 0.0699 0.9253 0.9450 0.9351 0.9878
0.022 16.8720 1335 0.0733 0.9192 0.9404 0.9296 0.9868
0.0207 17.0616 1350 0.0685 0.9329 0.9376 0.9352 0.9884
0.0211 17.2512 1365 0.0749 0.9314 0.9418 0.9366 0.9877
0.0188 17.4408 1380 0.0731 0.9323 0.9418 0.9370 0.9876
0.0187 17.6303 1395 0.0734 0.9330 0.9390 0.9360 0.9878
0.0203 17.8199 1410 0.0732 0.9357 0.9409 0.9382 0.9879
0.0204 18.0095 1425 0.0729 0.9306 0.9418 0.9362 0.9878
0.0151 18.1991 1440 0.0718 0.9285 0.9418 0.9351 0.9879
0.016 18.3886 1455 0.0730 0.9309 0.9404 0.9356 0.9879
0.0225 18.5782 1470 0.0743 0.9326 0.9404 0.9365 0.9877
0.0158 18.7678 1485 0.0759 0.9298 0.9427 0.9362 0.9875
0.0169 18.9573 1500 0.0780 0.9348 0.9409 0.9378 0.9883
0.0167 19.1469 1515 0.0790 0.9347 0.9399 0.9373 0.9877
0.0163 19.3365 1530 0.0743 0.9336 0.9422 0.9379 0.9875
0.0158 19.5261 1545 0.0737 0.9370 0.9413 0.9391 0.9883
0.0183 19.7156 1560 0.0766 0.9247 0.9418 0.9332 0.9870
0.0154 19.9052 1575 0.0770 0.9322 0.9409 0.9365 0.9875
0.018 20.0948 1590 0.0758 0.9298 0.9432 0.9365 0.9873
0.0142 20.2844 1605 0.0744 0.9255 0.9413 0.9333 0.9877
0.0151 20.4739 1620 0.0730 0.9323 0.9418 0.9370 0.9883
0.018 20.6635 1635 0.0727 0.9321 0.9455 0.9387 0.9886
0.0137 20.8531 1650 0.0708 0.9375 0.9427 0.9401 0.9887
0.0139 21.0427 1665 0.0714 0.9330 0.9459 0.9394 0.9885
0.0134 21.2322 1680 0.0699 0.9322 0.9404 0.9363 0.9883
0.0155 21.4218 1695 0.0713 0.9330 0.9455 0.9392 0.9886
0.0142 21.6114 1710 0.0711 0.9365 0.9473 0.9419 0.9887
0.0147 21.8009 1725 0.0701 0.9335 0.9464 0.9399 0.9887
0.0129 21.9905 1740 0.0700 0.9364 0.9459 0.9411 0.9886
0.0131 22.1801 1755 0.0737 0.9399 0.9399 0.9399 0.9880
0.0121 22.3697 1770 0.0726 0.9312 0.9441 0.9376 0.9883
0.0133 22.5592 1785 0.0733 0.9249 0.9450 0.9349 0.9878
0.0126 22.7488 1800 0.0753 0.9331 0.9409 0.9370 0.9878
0.0133 22.9384 1815 0.0738 0.9286 0.9436 0.9361 0.9882
0.0113 23.1280 1830 0.0733 0.9327 0.9413 0.9370 0.9882
0.012 23.3175 1845 0.0774 0.9413 0.9413 0.9413 0.9884
0.0109 23.5071 1860 0.0772 0.9414 0.9427 0.9420 0.9887
0.0135 23.6967 1875 0.0782 0.9386 0.9390 0.9388 0.9879
0.0122 23.8863 1890 0.0767 0.9368 0.9385 0.9377 0.9880
0.0109 24.0758 1905 0.0763 0.9406 0.9445 0.9426 0.9887
0.0117 24.2654 1920 0.0763 0.9430 0.9404 0.9417 0.9884
0.013 24.4550 1935 0.0741 0.9415 0.9441 0.9428 0.9885
0.0092 24.6445 1950 0.0768 0.9326 0.9469 0.9397 0.9885
0.0101 24.8341 1965 0.0737 0.9328 0.9422 0.9375 0.9884
0.0092 25.0237 1980 0.0732 0.9385 0.9455 0.9420 0.9887
0.0084 25.2133 1995 0.0738 0.9331 0.9478 0.9404 0.9886
0.0088 25.4028 2010 0.0769 0.9365 0.9399 0.9382 0.9881
0.0116 25.5924 2025 0.0739 0.9402 0.9445 0.9424 0.9886
0.0101 25.7820 2040 0.0733 0.9399 0.9473 0.9436 0.9890
0.0122 25.9716 2055 0.0759 0.9375 0.9432 0.9403 0.9883
0.0098 26.1611 2070 0.0762 0.9415 0.9445 0.9430 0.9884
0.0088 26.3507 2085 0.0776 0.9393 0.9432 0.9412 0.9880
0.01 26.5403 2100 0.0769 0.9354 0.9427 0.9390 0.9878
0.0112 26.7299 2115 0.0750 0.9375 0.9422 0.9398 0.9881
0.0087 26.9194 2130 0.0745 0.9389 0.9441 0.9415 0.9883
0.0076 27.1090 2145 0.0728 0.9372 0.9445 0.9409 0.9885
0.0091 27.2986 2160 0.0749 0.9401 0.9436 0.9419 0.9886
0.01 27.4882 2175 0.0766 0.9414 0.9436 0.9425 0.9886
0.008 27.6777 2190 0.0774 0.9401 0.9432 0.9416 0.9885
0.0095 27.8673 2205 0.0784 0.9419 0.9436 0.9428 0.9886
0.0109 28.0569 2220 0.0776 0.9329 0.9450 0.9389 0.9883
0.0088 28.2464 2235 0.0794 0.9357 0.9413 0.9385 0.9879
0.0068 28.4360 2250 0.0800 0.9389 0.9441 0.9415 0.9886
0.0097 28.6256 2265 0.0817 0.9400 0.9418 0.9409 0.9880
0.0075 28.8152 2280 0.0804 0.9403 0.9395 0.9399 0.9878
0.0103 29.0047 2295 0.0750 0.9338 0.9450 0.9394 0.9887
0.01 29.1943 2310 0.0749 0.9333 0.9441 0.9387 0.9887
0.0087 29.3839 2325 0.0754 0.9368 0.9450 0.9409 0.9886
0.0092 29.5735 2340 0.0758 0.9374 0.9418 0.9396 0.9881
0.008 29.7630 2355 0.0776 0.9368 0.9445 0.9406 0.9884
0.0065 29.9526 2370 0.0781 0.9384 0.9427 0.9405 0.9883
0.0076 30.1422 2385 0.0776 0.9372 0.9445 0.9409 0.9886
0.0087 30.3318 2400 0.0768 0.9340 0.9418 0.9379 0.9883
0.0064 30.5213 2415 0.0766 0.9368 0.9445 0.9406 0.9883
0.0083 30.7109 2430 0.0779 0.9355 0.9385 0.9370 0.9881
0.0086 30.9005 2445 0.0756 0.9348 0.9409 0.9378 0.9879
0.0073 31.0900 2460 0.0753 0.9351 0.9450 0.9400 0.9884
0.0082 31.2796 2475 0.0763 0.9402 0.9441 0.9421 0.9884
0.0058 31.4692 2490 0.0759 0.9411 0.9455 0.9433 0.9890
0.0071 31.6588 2505 0.0766 0.9388 0.9432 0.9410 0.9886
0.008 31.8483 2520 0.0774 0.9366 0.9418 0.9392 0.9882
0.0084 32.0379 2535 0.0771 0.9390 0.9464 0.9427 0.9890
0.0058 32.2275 2550 0.0766 0.9400 0.9478 0.9439 0.9892
0.0079 32.4171 2565 0.0777 0.9354 0.9432 0.9393 0.9884
0.0066 32.6066 2580 0.0770 0.9394 0.9459 0.9427 0.9889
0.0068 32.7962 2595 0.0755 0.9364 0.9455 0.9409 0.9889
0.0071 32.9858 2610 0.0742 0.9358 0.9436 0.9397 0.9887
0.0105 33.1754 2625 0.0748 0.9341 0.9427 0.9384 0.9885
0.006 33.3649 2640 0.0753 0.9386 0.9464 0.9425 0.9889
0.0075 33.5545 2655 0.0748 0.9390 0.9455 0.9422 0.9889
0.0068 33.7441 2670 0.0757 0.9381 0.9455 0.9418 0.9889
0.0061 33.9336 2685 0.0757 0.9360 0.9455 0.9407 0.9888
0.0074 34.1232 2700 0.0753 0.9364 0.9450 0.9407 0.9885
0.006 34.3128 2715 0.0760 0.9362 0.9432 0.9397 0.9884
0.0064 34.5024 2730 0.0759 0.9364 0.9459 0.9411 0.9888
0.0055 34.6919 2745 0.0760 0.9368 0.9445 0.9406 0.9889
0.0063 34.8815 2760 0.0767 0.9375 0.9432 0.9403 0.9886
0.0065 35.0711 2775 0.0769 0.9385 0.9445 0.9415 0.9888
0.0075 35.2607 2790 0.0767 0.9407 0.9464 0.9436 0.9889
0.0067 35.4502 2805 0.0767 0.9399 0.9459 0.9429 0.9888
0.0058 35.6398 2820 0.0771 0.9403 0.9455 0.9429 0.9886
0.0044 35.8294 2835 0.0775 0.9406 0.9445 0.9426 0.9884
0.0062 36.0190 2850 0.0776 0.9398 0.9445 0.9422 0.9885
0.0067 36.2085 2865 0.0778 0.9411 0.9450 0.9430 0.9886
0.0073 36.3981 2880 0.0773 0.9408 0.9469 0.9438 0.9888
0.0057 36.5877 2895 0.0773 0.9412 0.9464 0.9438 0.9887
0.0062 36.7773 2910 0.0773 0.9398 0.9455 0.9426 0.9887
0.0056 36.9668 2925 0.0773 0.9390 0.9459 0.9424 0.9887
0.006 37.1564 2940 0.0772 0.9390 0.9464 0.9427 0.9889
0.007 37.3460 2955 0.0773 0.9394 0.9464 0.9429 0.9888
0.0066 37.5355 2970 0.0773 0.9399 0.9464 0.9431 0.9889
0.0066 37.7251 2985 0.0773 0.9399 0.9464 0.9431 0.9889
0.0067 37.9147 3000 0.0773 0.9403 0.9464 0.9433 0.9888
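
Precision, recall, and F1 above are presumably the entity-level seqeval scores reported by the standard transformers token-classification setup, with accuracy computed per token; a minimal sketch of that computation, using hypothetical label names, is:

```python
# Hedged sketch of entity-level metric computation with seqeval
# (pip install evaluate seqeval). Label names are hypothetical.
import evaluate

seqeval = evaluate.load("seqeval")
predictions = [["B-ENERGY_KCAL", "O", "B-FAT"]]
references = [["B-ENERGY_KCAL", "O", "B-SALT"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```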

Framework versions

  • Transformers 4.40.2
  • PyTorch 2.4.1
  • Datasets 2.19.0
  • Tokenizers 0.19.1
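
To reproduce the reported numbers it may help to pin these exact versions; a quick environment check (assuming the pins above) could look like:

```python
# Sanity-check installed versions against the ones listed in this card.
import datasets
import tokenizers
import torch
import transformers

expected = {
    transformers: "4.40.2",
    torch: "2.4.1",
    datasets: "2.19.0",
    tokenizers: "0.19.1",
}
for module, version in expected.items():
    installed = module.__version__
    assert installed.startswith(version), f"{module.__name__}: {installed} != {version}"
```
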
Model weights: 357M parameters (F32, Safetensors).
