---
license: apache-2.0
base_model: facebook/detr-resnet-50
tags:
  - generated_from_trainer
model-index:
  - name: results
    results: []
---

# results

This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 2.0368
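
Below is a minimal inference sketch using the standard `transformers` object-detection API. The repo id `amaye15/results` is an assumption based on this card's location, and the label map depends on the (undocumented) fine-tuning dataset:

```python
import requests
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "amaye15/results"  # assumed repo id; adjust to the actual checkpoint path

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

# Any RGB image works; a COCO validation image is fetched here for illustration.
url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Rescale boxes to the original image size and keep only confident detections.
target_sizes = torch.tensor([image.size[::-1]])
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```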

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 2000
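
As a rough sketch, these settings map onto `transformers.TrainingArguments` as follows; the output directory and any argument not listed above are placeholders, not taken from the actual training script:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="results",            # placeholder; the real output path is not documented
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,   # 8 per device x 4 steps = effective batch size of 32
    lr_scheduler_type="linear",
    max_steps=2000,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer default optimizer.
)
```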

### Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 3.4991 | 0.3137 | 100  | 3.6244 |
| 3.1728 | 0.3451 | 110  | 3.4319 |
| 2.7857 | 0.3765 | 120  | 3.2574 |
| 3.0606 | 0.4078 | 130  | 3.1484 |
| 2.6704 | 0.4392 | 140  | 3.0390 |
| 2.7332 | 0.4706 | 150  | 2.9956 |
| 2.8436 | 0.5020 | 160  | 2.9110 |
| 2.8464 | 0.5333 | 170  | 2.8551 |
| 2.1192 | 0.5647 | 180  | 2.8163 |
| 2.6557 | 0.5961 | 190  | 2.8145 |
| 2.3224 | 0.6275 | 200  | 2.7858 |
| 2.6007 | 0.6588 | 210  | 2.7064 |
| 2.6117 | 0.6902 | 220  | 2.6602 |
| 2.4549 | 0.7216 | 230  | 2.6368 |
| 2.5487 | 0.7529 | 240  | 2.6029 |
| 2.6048 | 0.7843 | 250  | 2.5573 |
| 2.0348 | 0.8157 | 260  | 2.5203 |
| 2.4741 | 0.8471 | 270  | 2.4935 |
| 2.5855 | 0.8784 | 280  | 2.4731 |
| 2.1076 | 0.9098 | 290  | 2.4283 |
| 2.3073 | 0.9412 | 300  | 2.3896 |
| 2.214  | 0.9725 | 310  | 2.3919 |
| 2.2078 | 1.0039 | 320  | 2.3343 |
| 2.2391 | 1.0353 | 330  | 2.2970 |
| 2.3607 | 1.0667 | 340  | 2.2921 |
| 2.0244 | 1.0980 | 350  | 2.2751 |
| 2.251  | 1.1294 | 360  | 2.2713 |
| 2.1133 | 1.1608 | 370  | 2.2701 |
| 2.124  | 1.1922 | 380  | 2.2618 |
| 2.1989 | 1.2235 | 390  | 2.2429 |
| 2.2315 | 1.2549 | 400  | 2.2463 |
| 2.2398 | 1.2863 | 410  | 2.2386 |
| 2.261  | 1.3176 | 420  | 2.2360 |
| 2.2144 | 1.3490 | 430  | 2.2427 |
| 2.3344 | 1.3804 | 440  | 2.2452 |
| 2.0412 | 1.4118 | 450  | 2.2092 |
| 2.0854 | 1.4431 | 460  | 2.2197 |
| 2.1636 | 1.4745 | 470  | 2.1830 |
| 1.7776 | 1.5059 | 480  | 2.1904 |
| 2.1118 | 1.5373 | 490  | 2.2194 |
| 2.1203 | 1.5686 | 500  | 2.1978 |
| 2.2468 | 1.6    | 510  | 2.1968 |
| 2.2992 | 1.6314 | 520  | 2.1963 |
| 2.2596 | 1.6627 | 530  | 2.1816 |
| 2.1836 | 1.6941 | 540  | 2.1800 |
| 2.2672 | 1.7255 | 550  | 2.1679 |
| 2.0702 | 1.7569 | 560  | 2.1607 |
| 2.5606 | 1.7882 | 570  | 2.1568 |
| 2.1392 | 1.8196 | 580  | 2.1578 |
| 1.9255 | 1.8510 | 590  | 2.1799 |
| 2.0995 | 1.8824 | 600  | 2.1995 |
| 2.1153 | 1.9137 | 610  | 2.1741 |
| 2.2068 | 1.9451 | 620  | 2.1638 |
| 1.8698 | 1.9765 | 630  | 2.1819 |
| 1.8849 | 2.0078 | 640  | 2.1807 |
| 2.0291 | 2.0392 | 650  | 2.1636 |
| 2.2092 | 2.0706 | 660  | 2.1356 |
| 2.1117 | 2.1020 | 670  | 2.1682 |
| 1.8318 | 2.1333 | 680  | 2.1719 |
| 1.9884 | 2.1647 | 690  | 2.2114 |
| 2.1933 | 2.1961 | 700  | 2.1526 |
| 2.2953 | 2.2275 | 710  | 2.1525 |
| 2.2841 | 2.2588 | 720  | 2.1417 |
| 1.9865 | 2.2902 | 730  | 2.1399 |
| 1.9193 | 2.3216 | 740  | 2.1313 |
| 1.8882 | 2.3529 | 750  | 2.1362 |
| 1.8967 | 2.3843 | 760  | 2.1454 |
| 1.9424 | 2.4157 | 770  | 2.1356 |
| 1.8531 | 2.4471 | 780  | 2.1340 |
| 1.9435 | 2.4784 | 790  | 2.1413 |
| 2.0455 | 2.5098 | 800  | 2.1558 |
| 1.9384 | 2.5412 | 810  | 2.1519 |
| 2.0826 | 2.5725 | 820  | 2.1381 |
| 2.0008 | 2.6039 | 830  | 2.1136 |
| 1.922  | 2.6353 | 840  | 2.1160 |
| 1.9567 | 2.6667 | 850  | 2.0991 |
| 2.2798 | 2.6980 | 860  | 2.0998 |
| 2.4014 | 2.7294 | 870  | 2.0922 |
| 2.3427 | 2.7608 | 880  | 2.0976 |
| 2.2701 | 2.7922 | 890  | 2.0823 |
| 2.1405 | 2.8235 | 900  | 2.1009 |
| 1.9259 | 2.8549 | 910  | 2.1075 |
| 2.0055 | 2.8863 | 920  | 2.1041 |
| 1.9902 | 2.9176 | 930  | 2.0854 |
| 1.9821 | 2.9490 | 940  | 2.1107 |
| 2.0292 | 2.9804 | 950  | 2.0901 |
| 1.9811 | 3.0118 | 960  | 2.1227 |
| 2.2674 | 3.0431 | 970  | 2.0934 |
| 2.0632 | 3.0745 | 980  | 2.0935 |
| 2.1232 | 3.1059 | 990  | 2.0843 |
| 2.0056 | 3.1373 | 1000 | 2.0891 |
| 2.0188 | 3.1686 | 1010 | 2.0811 |
| 2.0898 | 3.2    | 1020 | 2.0848 |
| 2.1809 | 3.2314 | 1030 | 2.0883 |
| 2.1636 | 3.2627 | 1040 | 2.0931 |
| 1.9941 | 3.2941 | 1050 | 2.0894 |
| 1.9761 | 3.3255 | 1060 | 2.0957 |
| 1.9908 | 3.3569 | 1070 | 2.0715 |
| 2.0806 | 3.3882 | 1080 | 2.0774 |
| 1.9419 | 3.4196 | 1090 | 2.0713 |
| 1.8643 | 3.4510 | 1100 | 2.0654 |
| 1.969  | 3.4824 | 1110 | 2.0636 |
| 2.0104 | 3.5137 | 1120 | 2.0710 |
| 1.6745 | 3.5451 | 1130 | 2.0551 |
| 2.047  | 3.5765 | 1140 | 2.0598 |
| 2.1289 | 3.6078 | 1150 | 2.0426 |
| 2.1158 | 3.6392 | 1160 | 2.0525 |
| 1.8543 | 3.6706 | 1170 | 2.0515 |
| 2.0206 | 3.7020 | 1180 | 2.0508 |
| 2.1992 | 3.7333 | 1190 | 2.0485 |
| 1.6875 | 3.7647 | 1200 | 2.0558 |
| 1.8452 | 3.7961 | 1210 | 2.0543 |
| 2.2061 | 3.8275 | 1220 | 2.0594 |
| 2.0418 | 3.8588 | 1230 | 2.0652 |
| 2.0411 | 3.8902 | 1240 | 2.0679 |
| 2.0835 | 3.9216 | 1250 | 2.0731 |
| 1.9003 | 3.9529 | 1260 | 2.0574 |
| 1.7881 | 3.9843 | 1270 | 2.0777 |
| 2.1354 | 4.0157 | 1280 | 2.0630 |
| 1.8935 | 4.0471 | 1290 | 2.0607 |
| 2.1067 | 4.0784 | 1300 | 2.0576 |
| 1.8225 | 4.1098 | 1310 | 2.0767 |
| 1.8132 | 4.1412 | 1320 | 2.0507 |
| 1.985  | 4.1725 | 1330 | 2.0669 |
| 2.112  | 4.2039 | 1340 | 2.0836 |
| 1.7993 | 4.2353 | 1350 | 2.0718 |
| 1.9784 | 4.2667 | 1360 | 2.0676 |
| 2.1628 | 4.2980 | 1370 | 2.0525 |
| 1.876  | 4.3294 | 1380 | 2.0615 |
| 2.0081 | 4.3608 | 1390 | 2.0736 |
| 1.8642 | 4.3922 | 1400 | 2.0565 |
| 1.9308 | 4.4235 | 1410 | 2.0608 |
| 2.2296 | 4.4549 | 1420 | 2.0553 |
| 2.0166 | 4.4863 | 1430 | 2.0575 |
| 2.0422 | 4.5176 | 1440 | 2.0543 |
| 1.8729 | 4.5490 | 1450 | 2.0552 |
| 2.0323 | 4.5804 | 1460 | 2.0656 |
| 1.9935 | 4.6118 | 1470 | 2.0794 |
| 1.8534 | 4.6431 | 1480 | 2.0685 |
| 1.8363 | 4.6745 | 1490 | 2.0581 |
| 1.9679 | 4.7059 | 1500 | 2.0353 |
| 1.8585 | 4.7373 | 1510 | 2.0334 |
| 1.9772 | 4.7686 | 1520 | 2.0420 |
| 1.8753 | 4.8    | 1530 | 2.0427 |
| 1.8911 | 4.8314 | 1540 | 2.0499 |
| 2.0614 | 4.8627 | 1550 | 2.0481 |
| 2.1184 | 4.8941 | 1560 | 2.0481 |
| 1.9504 | 4.9255 | 1570 | 2.0541 |
| 2.1337 | 4.9569 | 1580 | 2.0480 |
| 2.4391 | 4.9882 | 1590 | 2.0416 |
| 1.72   | 5.0196 | 1600 | 2.0412 |
| 2.0808 | 5.0510 | 1610 | 2.0458 |
| 1.8639 | 5.0824 | 1620 | 2.0438 |
| 1.9462 | 5.1137 | 1630 | 2.0428 |
| 2.0055 | 5.1451 | 1640 | 2.0366 |
| 2.0345 | 5.1765 | 1650 | 2.0644 |
| 1.9321 | 5.2078 | 1660 | 2.0454 |
| 1.8705 | 5.2392 | 1670 | 2.0394 |
| 2.0345 | 5.2706 | 1680 | 2.0475 |
| 1.9992 | 5.3020 | 1690 | 2.0567 |
| 2.2208 | 5.3333 | 1700 | 2.0558 |
| 1.8253 | 5.3647 | 1710 | 2.0413 |
| 2.0765 | 5.3961 | 1720 | 2.0319 |
| 2.2315 | 5.4275 | 1730 | 2.0360 |
| 2.2432 | 5.4588 | 1740 | 2.0436 |
| 2.0666 | 5.4902 | 1750 | 2.0451 |
| 2.0603 | 5.5216 | 1760 | 2.0296 |
| 1.6625 | 5.5529 | 1770 | 2.0513 |
| 2.0946 | 5.5843 | 1780 | 2.0306 |
| 1.9464 | 5.6157 | 1790 | 2.0315 |
| 2.0183 | 5.6471 | 1800 | 2.0276 |
| 2.0794 | 5.6784 | 1810 | 2.0512 |
| 2.0289 | 5.7098 | 1820 | 2.0369 |
| 2.1014 | 5.7412 | 1830 | 2.0520 |
| 1.9159 | 5.7725 | 1840 | 2.0491 |
| 2.2446 | 5.8039 | 1850 | 2.0508 |
| 1.9383 | 5.8353 | 1860 | 2.0327 |
| 2.0132 | 5.8667 | 1870 | 2.0161 |
| 2.2234 | 5.8980 | 1880 | 2.0406 |
| 2.2556 | 5.9294 | 1890 | 2.0365 |
| 2.2061 | 5.9608 | 1900 | 2.0314 |
| 1.7465 | 5.9922 | 1910 | 2.0543 |
| 1.9388 | 6.0235 | 1920 | 2.0525 |
| 1.9223 | 6.0549 | 1930 | 2.0325 |
| 1.9386 | 6.0863 | 1940 | 2.0282 |
| 1.9171 | 6.1176 | 1950 | 2.0462 |
| 1.9319 | 6.1490 | 1960 | 2.0369 |
| 1.7689 | 6.1804 | 1970 | 2.0364 |
| 2.0063 | 6.2118 | 1980 | 2.0388 |
| 2.1053 | 6.2431 | 1990 | 2.0346 |
| 2.1074 | 6.2745 | 2000 | 2.0368 |

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.1
- Datasets 2.19.2
- Tokenizers 0.19.1
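
To reproduce this environment, the versions above can be pinned directly (a convenience snippet, not part of the original training setup):

```bash
pip install transformers==4.41.2 torch==2.3.1 datasets==2.19.2 tokenizers==0.19.1
```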