---
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: sashes_model
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: train
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.9393939393939394
---

# sashes_model

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on the imagefolder dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1902
- Accuracy: 0.9394

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list for how they map to code):
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 112
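For reference, a minimal sketch of how these settings map onto `transformers.TrainingArguments`; this is an illustration, not the original training script, and `output_dir` is a placeholder:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="sashes_model",      # placeholder output path
    learning_rate=5e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,  # 16 * 4 = 64 effective train batch size
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=112,
)
```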
### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| No log        | 0.7273  | 2    | 1.1089          | 0.2909   |
| No log        | 1.8182  | 5    | 1.0961          | 0.3758   |
| No log        | 2.9091  | 8    | 1.0822          | 0.3758   |
| 1.0988        | 4.0     | 11   | 1.0480          | 0.5939   |
| 1.0988        | 4.7273  | 13   | 1.0333          | 0.6061   |
| 1.0988        | 5.8182  | 16   | 0.9881          | 0.6667   |
| 1.0988        | 6.9091  | 19   | 0.9274          | 0.6970   |
| 1.0061        | 8.0     | 22   | 0.8714          | 0.8364   |
| 1.0061        | 8.7273  | 24   | 0.8210          | 0.7879   |
| 1.0061        | 9.8182  | 27   | 0.7619          | 0.8121   |
| 0.8136        | 10.9091 | 30   | 0.6483          | 0.8667   |
| 0.8136        | 12.0    | 33   | 0.6937          | 0.7939   |
| 0.8136        | 12.7273 | 35   | 0.5885          | 0.8545   |
| 0.8136        | 13.8182 | 38   | 0.6046          | 0.8182   |
| 0.642         | 14.9091 | 41   | 0.5518          | 0.8364   |
| 0.642         | 16.0    | 44   | 0.5370          | 0.8242   |
| 0.642         | 16.7273 | 46   | 0.4765          | 0.8788   |
| 0.642         | 17.8182 | 49   | 0.4416          | 0.8606   |
| 0.5145        | 18.9091 | 52   | 0.4140          | 0.8970   |
| 0.5145        | 20.0    | 55   | 0.4007          | 0.8970   |
| 0.5145        | 20.7273 | 57   | 0.3803          | 0.8970   |
| 0.4226        | 21.8182 | 60   | 0.3167          | 0.9394   |
| 0.4226        | 22.9091 | 63   | 0.3398          | 0.9030   |
| 0.4226        | 24.0    | 66   | 0.3147          | 0.9273   |
| 0.4226        | 24.7273 | 68   | 0.3273          | 0.8970   |
| 0.3282        | 25.8182 | 71   | 0.3125          | 0.9030   |
| 0.3282        | 26.9091 | 74   | 0.2712          | 0.9212   |
| 0.3282        | 28.0    | 77   | 0.2871          | 0.9273   |
| 0.3282        | 28.7273 | 79   | 0.2534          | 0.9273   |
| 0.3076        | 29.8182 | 82   | 0.2620          | 0.9273   |
| 0.3076        | 30.9091 | 85   | 0.3845          | 0.8848   |
| 0.3076        | 32.0    | 88   | 0.2495          | 0.9273   |
| 0.3081        | 32.7273 | 90   | 0.3018          | 0.9091   |
| 0.3081        | 33.8182 | 93   | 0.2204          | 0.9455   |
| 0.3081        | 34.9091 | 96   | 0.2769          | 0.9152   |
| 0.3081        | 36.0    | 99   | 0.2261          | 0.9394   |
| 0.2451        | 36.7273 | 101  | 0.2092          | 0.9515   |
| 0.2451        | 37.8182 | 104  | 0.3196          | 0.8727   |
| 0.2451        | 38.9091 | 107  | 0.2629          | 0.9091   |
| 0.2741        | 40.0    | 110  | 0.2360          | 0.9333   |
| 0.2741        | 40.7273 | 112  | 0.1927          | 0.9515   |
| 0.2741        | 41.8182 | 115  | 0.2834          | 0.9030   |
| 0.2741        | 42.9091 | 118  | 0.2173          | 0.9394   |
| 0.244         | 44.0    | 121  | 0.1997          | 0.9394   |
| 0.244         | 44.7273 | 123  | 0.2163          | 0.9273   |
| 0.244         | 45.8182 | 126  | 0.2865          | 0.8970   |
| 0.244         | 46.9091 | 129  | 0.2483          | 0.9152   |
| 0.224         | 48.0    | 132  | 0.1707          | 0.9576   |
| 0.224         | 48.7273 | 134  | 0.1988          | 0.9455   |
| 0.224         | 49.8182 | 137  | 0.2168          | 0.9455   |
| 0.213         | 50.9091 | 140  | 0.1807          | 0.9576   |
| 0.213         | 52.0    | 143  | 0.2478          | 0.9152   |
| 0.213         | 52.7273 | 145  | 0.1975          | 0.9455   |
| 0.213         | 53.8182 | 148  | 0.2218          | 0.9212   |
| 0.2298        | 54.9091 | 151  | 0.2046          | 0.9455   |
| 0.2298        | 56.0    | 154  | 0.2557          | 0.9152   |
| 0.2298        | 56.7273 | 156  | 0.1962          | 0.9394   |
| 0.2298        | 57.8182 | 159  | 0.1879          | 0.9394   |
| 0.2189        | 58.9091 | 162  | 0.1983          | 0.9576   |
| 0.2189        | 60.0    | 165  | 0.1285          | 0.9697   |
| 0.2189        | 60.7273 | 167  | 0.2227          | 0.9212   |
| 0.211         | 61.8182 | 170  | 0.1671          | 0.9515   |
| 0.211         | 62.9091 | 173  | 0.1489          | 0.9636   |
| 0.211         | 64.0    | 176  | 0.1842          | 0.9394   |
| 0.211         | 64.7273 | 178  | 0.1687          | 0.9636   |
| 0.1834        | 65.8182 | 181  | 0.2118          | 0.9091   |
| 0.1834        | 66.9091 | 184  | 0.2191          | 0.9273   |
| 0.1834        | 68.0    | 187  | 0.2014          | 0.9273   |
| 0.1834        | 68.7273 | 189  | 0.1861          | 0.9515   |
| 0.1846        | 69.8182 | 192  | 0.1309          | 0.9758   |
| 0.1846        | 70.9091 | 195  | 0.1236          | 0.9636   |
| 0.1846        | 72.0    | 198  | 0.1541          | 0.9455   |
| 0.1581        | 72.7273 | 200  | 0.1577          | 0.9576   |
| 0.1581        | 73.8182 | 203  | 0.1927          | 0.9273   |
| 0.1581        | 74.9091 | 206  | 0.2247          | 0.9273   |
| 0.1581        | 76.0    | 209  | 0.1811          | 0.9576   |
| 0.1742        | 76.7273 | 211  | 0.2190          | 0.9273   |
| 0.1742        | 77.8182 | 214  | 0.1487          | 0.9697   |
| 0.1742        | 78.9091 | 217  | 0.1836          | 0.9576   |
| 0.1837        | 80.0    | 220  | 0.1228          | 0.9758   |
| 0.1837        | 80.7273 | 222  | 0.1400          | 0.9636   |
| 0.1837        | 81.4545 | 224  | 0.1902          | 0.9394   |

### Framework versions

- Transformers 4.42.3
- Pytorch 2.3.1+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
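## How to use

A minimal usage sketch with the `transformers` image-classification pipeline. `"path/to/sashes_model"` is a placeholder for wherever the fine-tuned checkpoint lives (a local directory or a Hub repo id), and `"example.jpg"` is a placeholder input image:

```python
from transformers import pipeline

# Placeholder checkpoint path: replace with the actual local directory
# or Hub repo id that holds the fine-tuned weights.
classifier = pipeline("image-classification", model="path/to/sashes_model")

# Returns a list of {"label": ..., "score": ...} dicts for the image.
predictions = classifier("example.jpg")
print(predictions)
```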