---
license: apache-2.0
base_model: microsoft/swin-tiny-patch4-window7-224
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: cards_bottom_left_swin-tiny-patch4-window7-224-finetuned-dough_100_epochs
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: test
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.5946802405369663
---
# cards_bottom_left_swin-tiny-patch4-window7-224-finetuned-dough_100_epochs

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 1.0025
- Accuracy: 0.5947
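The card does not yet include a usage section; the sketch below shows one plausible way to run inference with the fine-tuned checkpoint via the `transformers` image-classification pipeline. The repo id is an assumption taken from the model name above and may differ from where the checkpoint is actually hosted.

```python
# Assumed repo id, taken from the model name in this card; adjust to the
# actual hub location (e.g. "<user>/<model-name>") before use.
MODEL_ID = "cards_bottom_left_swin-tiny-patch4-window7-224-finetuned-dough_100_epochs"


def classify(image_path: str, model_id: str = MODEL_ID):
    """Classify a single image with the fine-tuned Swin checkpoint.

    Returns a list of {"label": ..., "score": ...} dicts, highest score first.
    """
    # Deferred import so this sketch can be read/loaded without transformers.
    from transformers import pipeline

    clf = pipeline("image-classification", model=model_id)
    return clf(image_path)


if __name__ == "__main__":
    # Hypothetical input path for illustration only.
    print(classify("example.jpg"))
```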
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
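The `total_train_batch_size` above is not an independent setting: it is the per-device batch size multiplied by the gradient-accumulation steps, since gradients are accumulated over 4 forward/backward passes before each optimizer update. A minimal sketch of that arithmetic (variable names are illustrative, not from the actual training script):

```python
# Values from the hyperparameter list above.
train_batch_size = 32            # per-device train batch size
gradient_accumulation_steps = 4  # optimizer steps once per 4 micro-batches

# Effective (total) train batch size seen by each optimizer update.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # → 128
```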
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
1.6956 | 1.0 | 1252 | 1.4843 | 0.3970 |
1.5633 | 2.0 | 2504 | 1.2584 | 0.4782 |
1.5568 | 3.0 | 3756 | 1.1976 | 0.4918 |
1.4727 | 4.0 | 5009 | 1.1884 | 0.4916 |
1.468 | 5.0 | 6261 | 1.1909 | 0.4889 |
1.4663 | 6.0 | 7513 | 1.1263 | 0.5288 |
1.4409 | 7.0 | 8765 | 1.0967 | 0.5441 |
1.4329 | 8.0 | 10018 | 1.0976 | 0.5388 |
1.4842 | 9.0 | 11270 | 1.1076 | 0.5315 |
1.4253 | 10.0 | 12522 | 1.0634 | 0.5511 |
1.3888 | 11.0 | 13774 | 1.0489 | 0.5634 |
1.3681 | 12.0 | 15027 | 1.0663 | 0.5567 |
1.3802 | 13.0 | 16279 | 1.0304 | 0.5667 |
1.4016 | 14.0 | 17531 | 1.0592 | 0.5518 |
1.376 | 15.0 | 18783 | 1.0080 | 0.5776 |
1.3539 | 16.0 | 20036 | 1.0103 | 0.5742 |
1.3725 | 17.0 | 21288 | 1.0261 | 0.5636 |
1.3104 | 18.0 | 22540 | 1.0304 | 0.5686 |
1.3448 | 19.0 | 23792 | 1.0184 | 0.5687 |
1.3479 | 20.0 | 25045 | 0.9968 | 0.5809 |
1.3517 | 21.0 | 26297 | 1.1350 | 0.5182 |
1.3367 | 22.0 | 27549 | 0.9835 | 0.5867 |
1.3002 | 23.0 | 28801 | 1.0193 | 0.5736 |
1.3238 | 24.0 | 30054 | 0.9820 | 0.5875 |
1.2865 | 25.0 | 31306 | 1.0267 | 0.5617 |
1.3029 | 26.0 | 32558 | 1.0086 | 0.5730 |
1.3173 | 27.0 | 33810 | 0.9750 | 0.5924 |
1.297 | 28.0 | 35063 | 0.9851 | 0.5848 |
1.3105 | 29.0 | 36315 | 1.0306 | 0.5685 |
1.3477 | 30.0 | 37567 | 0.9977 | 0.5845 |
1.2565 | 31.0 | 38819 | 0.9900 | 0.5851 |
1.2657 | 32.0 | 40072 | 1.0137 | 0.5862 |
1.2911 | 33.0 | 41324 | 0.9947 | 0.5889 |
1.2539 | 34.0 | 42576 | 0.9821 | 0.5914 |
1.2441 | 35.0 | 43828 | 1.0296 | 0.5763 |
1.2176 | 36.0 | 45081 | 1.0350 | 0.5806 |
1.25 | 37.0 | 46333 | 1.0195 | 0.5779 |
1.2647 | 38.0 | 47585 | 1.0021 | 0.5903 |
1.2428 | 39.0 | 48837 | 1.0087 | 0.5892 |
1.2364 | 40.0 | 50090 | 1.0025 | 0.5947 |
1.2083 | 41.0 | 51342 | 1.0427 | 0.5862 |
1.2002 | 42.0 | 52594 | 1.0303 | 0.5878 |
1.2071 | 43.0 | 53846 | 1.0190 | 0.5909 |
1.1536 | 44.0 | 55099 | 1.0314 | 0.5920 |
1.2029 | 45.0 | 56351 | 1.0570 | 0.5839 |
1.2249 | 46.0 | 57603 | 1.0508 | 0.5828 |
1.1913 | 47.0 | 58855 | 1.0493 | 0.5853 |
1.1938 | 48.0 | 60108 | 1.0575 | 0.5857 |
1.1724 | 49.0 | 61360 | 1.0700 | 0.5905 |
1.1536 | 50.0 | 62612 | 1.0841 | 0.5853 |
1.1239 | 51.0 | 63864 | 1.0803 | 0.5865 |
1.1743 | 52.0 | 65117 | 1.0864 | 0.5880 |
1.1414 | 53.0 | 66369 | 1.1224 | 0.5819 |
1.1411 | 54.0 | 67621 | 1.1316 | 0.5780 |
1.1029 | 55.0 | 68873 | 1.1070 | 0.5860 |
1.1353 | 56.0 | 70126 | 1.1247 | 0.5847 |
1.1293 | 57.0 | 71378 | 1.1279 | 0.5805 |
1.1335 | 58.0 | 72630 | 1.1482 | 0.5812 |
1.1157 | 59.0 | 73882 | 1.1960 | 0.5674 |
1.0891 | 60.0 | 75135 | 1.1414 | 0.5848 |
1.1299 | 61.0 | 76387 | 1.1658 | 0.5790 |
1.0828 | 62.0 | 77639 | 1.1753 | 0.5806 |
1.0866 | 63.0 | 78891 | 1.1767 | 0.5755 |
1.0721 | 64.0 | 80144 | 1.1861 | 0.5808 |
1.0682 | 65.0 | 81396 | 1.2083 | 0.5749 |
1.0747 | 66.0 | 82648 | 1.2204 | 0.5755 |
1.0902 | 67.0 | 83900 | 1.2175 | 0.5750 |
1.0381 | 68.0 | 85153 | 1.2445 | 0.5738 |
1.049 | 69.0 | 86405 | 1.2674 | 0.5707 |
1.0501 | 70.0 | 87657 | 1.2602 | 0.5740 |
1.0117 | 71.0 | 88909 | 1.2549 | 0.5687 |
1.0179 | 72.0 | 90162 | 1.3010 | 0.5690 |
1.0788 | 73.0 | 91414 | 1.2723 | 0.5726 |
1.0234 | 74.0 | 92666 | 1.3162 | 0.5717 |
1.0325 | 75.0 | 93918 | 1.3136 | 0.5692 |
1.0079 | 76.0 | 95171 | 1.3337 | 0.5655 |
1.058 | 77.0 | 96423 | 1.3171 | 0.5719 |
0.9968 | 78.0 | 97675 | 1.3470 | 0.5693 |
1.0217 | 79.0 | 98927 | 1.3418 | 0.5733 |
1.0124 | 80.0 | 100180 | 1.3518 | 0.5700 |
0.9823 | 81.0 | 101432 | 1.3646 | 0.5700 |
0.9627 | 82.0 | 102684 | 1.3658 | 0.5686 |
0.9773 | 83.0 | 103936 | 1.3811 | 0.5674 |
0.9855 | 84.0 | 105189 | 1.4082 | 0.5638 |
0.9928 | 85.0 | 106441 | 1.3877 | 0.5612 |
1.0025 | 86.0 | 107693 | 1.3925 | 0.5653 |
0.9583 | 87.0 | 108945 | 1.4313 | 0.5625 |
0.977 | 88.0 | 110198 | 1.4153 | 0.5651 |
0.9825 | 89.0 | 111450 | 1.4426 | 0.5619 |
0.9315 | 90.0 | 112702 | 1.4376 | 0.5643 |
0.8916 | 91.0 | 113954 | 1.4630 | 0.5618 |
0.9495 | 92.0 | 115207 | 1.4501 | 0.5627 |
0.9372 | 93.0 | 116459 | 1.4606 | 0.5622 |
0.9284 | 94.0 | 117711 | 1.4725 | 0.5608 |
0.9266 | 95.0 | 118963 | 1.4680 | 0.5607 |
0.8858 | 96.0 | 120216 | 1.4705 | 0.5626 |
0.9025 | 97.0 | 121468 | 1.4818 | 0.5616 |
0.902 | 98.0 | 122720 | 1.4871 | 0.5606 |
0.8961 | 99.0 | 123972 | 1.4881 | 0.5612 |
0.9204 | 99.98 | 125200 | 1.4894 | 0.5609 |
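Note that the headline numbers at the top of this card (loss 1.0025, accuracy 0.5947) match the epoch-40 row rather than the final epoch, which suggests the best checkpoint by validation metric was kept while validation loss drifted upward afterwards. A small illustrative sketch with three rows transcribed from the table (not the full log):

```python
# A few (epoch, validation_loss, accuracy) rows copied from the table above.
rows = [
    (27, 0.9750, 0.5924),    # best validation loss
    (40, 1.0025, 0.5947),    # best accuracy
    (99.98, 1.4894, 0.5609), # final epoch
]

# Pick the checkpoint with the highest validation accuracy.
best = max(rows, key=lambda r: r[2])
print(best)  # (40, 1.0025, 0.5947)
```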
### Framework versions
- Transformers 4.31.0
- Pytorch 2.0.1+cu117
- Datasets 2.17.0
- Tokenizers 0.13.3