swin-tiny-patch4-window7-224-finetuned-bootcamp

This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8963
  • Accuracy: 0.7324

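The checkpoint can be loaded for inference like any other image-classification model on the Hub. Below is a minimal sketch using the transformers image-classification pipeline; the repository id is taken from this card, while the example image path is a placeholder, not something the card provides.

```python
from transformers import pipeline

# Load the fine-tuned Swin checkpoint as an image-classification pipeline.
# The repo id comes from this model card; "example.jpg" is a placeholder path.
classifier = pipeline(
    "image-classification",
    model="NithaRenjith/swin-tiny-patch4-window7-224-finetuned-bootcamp",
)

predictions = classifier("example.jpg")  # also accepts a PIL.Image or an image URL
for p in predictions:
    print(f"{p['label']}: {p['score']:.4f}")
```
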
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
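
For context, the values above map roughly onto the following transformers TrainingArguments. This is a hedged reconstruction rather than the exact training script: the output directory, evaluation/save strategies, and best-model selection are assumptions, not taken from the card.

```python
from transformers import TrainingArguments

# Approximate reconstruction of the hyperparameters listed above.
# Strategies and paths are assumptions; only the numeric values come from the card.
training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-bootcamp",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,   # effective train batch size: 32 * 4 = 128
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    eval_strategy="epoch",           # assumed; the card reports per-epoch evaluation
    save_strategy="epoch",
    load_best_model_at_end=True,
    metric_for_best_model="accuracy",
)
```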

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---------------|-------|------|-----------------|----------|
| No log | 0.8889 | 6 | 4.2849 | 0.0047 |
| 4.3139 | 1.9259 | 13 | 4.1846 | 0.0329 |
| 4.1651 | 2.9630 | 20 | 4.0585 | 0.0563 |
| 4.1651 | 4.0 | 27 | 3.9527 | 0.0610 |
| 3.9272 | 4.8889 | 33 | 3.8813 | 0.0610 |
| 3.7461 | 5.9259 | 40 | 3.7536 | 0.0845 |
| 3.7461 | 6.9630 | 47 | 3.6486 | 0.1080 |
| 3.5254 | 8.0 | 54 | 3.5603 | 0.1362 |
| 3.3478 | 8.8889 | 60 | 3.4566 | 0.1362 |
| 3.3478 | 9.9259 | 67 | 3.2986 | 0.1502 |
| 3.0423 | 10.9630 | 74 | 3.2166 | 0.1549 |
| 2.7931 | 12.0 | 81 | 3.0203 | 0.2160 |
| 2.7931 | 12.8889 | 87 | 2.8991 | 0.2911 |
| 2.541 | 13.9259 | 94 | 2.7941 | 0.2911 |
| 2.3487 | 14.9630 | 101 | 2.7337 | 0.2911 |
| 2.3487 | 16.0 | 108 | 2.5401 | 0.3662 |
| 2.1043 | 16.8889 | 114 | 2.5088 | 0.3803 |
| 1.8892 | 17.9259 | 121 | 2.3596 | 0.4131 |
| 1.8892 | 18.9630 | 128 | 2.3180 | 0.4178 |
| 1.7167 | 20.0 | 135 | 2.1820 | 0.4272 |
| 1.5748 | 20.8889 | 141 | 2.0547 | 0.4413 |
| 1.5748 | 21.9259 | 148 | 1.9472 | 0.4930 |
| 1.4052 | 22.9630 | 155 | 1.9053 | 0.4883 |
| 1.2535 | 24.0 | 162 | 1.8179 | 0.5117 |
| 1.2535 | 24.8889 | 168 | 1.7600 | 0.5305 |
| 1.1687 | 25.9259 | 175 | 1.6922 | 0.5493 |
| 1.0719 | 26.9630 | 182 | 1.6076 | 0.5587 |
| 1.0719 | 28.0 | 189 | 1.5316 | 0.5587 |
| 1.0577 | 28.8889 | 195 | 1.5365 | 0.5775 |
| 0.9558 | 29.9259 | 202 | 1.4488 | 0.6291 |
| 0.9558 | 30.9630 | 209 | 1.4185 | 0.6150 |
| 0.8771 | 32.0 | 216 | 1.3906 | 0.6056 |
| 0.8146 | 32.8889 | 222 | 1.3828 | 0.6150 |
| 0.8146 | 33.9259 | 229 | 1.3927 | 0.5822 |
| 0.8228 | 34.9630 | 236 | 1.3036 | 0.6385 |
| 0.6878 | 36.0 | 243 | 1.2240 | 0.6808 |
| 0.6878 | 36.8889 | 249 | 1.2388 | 0.6714 |
| 0.6471 | 37.9259 | 256 | 1.1345 | 0.6808 |
| 0.6102 | 38.9630 | 263 | 1.1815 | 0.6573 |
| 0.6599 | 40.0 | 270 | 1.1720 | 0.6526 |
| 0.6599 | 40.8889 | 276 | 1.1336 | 0.6526 |
| 0.5742 | 41.9259 | 283 | 1.0863 | 0.6714 |
| 0.5478 | 42.9630 | 290 | 1.0910 | 0.6714 |
| 0.5478 | 44.0 | 297 | 1.0746 | 0.6620 |
| 0.557 | 44.8889 | 303 | 1.0724 | 0.6808 |
| 0.5753 | 45.9259 | 310 | 1.0108 | 0.7136 |
| 0.5753 | 46.9630 | 317 | 1.1296 | 0.6432 |
| 0.5325 | 48.0 | 324 | 1.0361 | 0.6901 |
| 0.4349 | 48.8889 | 330 | 1.0237 | 0.6995 |
| 0.4349 | 49.9259 | 337 | 0.9790 | 0.7183 |
| 0.447 | 50.9630 | 344 | 1.0409 | 0.6808 |
| 0.4502 | 52.0 | 351 | 1.0467 | 0.6714 |
| 0.4502 | 52.8889 | 357 | 0.9773 | 0.7183 |
| 0.4345 | 53.9259 | 364 | 0.9931 | 0.6808 |
| 0.4557 | 54.9630 | 371 | 0.9685 | 0.7136 |
| 0.4557 | 56.0 | 378 | 0.9547 | 0.7371 |
| 0.4109 | 56.8889 | 384 | 1.0015 | 0.6948 |
| 0.4406 | 57.9259 | 391 | 0.9410 | 0.7230 |
| 0.4406 | 58.9630 | 398 | 0.9765 | 0.6808 |
| 0.4039 | 60.0 | 405 | 0.9505 | 0.7089 |
| 0.396 | 60.8889 | 411 | 0.9539 | 0.7183 |
| 0.396 | 61.9259 | 418 | 1.0391 | 0.6761 |
| 0.3958 | 62.9630 | 425 | 0.9576 | 0.7136 |
| 0.3763 | 64.0 | 432 | 0.9380 | 0.7230 |
| 0.3763 | 64.8889 | 438 | 0.9363 | 0.7277 |
| 0.3985 | 65.9259 | 445 | 0.9400 | 0.7089 |
| 0.3701 | 66.9630 | 452 | 0.9769 | 0.7183 |
| 0.3701 | 68.0 | 459 | 0.9604 | 0.7277 |
| 0.3729 | 68.8889 | 465 | 0.9883 | 0.7089 |
| 0.3958 | 69.9259 | 472 | 0.9516 | 0.7277 |
| 0.3958 | 70.9630 | 479 | 0.9252 | 0.7183 |
| 0.359 | 72.0 | 486 | 0.9196 | 0.7136 |
| 0.362 | 72.8889 | 492 | 0.9104 | 0.7230 |
| 0.362 | 73.9259 | 499 | 0.9255 | 0.7136 |
| 0.353 | 74.9630 | 506 | 0.9359 | 0.7089 |
| 0.345 | 76.0 | 513 | 0.9274 | 0.7230 |
| 0.345 | 76.8889 | 519 | 0.9206 | 0.7371 |
| 0.3414 | 77.9259 | 526 | 0.9229 | 0.7277 |
| 0.3298 | 78.9630 | 533 | 0.9102 | 0.7418 |
| 0.3394 | 80.0 | 540 | 0.8955 | 0.7512 |
| 0.3394 | 80.8889 | 546 | 0.8956 | 0.7371 |
| 0.3384 | 81.9259 | 553 | 0.8927 | 0.7277 |
| 0.3164 | 82.9630 | 560 | 0.8885 | 0.7418 |
| 0.3164 | 84.0 | 567 | 0.8941 | 0.7371 |
| 0.3055 | 84.8889 | 573 | 0.8963 | 0.7418 |
| 0.3355 | 85.9259 | 580 | 0.8992 | 0.7324 |
| 0.3355 | 86.9630 | 587 | 0.8988 | 0.7324 |
| 0.3101 | 88.0 | 594 | 0.8969 | 0.7324 |
| 0.3218 | 88.8889 | 600 | 0.8963 | 0.7324 |
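
The Accuracy column above is evaluated once per epoch. One common way to produce such a metric when fine-tuning with the Trainer is a compute_metrics callback like the sketch below; the use of the evaluate library here is an assumption for illustration, not something the card confirms.

```python
import numpy as np
import evaluate

# Assumed metric setup: accuracy over the argmax of the model logits.
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```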

Framework versions

  • Transformers 4.42.3
  • Pytorch 2.3.1
  • Datasets 2.20.0
  • Tokenizers 0.19.1