---
license: apache-2.0
base_model: microsoft/swin-tiny-patch4-window7-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: swin-tiny-patch4-window7-224-finetuned-st-ucihar
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.8363553943789664
---

# swin-tiny-patch4-window7-224-finetuned-st-ucihar

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.7803
- Accuracy: 0.8364
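
For quick experimentation, the model can be loaded through the Transformers `image-classification` pipeline. Below is a minimal sketch, not a verified recipe: the repo id is inferred from this card's title, and the input file name is a placeholder, so adjust both to your setup.

```python
from transformers import pipeline

# Minimal inference sketch. The repo id is inferred from this card's title
# and may need to be adjusted to the checkpoint's actual Hub location.
classifier = pipeline(
    "image-classification",
    model="ayubkfupm/swin-tiny-patch4-window7-224-finetuned-st-ucihar",
)

# "sample.png" is a placeholder; pass any image in the training data's format.
for prediction in classifier("sample.png"):
    print(f"{prediction['label']}: {prediction['score']:.4f}")
```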

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 200
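
These settings map directly onto `transformers.TrainingArguments`. The sketch below mirrors the list above, assuming a single training device (so 32 × 4 accumulation steps gives the total train batch size of 128); `output_dir` is illustrative, and the Adam betas and epsilon listed above are the `Trainer` optimizer defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; output_dir is illustrative.
# betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer's optimizer defaults.
training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-st-ucihar",
    learning_rate=5e-5,
    per_device_train_batch_size=32,  # 32 * 4 accumulation steps = 128 total
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=200,
    eval_strategy="epoch",           # assumption: the table reports per-epoch eval
)
```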

### Training results

| Training Loss | Epoch    | Step | Validation Loss | Accuracy |
|:-------------:|:--------:|:----:|:---------------:|:--------:|
| 1.6826 | 0.9938 | 40 | 1.6510 | 0.3504 |
| 1.3306 | 1.9876 | 80 | 1.2070 | 0.4710 |
| 1.0249 | 2.9814 | 120 | 0.9149 | 0.5512 |
| 0.8922 | 4.0 | 161 | 0.7313 | 0.6587 |
| 0.791 | 4.9938 | 201 | 0.6777 | 0.6881 |
| 0.7267 | 5.9876 | 241 | 0.6398 | 0.7026 |
| 0.7106 | 6.9814 | 281 | 0.6299 | 0.7035 |
| 0.7038 | 8.0 | 322 | 0.6062 | 0.7058 |
| 0.6694 | 8.9938 | 362 | 0.5736 | 0.7226 |
| 0.6489 | 9.9876 | 402 | 0.5858 | 0.7248 |
| 0.6587 | 10.9814 | 442 | 0.5437 | 0.7434 |
| 0.6376 | 12.0 | 483 | 0.5259 | 0.7520 |
| 0.6494 | 12.9938 | 523 | 0.5807 | 0.7307 |
| 0.6236 | 13.9876 | 563 | 0.5236 | 0.7430 |
| 0.5931 | 14.9814 | 603 | 0.5463 | 0.7384 |
| 0.605 | 16.0 | 644 | 0.5360 | 0.7371 |
| 0.5422 | 16.9938 | 684 | 0.4954 | 0.7788 |
| 0.5485 | 17.9876 | 724 | 0.4910 | 0.7756 |
| 0.5586 | 18.9814 | 764 | 0.5052 | 0.7738 |
| 0.6011 | 20.0 | 805 | 0.5036 | 0.7679 |
| 0.5726 | 20.9938 | 845 | 0.5308 | 0.7588 |
| 0.514 | 21.9876 | 885 | 0.5064 | 0.7720 |
| 0.5431 | 22.9814 | 925 | 0.4680 | 0.7892 |
| 0.5348 | 24.0 | 966 | 0.6083 | 0.7430 |
| 0.532 | 24.9938 | 1006 | 0.4889 | 0.7815 |
| 0.5263 | 25.9876 | 1046 | 0.5268 | 0.7634 |
| 0.497 | 26.9814 | 1086 | 0.5057 | 0.7779 |
| 0.4929 | 28.0 | 1127 | 0.5560 | 0.7570 |
| 0.4828 | 28.9938 | 1167 | 0.4701 | 0.7928 |
| 0.4604 | 29.9876 | 1207 | 0.4656 | 0.8001 |
| 0.4472 | 30.9814 | 1247 | 0.4619 | 0.8083 |
| 0.455 | 32.0 | 1288 | 0.4920 | 0.7969 |
| 0.4467 | 32.9938 | 1328 | 0.4698 | 0.7987 |
| 0.4327 | 33.9876 | 1368 | 0.4489 | 0.8055 |
| 0.3918 | 34.9814 | 1408 | 0.5249 | 0.7906 |
| 0.4222 | 36.0 | 1449 | 0.4380 | 0.8069 |
| 0.3864 | 36.9938 | 1489 | 0.4695 | 0.8128 |
| 0.3865 | 37.9876 | 1529 | 0.5208 | 0.7811 |
| 0.3759 | 38.9814 | 1569 | 0.4972 | 0.8019 |
| 0.375 | 40.0 | 1610 | 0.4667 | 0.8105 |
| 0.4016 | 40.9938 | 1650 | 0.4632 | 0.8178 |
| 0.3788 | 41.9876 | 1690 | 0.4670 | 0.8078 |
| 0.3517 | 42.9814 | 1730 | 0.4745 | 0.8101 |
| 0.3707 | 44.0 | 1771 | 0.4526 | 0.8187 |
| 0.3422 | 44.9938 | 1811 | 0.4893 | 0.8005 |
| 0.3661 | 45.9876 | 1851 | 0.4748 | 0.8087 |
| 0.3522 | 46.9814 | 1891 | 0.4923 | 0.7992 |
| 0.385 | 48.0 | 1932 | 0.4335 | 0.8273 |
| 0.306 | 48.9938 | 1972 | 0.4823 | 0.8228 |
| 0.3163 | 49.9876 | 2012 | 0.5157 | 0.8069 |
| 0.3199 | 50.9814 | 2052 | 0.5283 | 0.8015 |
| 0.3248 | 52.0 | 2093 | 0.4652 | 0.8191 |
| 0.2969 | 52.9938 | 2133 | 0.5260 | 0.8024 |
| 0.3138 | 53.9876 | 2173 | 0.4862 | 0.8069 |
| 0.2665 | 54.9814 | 2213 | 0.4781 | 0.8250 |
| 0.2932 | 56.0 | 2254 | 0.5094 | 0.8214 |
| 0.2819 | 56.9938 | 2294 | 0.4921 | 0.8178 |
| 0.2677 | 57.9876 | 2334 | 0.5372 | 0.8087 |
| 0.2623 | 58.9814 | 2374 | 0.4847 | 0.8286 |
| 0.2584 | 60.0 | 2415 | 0.5754 | 0.8069 |
| 0.2637 | 60.9938 | 2455 | 0.5297 | 0.8182 |
| 0.2391 | 61.9876 | 2495 | 0.5187 | 0.8214 |
| 0.2426 | 62.9814 | 2535 | 0.5719 | 0.8137 |
| 0.2405 | 64.0 | 2576 | 0.5118 | 0.8232 |
| 0.2132 | 64.9938 | 2616 | 0.5691 | 0.8123 |
| 0.2572 | 65.9876 | 2656 | 0.5452 | 0.8209 |
| 0.2255 | 66.9814 | 2696 | 0.5650 | 0.8073 |
| 0.2614 | 68.0 | 2737 | 0.5387 | 0.8214 |
| 0.2284 | 68.9938 | 2777 | 0.6056 | 0.8141 |
| 0.2371 | 69.9876 | 2817 | 0.5906 | 0.8128 |
| 0.2089 | 70.9814 | 2857 | 0.5550 | 0.8119 |
| 0.2276 | 72.0 | 2898 | 0.5511 | 0.8214 |
| 0.2192 | 72.9938 | 2938 | 0.6162 | 0.8259 |
| 0.2076 | 73.9876 | 2978 | 0.5663 | 0.8237 |
| 0.1938 | 74.9814 | 3018 | 0.6118 | 0.8191 |
| 0.2274 | 76.0 | 3059 | 0.5603 | 0.8268 |
| 0.2271 | 76.9938 | 3099 | 0.6312 | 0.8128 |
| 0.2023 | 77.9876 | 3139 | 0.6300 | 0.8123 |
| 0.1792 | 78.9814 | 3179 | 0.5776 | 0.8268 |
| 0.1796 | 80.0 | 3220 | 0.6266 | 0.8209 |
| 0.1994 | 80.9938 | 3260 | 0.5468 | 0.8228 |
| 0.1857 | 81.9876 | 3300 | 0.6080 | 0.8205 |
| 0.1636 | 82.9814 | 3340 | 0.7066 | 0.8160 |
| 0.1665 | 84.0 | 3381 | 0.6064 | 0.8277 |
| 0.183 | 84.9938 | 3421 | 0.6019 | 0.8273 |
| 0.1761 | 85.9876 | 3461 | 0.6420 | 0.8196 |
| 0.1673 | 86.9814 | 3501 | 0.6287 | 0.8255 |
| 0.1946 | 88.0 | 3542 | 0.6024 | 0.8228 |
| 0.1511 | 88.9938 | 3582 | 0.6774 | 0.8169 |
| 0.1828 | 89.9876 | 3622 | 0.6015 | 0.8255 |
| 0.1758 | 90.9814 | 3662 | 0.5969 | 0.8300 |
| 0.1797 | 92.0 | 3703 | 0.6464 | 0.8200 |
| 0.176 | 92.9938 | 3743 | 0.6287 | 0.8173 |
| 0.1616 | 93.9876 | 3783 | 0.6914 | 0.8209 |
| 0.1783 | 94.9814 | 3823 | 0.6511 | 0.8218 |
| 0.1492 | 96.0 | 3864 | 0.6382 | 0.8264 |
| 0.1578 | 96.9938 | 3904 | 0.6391 | 0.8241 |
| 0.1574 | 97.9876 | 3944 | 0.6505 | 0.8255 |
| 0.1556 | 98.9814 | 3984 | 0.6302 | 0.8241 |
| 0.1396 | 100.0 | 4025 | 0.6634 | 0.8155 |
| 0.1246 | 100.9938 | 4065 | 0.6633 | 0.8264 |
| 0.1592 | 101.9876 | 4105 | 0.6815 | 0.8160 |
| 0.1393 | 102.9814 | 4145 | 0.6418 | 0.8237 |
| 0.1722 | 104.0 | 4186 | 0.6322 | 0.8318 |
| 0.1499 | 104.9938 | 4226 | 0.6901 | 0.8196 |
| 0.1282 | 105.9876 | 4266 | 0.6544 | 0.8309 |
| 0.1428 | 106.9814 | 4306 | 0.6581 | 0.8291 |
| 0.1478 | 108.0 | 4347 | 0.6825 | 0.8291 |
| 0.1453 | 108.9938 | 4387 | 0.6873 | 0.8237 |
| 0.1216 | 109.9876 | 4427 | 0.7075 | 0.8223 |
| 0.1449 | 110.9814 | 4467 | 0.6929 | 0.8232 |
| 0.137 | 112.0 | 4508 | 0.7139 | 0.8205 |
| 0.1177 | 112.9938 | 4548 | 0.6981 | 0.8305 |
| 0.1005 | 113.9876 | 4588 | 0.6840 | 0.8205 |
| 0.1305 | 114.9814 | 4628 | 0.6747 | 0.8273 |
| 0.1192 | 116.0 | 4669 | 0.6886 | 0.8259 |
| 0.1067 | 116.9938 | 4709 | 0.6612 | 0.8209 |
| 0.1122 | 117.9876 | 4749 | 0.6500 | 0.8259 |
| 0.1295 | 118.9814 | 4789 | 0.6948 | 0.8232 |
| 0.1304 | 120.0 | 4830 | 0.6651 | 0.8309 |
| 0.1334 | 120.9938 | 4870 | 0.7304 | 0.8187 |
| 0.1104 | 121.9876 | 4910 | 0.7365 | 0.8205 |
| 0.1132 | 122.9814 | 4950 | 0.7270 | 0.8300 |
| 0.1115 | 124.0 | 4991 | 0.7062 | 0.8228 |
| 0.1079 | 124.9938 | 5031 | 0.7579 | 0.8268 |
| 0.1192 | 125.9876 | 5071 | 0.7321 | 0.8205 |
| 0.0994 | 126.9814 | 5111 | 0.7219 | 0.8291 |
| 0.111 | 128.0 | 5152 | 0.7064 | 0.8273 |
| 0.1089 | 128.9938 | 5192 | 0.7056 | 0.8282 |
| 0.1062 | 129.9876 | 5232 | 0.6814 | 0.8323 |
| 0.1046 | 130.9814 | 5272 | 0.6843 | 0.8309 |
| 0.1013 | 132.0 | 5313 | 0.6807 | 0.8327 |
| 0.0879 | 132.9938 | 5353 | 0.7080 | 0.8336 |
| 0.1114 | 133.9876 | 5393 | 0.7129 | 0.8241 |
| 0.1133 | 134.9814 | 5433 | 0.7376 | 0.8264 |
| 0.1067 | 136.0 | 5474 | 0.7579 | 0.8259 |
| 0.1104 | 136.9938 | 5514 | 0.7178 | 0.8291 |
| 0.0893 | 137.9876 | 5554 | 0.7315 | 0.8300 |
| 0.1074 | 138.9814 | 5594 | 0.7312 | 0.8318 |
| 0.0983 | 140.0 | 5635 | 0.7362 | 0.8286 |
| 0.1093 | 140.9938 | 5675 | 0.7493 | 0.8286 |
| 0.1166 | 141.9876 | 5715 | 0.7205 | 0.8286 |
| 0.0969 | 142.9814 | 5755 | 0.7494 | 0.8291 |
| 0.1174 | 144.0 | 5796 | 0.6960 | 0.8336 |
| 0.1044 | 144.9938 | 5836 | 0.7111 | 0.8282 |
| 0.0866 | 145.9876 | 5876 | 0.7152 | 0.8364 |
| 0.092 | 146.9814 | 5916 | 0.7078 | 0.8327 |
| 0.0883 | 148.0 | 5957 | 0.7182 | 0.8341 |
| 0.0824 | 148.9938 | 5997 | 0.7095 | 0.8359 |
| 0.0953 | 149.9876 | 6037 | 0.7324 | 0.8354 |
| 0.0896 | 150.9814 | 6077 | 0.7032 | 0.8400 |
| 0.1025 | 152.0 | 6118 | 0.6938 | 0.8323 |
| 0.0966 | 152.9938 | 6158 | 0.6991 | 0.8404 |
| 0.0891 | 153.9876 | 6198 | 0.7346 | 0.8354 |
| 0.0733 | 154.9814 | 6238 | 0.7340 | 0.8350 |
| 0.0944 | 156.0 | 6279 | 0.7525 | 0.8277 |
| 0.0934 | 156.9938 | 6319 | 0.7683 | 0.8305 |
| 0.0768 | 157.9876 | 6359 | 0.7692 | 0.8286 |
| 0.0918 | 158.9814 | 6399 | 0.7387 | 0.8413 |
| 0.0886 | 160.0 | 6440 | 0.7705 | 0.8327 |
| 0.0836 | 160.9938 | 6480 | 0.7491 | 0.8327 |
| 0.0968 | 161.9876 | 6520 | 0.7663 | 0.8246 |
| 0.0748 | 162.9814 | 6560 | 0.7460 | 0.8305 |
| 0.0696 | 164.0 | 6601 | 0.7491 | 0.8332 |
| 0.0853 | 164.9938 | 6641 | 0.7788 | 0.8327 |
| 0.0726 | 165.9876 | 6681 | 0.7440 | 0.8382 |
| 0.0715 | 166.9814 | 6721 | 0.7518 | 0.8373 |
| 0.0699 | 168.0 | 6762 | 0.7574 | 0.8354 |
| 0.0749 | 168.9938 | 6802 | 0.7564 | 0.8323 |
| 0.0842 | 169.9876 | 6842 | 0.7829 | 0.8286 |
| 0.0822 | 170.9814 | 6882 | 0.7753 | 0.8327 |
| 0.0807 | 172.0 | 6923 | 0.7611 | 0.8359 |
| 0.0752 | 172.9938 | 6963 | 0.7673 | 0.8345 |
| 0.075 | 173.9876 | 7003 | 0.7815 | 0.8364 |
| 0.0845 | 174.9814 | 7043 | 0.7745 | 0.8382 |
| 0.0827 | 176.0 | 7084 | 0.7683 | 0.8373 |
| 0.0883 | 176.9938 | 7124 | 0.7842 | 0.8327 |
| 0.0774 | 177.9876 | 7164 | 0.7736 | 0.8368 |
| 0.0817 | 178.9814 | 7204 | 0.7852 | 0.8341 |
| 0.0804 | 180.0 | 7245 | 0.7686 | 0.8314 |
| 0.0671 | 180.9938 | 7285 | 0.7767 | 0.8359 |
| 0.076 | 181.9876 | 7325 | 0.7715 | 0.8350 |
| 0.0572 | 182.9814 | 7365 | 0.7740 | 0.8286 |
| 0.0823 | 184.0 | 7406 | 0.7757 | 0.8341 |
| 0.0662 | 184.9938 | 7446 | 0.7720 | 0.8336 |
| 0.0805 | 185.9876 | 7486 | 0.7696 | 0.8368 |
| 0.0763 | 186.9814 | 7526 | 0.7768 | 0.8377 |
| 0.0711 | 188.0 | 7567 | 0.7720 | 0.8350 |
| 0.0576 | 188.9938 | 7607 | 0.7845 | 0.8314 |
| 0.0667 | 189.9876 | 7647 | 0.7749 | 0.8336 |
| 0.0631 | 190.9814 | 7687 | 0.7774 | 0.8350 |
| 0.0744 | 192.0 | 7728 | 0.7778 | 0.8327 |
| 0.0672 | 192.9938 | 7768 | 0.7862 | 0.8323 |
| 0.0738 | 193.9876 | 7808 | 0.7843 | 0.8345 |
| 0.0754 | 194.9814 | 7848 | 0.7850 | 0.8368 |
| 0.0887 | 196.0 | 7889 | 0.7835 | 0.8364 |
| 0.0898 | 196.9938 | 7929 | 0.7810 | 0.8373 |
| 0.0543 | 197.9876 | 7969 | 0.7801 | 0.8364 |
| 0.0605 | 198.7578 | 8000 | 0.7803 | 0.8364 |

## Framework versions

- Transformers 4.44.0
- PyTorch 1.12.1+cu113
- Datasets 2.21.0
- Tokenizers 0.19.1