---
license: apache-2.0
base_model: microsoft/swin-tiny-patch4-window7-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: swin-tiny-patch4-window7-224-finetuned-ADC-3cls-0922
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: test
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.8142857142857143
---

# swin-tiny-patch4-window7-224-finetuned-ADC-3cls-0922

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.6875
- Accuracy: 0.8143
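
The card does not include usage code; below is a minimal inference sketch using the `transformers` image-classification pipeline. The repo id and image path are assumptions for illustration, so substitute the actual Hub path or a local checkpoint directory:

```python
from transformers import pipeline

# Assumed repo id; replace with the actual Hub path or a local checkpoint directory.
classifier = pipeline(
    "image-classification",
    model="Niraya666/swin-tiny-patch4-window7-224-finetuned-ADC-3cls-0922",
)

# Accepts a local file path, a URL, or a PIL.Image; returns labels with scores.
predictions = classifier("path/to/image.png")
print(predictions)
```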

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
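
The card records only that an `imagefolder` dataset was used. As a hedged sketch, such datasets are typically loaded from a class-per-folder directory layout with the `datasets` library; the path below is a placeholder, not the original data location:

```python
from datasets import load_dataset

# Placeholder path; imagefolder expects one subfolder per class label,
# e.g. data/train/<class_name>/*.png
dataset = load_dataset("imagefolder", data_dir="path/to/data")
```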

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.2
- num_epochs: 200
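
For illustration, these settings map onto `transformers.TrainingArguments` roughly as follows. This is a minimal sketch, not the original training script; `output_dir` and `evaluation_strategy` are assumptions:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-ADC-3cls-0922",  # assumed
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    gradient_accumulation_steps=4,  # 64 x 4 = 256 effective train batch size (single device)
    lr_scheduler_type="linear",
    warmup_ratio=0.2,
    num_train_epochs=200,
    evaluation_strategy="epoch",  # assumed from the per-epoch validation table below
)
```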

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 2 | 1.0694 | 0.4143 |
| No log | 2.0 | 4 | 1.0689 | 0.4143 |
| No log | 3.0 | 6 | 1.0682 | 0.4143 |
| No log | 4.0 | 8 | 1.0671 | 0.4143 |
| 1.096 | 5.0 | 10 | 1.0657 | 0.4286 |
| 1.096 | 6.0 | 12 | 1.0640 | 0.4286 |
| 1.096 | 7.0 | 14 | 1.0621 | 0.4143 |
| 1.096 | 8.0 | 16 | 1.0598 | 0.4 |
| 1.096 | 9.0 | 18 | 1.0572 | 0.4 |
| 1.0906 | 10.0 | 20 | 1.0545 | 0.4 |
| 1.0906 | 11.0 | 22 | 1.0517 | 0.4143 |
| 1.0906 | 12.0 | 24 | 1.0486 | 0.4143 |
| 1.0906 | 13.0 | 26 | 1.0453 | 0.4143 |
| 1.0906 | 14.0 | 28 | 1.0418 | 0.4143 |
| 1.0647 | 15.0 | 30 | 1.0380 | 0.4143 |
| 1.0647 | 16.0 | 32 | 1.0343 | 0.4143 |
| 1.0647 | 17.0 | 34 | 1.0307 | 0.4143 |
| 1.0647 | 18.0 | 36 | 1.0268 | 0.4286 |
| 1.0647 | 19.0 | 38 | 1.0229 | 0.4286 |
| 1.0451 | 20.0 | 40 | 1.0191 | 0.4429 |
| 1.0451 | 21.0 | 42 | 1.0153 | 0.4571 |
| 1.0451 | 22.0 | 44 | 1.0116 | 0.4714 |
| 1.0451 | 23.0 | 46 | 1.0082 | 0.4714 |
| 1.0451 | 24.0 | 48 | 1.0049 | 0.4714 |
| 1.037 | 25.0 | 50 | 1.0016 | 0.4714 |
| 1.037 | 26.0 | 52 | 0.9979 | 0.4714 |
| 1.037 | 27.0 | 54 | 0.9944 | 0.4714 |
| 1.037 | 28.0 | 56 | 0.9913 | 0.4714 |
| 1.037 | 29.0 | 58 | 0.9883 | 0.4714 |
| 1.0214 | 30.0 | 60 | 0.9847 | 0.4714 |
| 1.0214 | 31.0 | 62 | 0.9809 | 0.4571 |
| 1.0214 | 32.0 | 64 | 0.9768 | 0.4714 |
| 1.0214 | 33.0 | 66 | 0.9723 | 0.4714 |
| 1.0214 | 34.0 | 68 | 0.9671 | 0.4714 |
| 1.0181 | 35.0 | 70 | 0.9616 | 0.4714 |
| 1.0181 | 36.0 | 72 | 0.9561 | 0.4857 |
| 1.0181 | 37.0 | 74 | 0.9505 | 0.5 |
| 1.0181 | 38.0 | 76 | 0.9446 | 0.5286 |
| 1.0181 | 39.0 | 78 | 0.9388 | 0.5286 |
| 0.9646 | 40.0 | 80 | 0.9331 | 0.5286 |
| 0.9646 | 41.0 | 82 | 0.9276 | 0.5143 |
| 0.9646 | 42.0 | 84 | 0.9224 | 0.5286 |
| 0.9646 | 43.0 | 86 | 0.9172 | 0.5286 |
| 0.9646 | 44.0 | 88 | 0.9120 | 0.5286 |
| 0.946 | 45.0 | 90 | 0.9070 | 0.5143 |
| 0.946 | 46.0 | 92 | 0.9021 | 0.5286 |
| 0.946 | 47.0 | 94 | 0.8976 | 0.5429 |
| 0.946 | 48.0 | 96 | 0.8933 | 0.5429 |
| 0.946 | 49.0 | 98 | 0.8891 | 0.5714 |
| 0.9244 | 50.0 | 100 | 0.8846 | 0.5714 |
| 0.9244 | 51.0 | 102 | 0.8803 | 0.5714 |
| 0.9244 | 52.0 | 104 | 0.8759 | 0.5714 |
| 0.9244 | 53.0 | 106 | 0.8716 | 0.5714 |
| 0.9244 | 54.0 | 108 | 0.8674 | 0.5714 |
| 0.9228 | 55.0 | 110 | 0.8634 | 0.5857 |
| 0.9228 | 56.0 | 112 | 0.8598 | 0.6 |
| 0.9228 | 57.0 | 114 | 0.8562 | 0.5857 |
| 0.9228 | 58.0 | 116 | 0.8527 | 0.6 |
| 0.9228 | 59.0 | 118 | 0.8492 | 0.6 |
| 0.8956 | 60.0 | 120 | 0.8456 | 0.6143 |
| 0.8956 | 61.0 | 122 | 0.8421 | 0.6 |
| 0.8956 | 62.0 | 124 | 0.8385 | 0.6 |
| 0.8956 | 63.0 | 126 | 0.8351 | 0.6 |
| 0.8956 | 64.0 | 128 | 0.8318 | 0.6143 |
| 0.8943 | 65.0 | 130 | 0.8286 | 0.6143 |
| 0.8943 | 66.0 | 132 | 0.8255 | 0.6 |
| 0.8943 | 67.0 | 134 | 0.8223 | 0.6286 |
| 0.8943 | 68.0 | 136 | 0.8191 | 0.6429 |
| 0.8943 | 69.0 | 138 | 0.8159 | 0.6286 |
| 0.854 | 70.0 | 140 | 0.8129 | 0.6429 |
| 0.854 | 71.0 | 142 | 0.8100 | 0.6714 |
| 0.854 | 72.0 | 144 | 0.8073 | 0.6714 |
| 0.854 | 73.0 | 146 | 0.8048 | 0.6571 |
| 0.854 | 74.0 | 148 | 0.8025 | 0.6714 |
| 0.8615 | 75.0 | 150 | 0.8001 | 0.6571 |
| 0.8615 | 76.0 | 152 | 0.7976 | 0.6571 |
| 0.8615 | 77.0 | 154 | 0.7952 | 0.6571 |
| 0.8615 | 78.0 | 156 | 0.7928 | 0.6571 |
| 0.8615 | 79.0 | 158 | 0.7904 | 0.6571 |
| 0.8507 | 80.0 | 160 | 0.7882 | 0.6714 |
| 0.8507 | 81.0 | 162 | 0.7858 | 0.6714 |
| 0.8507 | 82.0 | 164 | 0.7835 | 0.6857 |
| 0.8507 | 83.0 | 166 | 0.7811 | 0.6857 |
| 0.8507 | 84.0 | 168 | 0.7788 | 0.6857 |
| 0.838 | 85.0 | 170 | 0.7765 | 0.6857 |
| 0.838 | 86.0 | 172 | 0.7743 | 0.6857 |
| 0.838 | 87.0 | 174 | 0.7723 | 0.6857 |
| 0.838 | 88.0 | 176 | 0.7703 | 0.6857 |
| 0.838 | 89.0 | 178 | 0.7684 | 0.6857 |
| 0.8245 | 90.0 | 180 | 0.7664 | 0.6857 |
| 0.8245 | 91.0 | 182 | 0.7644 | 0.6857 |
| 0.8245 | 92.0 | 184 | 0.7625 | 0.6857 |
| 0.8245 | 93.0 | 186 | 0.7606 | 0.7143 |
| 0.8245 | 94.0 | 188 | 0.7587 | 0.7143 |
| 0.8124 | 95.0 | 190 | 0.7569 | 0.7143 |
| 0.8124 | 96.0 | 192 | 0.7551 | 0.7286 |
| 0.8124 | 97.0 | 194 | 0.7533 | 0.7286 |
| 0.8124 | 98.0 | 196 | 0.7517 | 0.7286 |
| 0.8124 | 99.0 | 198 | 0.7500 | 0.7429 |
| 0.8102 | 100.0 | 200 | 0.7483 | 0.7429 |
| 0.8102 | 101.0 | 202 | 0.7465 | 0.7429 |
| 0.8102 | 102.0 | 204 | 0.7450 | 0.7429 |
| 0.8102 | 103.0 | 206 | 0.7434 | 0.7429 |
| 0.8102 | 104.0 | 208 | 0.7419 | 0.7429 |
| 0.821 | 105.0 | 210 | 0.7404 | 0.7571 |
| 0.821 | 106.0 | 212 | 0.7389 | 0.7571 |
| 0.821 | 107.0 | 214 | 0.7374 | 0.7571 |
| 0.821 | 108.0 | 216 | 0.7359 | 0.7571 |
| 0.821 | 109.0 | 218 | 0.7345 | 0.7571 |
| 0.7918 | 110.0 | 220 | 0.7330 | 0.7571 |
| 0.7918 | 111.0 | 222 | 0.7316 | 0.7571 |
| 0.7918 | 112.0 | 224 | 0.7302 | 0.7571 |
| 0.7918 | 113.0 | 226 | 0.7289 | 0.7571 |
| 0.7918 | 114.0 | 228 | 0.7275 | 0.7571 |
| 0.8063 | 115.0 | 230 | 0.7262 | 0.7714 |
| 0.8063 | 116.0 | 232 | 0.7247 | 0.7714 |
| 0.8063 | 117.0 | 234 | 0.7232 | 0.7571 |
| 0.8063 | 118.0 | 236 | 0.7218 | 0.7571 |
| 0.8063 | 119.0 | 238 | 0.7204 | 0.7571 |
| 0.7897 | 120.0 | 240 | 0.7192 | 0.7571 |
| 0.7897 | 121.0 | 242 | 0.7180 | 0.7571 |
| 0.7897 | 122.0 | 244 | 0.7168 | 0.7571 |
| 0.7897 | 123.0 | 246 | 0.7158 | 0.7571 |
| 0.7897 | 124.0 | 248 | 0.7149 | 0.7714 |
| 0.7845 | 125.0 | 250 | 0.7140 | 0.7571 |
| 0.7845 | 126.0 | 252 | 0.7131 | 0.7571 |
| 0.7845 | 127.0 | 254 | 0.7121 | 0.7571 |
| 0.7845 | 128.0 | 256 | 0.7110 | 0.7571 |
| 0.7845 | 129.0 | 258 | 0.7099 | 0.7571 |
| 0.7781 | 130.0 | 260 | 0.7088 | 0.7571 |
| 0.7781 | 131.0 | 262 | 0.7076 | 0.7571 |
| 0.7781 | 132.0 | 264 | 0.7066 | 0.7571 |
| 0.7781 | 133.0 | 266 | 0.7055 | 0.7571 |
| 0.7781 | 134.0 | 268 | 0.7045 | 0.7714 |
| 0.7708 | 135.0 | 270 | 0.7034 | 0.7714 |
| 0.7708 | 136.0 | 272 | 0.7025 | 0.7571 |
| 0.7708 | 137.0 | 274 | 0.7016 | 0.7571 |
| 0.7708 | 138.0 | 276 | 0.7008 | 0.7571 |
| 0.7708 | 139.0 | 278 | 0.6999 | 0.7571 |
| 0.797 | 140.0 | 280 | 0.6990 | 0.7571 |
| 0.797 | 141.0 | 282 | 0.6981 | 0.7714 |
| 0.797 | 142.0 | 284 | 0.6973 | 0.7714 |
| 0.797 | 143.0 | 286 | 0.6966 | 0.7714 |
| 0.797 | 144.0 | 288 | 0.6959 | 0.7714 |
| 0.7768 | 145.0 | 290 | 0.6952 | 0.7714 |
| 0.7768 | 146.0 | 292 | 0.6944 | 0.7714 |
| 0.7768 | 147.0 | 294 | 0.6936 | 0.7714 |
| 0.7768 | 148.0 | 296 | 0.6928 | 0.7857 |
| 0.7768 | 149.0 | 298 | 0.6920 | 0.7857 |
| 0.7569 | 150.0 | 300 | 0.6912 | 0.7857 |
| 0.7569 | 151.0 | 302 | 0.6904 | 0.8 |
| 0.7569 | 152.0 | 304 | 0.6897 | 0.8 |
| 0.7569 | 153.0 | 306 | 0.6890 | 0.8 |
| 0.7569 | 154.0 | 308 | 0.6882 | 0.8 |
| 0.7807 | 155.0 | 310 | 0.6875 | 0.8143 |
| 0.7807 | 156.0 | 312 | 0.6868 | 0.8143 |
| 0.7807 | 157.0 | 314 | 0.6861 | 0.8143 |
| 0.7807 | 158.0 | 316 | 0.6854 | 0.8143 |
| 0.7807 | 159.0 | 318 | 0.6848 | 0.8143 |
| 0.7472 | 160.0 | 320 | 0.6842 | 0.8143 |
| 0.7472 | 161.0 | 322 | 0.6836 | 0.8143 |
| 0.7472 | 162.0 | 324 | 0.6831 | 0.8143 |
| 0.7472 | 163.0 | 326 | 0.6826 | 0.8143 |
| 0.7472 | 164.0 | 328 | 0.6822 | 0.8143 |
| 0.7665 | 165.0 | 330 | 0.6818 | 0.8 |
| 0.7665 | 166.0 | 332 | 0.6814 | 0.8 |
| 0.7665 | 167.0 | 334 | 0.6810 | 0.8 |
| 0.7665 | 168.0 | 336 | 0.6807 | 0.7857 |
| 0.7665 | 169.0 | 338 | 0.6803 | 0.7857 |
| 0.7684 | 170.0 | 340 | 0.6800 | 0.7857 |
| 0.7684 | 171.0 | 342 | 0.6797 | 0.7857 |
| 0.7684 | 172.0 | 344 | 0.6794 | 0.7857 |
| 0.7684 | 173.0 | 346 | 0.6790 | 0.7857 |
| 0.7684 | 174.0 | 348 | 0.6787 | 0.7857 |
| 0.7459 | 175.0 | 350 | 0.6784 | 0.7857 |
| 0.7459 | 176.0 | 352 | 0.6781 | 0.7857 |
| 0.7459 | 177.0 | 354 | 0.6778 | 0.7857 |
| 0.7459 | 178.0 | 356 | 0.6775 | 0.7857 |
| 0.7459 | 179.0 | 358 | 0.6772 | 0.7857 |
| 0.742 | 180.0 | 360 | 0.6769 | 0.7857 |
| 0.742 | 181.0 | 362 | 0.6766 | 0.7857 |
| 0.742 | 182.0 | 364 | 0.6764 | 0.7857 |
| 0.742 | 183.0 | 366 | 0.6762 | 0.7857 |
| 0.742 | 184.0 | 368 | 0.6760 | 0.7857 |
| 0.7642 | 185.0 | 370 | 0.6758 | 0.7857 |
| 0.7642 | 186.0 | 372 | 0.6756 | 0.7857 |
| 0.7642 | 187.0 | 374 | 0.6754 | 0.7857 |
| 0.7642 | 188.0 | 376 | 0.6752 | 0.7857 |
| 0.7642 | 189.0 | 378 | 0.6750 | 0.7857 |
| 0.7277 | 190.0 | 380 | 0.6749 | 0.7857 |
| 0.7277 | 191.0 | 382 | 0.6748 | 0.7857 |
| 0.7277 | 192.0 | 384 | 0.6746 | 0.7857 |
| 0.7277 | 193.0 | 386 | 0.6745 | 0.7857 |
| 0.7277 | 194.0 | 388 | 0.6745 | 0.7857 |
| 0.764 | 195.0 | 390 | 0.6744 | 0.7857 |
| 0.764 | 196.0 | 392 | 0.6743 | 0.7857 |
| 0.764 | 197.0 | 394 | 0.6742 | 0.7857 |
| 0.764 | 198.0 | 396 | 0.6742 | 0.8 |
| 0.764 | 199.0 | 398 | 0.6742 | 0.8 |
| 0.7444 | 200.0 | 400 | 0.6742 | 0.8 |

### Framework versions

- Transformers 4.33.2
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3