---
library_name: transformers
license: apache-2.0
base_model: Melo1512/vit-msn-small-wbc-blur-detector
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: vit-msn-small-wbc-classifier-100
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: test
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.9271936241481539
---

# vit-msn-small-wbc-classifier-100

This model is a fine-tuned version of [Melo1512/vit-msn-small-wbc-blur-detector](https://huggingface.co/Melo1512/vit-msn-small-wbc-blur-detector) on the imagefolder dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the metrics):

- Loss: 0.2005
- Accuracy: 0.9272
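
The snippet below is a minimal inference sketch. It assumes the checkpoint is published under the repo id `Melo1512/vit-msn-small-wbc-classifier-100` (taken from the model-index name) and that the standard `transformers` image-classification API applies; the image path is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "Melo1512/vit-msn-small-wbc-classifier-100"  # assumed repo id
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("example_wbc_image.png").convert("RGB")  # placeholder path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```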

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):

- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
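
The configuration below is a sketch of how these values map onto `TrainingArguments` for the Hugging Face `Trainer`. The original training script is not included here, so `output_dir`, the evaluation/save strategy, and best-checkpoint selection are assumptions.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above; strategy and
# checkpoint-selection settings are assumptions, not taken from the source.
training_args = TrainingArguments(
    output_dir="vit-msn-small-wbc-classifier-100",  # placeholder
    learning_rate=5e-05,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=4,   # 64 x 4 = effective batch size 256 (single device assumed)
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    eval_strategy="epoch",           # assumption
    save_strategy="epoch",           # assumption
    load_best_model_at_end=True,     # assumption; reported metrics match the epoch-1 checkpoint
    metric_for_best_model="accuracy",
)
```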

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.2356 | 1.0 | 208 | 0.2005 | 0.9272 |
| 0.2305 | 2.0 | 416 | 0.2259 | 0.9195 |
| 0.246 | 3.0 | 624 | 0.2097 | 0.9210 |
| 0.2585 | 4.0 | 832 | 0.2184 | 0.9180 |
| 0.2593 | 5.0 | 1040 | 0.2331 | 0.9171 |
| 0.2483 | 6.0 | 1248 | 0.2170 | 0.9198 |
| 0.268 | 7.0 | 1456 | 0.2228 | 0.9181 |
| 0.3112 | 8.0 | 1664 | 0.2361 | 0.9171 |
| 0.2679 | 9.0 | 1872 | 0.2273 | 0.9185 |
| 0.3099 | 10.0 | 2080 | 0.2303 | 0.9144 |
| 0.2749 | 11.0 | 2288 | 0.2658 | 0.9125 |
| 0.2475 | 12.0 | 2496 | 0.2247 | 0.9179 |
| 0.2338 | 13.0 | 2704 | 0.2333 | 0.9139 |
| 0.2731 | 14.0 | 2912 | 0.2295 | 0.9153 |
| 0.229 | 15.0 | 3120 | 0.2505 | 0.9138 |
| 0.2462 | 16.0 | 3328 | 0.2534 | 0.9137 |
| 0.2274 | 17.0 | 3536 | 0.2652 | 0.9079 |
| 0.2339 | 18.0 | 3744 | 0.2550 | 0.9153 |
| 0.2053 | 19.0 | 3952 | 0.2819 | 0.9106 |
| 0.2063 | 20.0 | 4160 | 0.2747 | 0.9129 |
| 0.1964 | 21.0 | 4368 | 0.2975 | 0.9118 |
| 0.1953 | 22.0 | 4576 | 0.2799 | 0.9145 |
| 0.1938 | 23.0 | 4784 | 0.3197 | 0.9100 |
| 0.1851 | 24.0 | 4992 | 0.3143 | 0.9138 |
| 0.1931 | 25.0 | 5200 | 0.3331 | 0.9125 |
| 0.1877 | 26.0 | 5408 | 0.3044 | 0.9110 |
| 0.177 | 27.0 | 5616 | 0.3271 | 0.9109 |
| 0.1529 | 28.0 | 5824 | 0.3382 | 0.9094 |
| 0.1684 | 29.0 | 6032 | 0.3415 | 0.9128 |
| 0.176 | 30.0 | 6240 | 0.3463 | 0.9095 |
| 0.1496 | 31.0 | 6448 | 0.3952 | 0.9136 |
| 0.1509 | 32.0 | 6656 | 0.3690 | 0.9121 |
| 0.1463 | 33.0 | 6864 | 0.3999 | 0.9094 |
| 0.1354 | 34.0 | 7072 | 0.3996 | 0.9135 |
| 0.1546 | 35.0 | 7280 | 0.3810 | 0.9116 |
| 0.1513 | 36.0 | 7488 | 0.3992 | 0.9121 |
| 0.115 | 37.0 | 7696 | 0.4295 | 0.9132 |
| 0.1479 | 38.0 | 7904 | 0.4363 | 0.9123 |
| 0.1455 | 39.0 | 8112 | 0.4220 | 0.9140 |
| 0.1353 | 40.0 | 8320 | 0.4112 | 0.9127 |
| 0.141 | 41.0 | 8528 | 0.4322 | 0.9139 |
| 0.1272 | 42.0 | 8736 | 0.4176 | 0.9119 |
| 0.1402 | 43.0 | 8944 | 0.4041 | 0.9108 |
| 0.1236 | 44.0 | 9152 | 0.4478 | 0.9095 |
| 0.1349 | 45.0 | 9360 | 0.4211 | 0.9112 |
| 0.1472 | 46.0 | 9568 | 0.4510 | 0.9113 |
| 0.1115 | 47.0 | 9776 | 0.4373 | 0.9119 |
| 0.1122 | 48.0 | 9984 | 0.4689 | 0.9129 |
| 0.1297 | 49.0 | 10192 | 0.4569 | 0.9140 |
| 0.1337 | 50.0 | 10400 | 0.4622 | 0.9111 |
| 0.1194 | 51.0 | 10608 | 0.4579 | 0.9151 |
| 0.1322 | 52.0 | 10816 | 0.4728 | 0.9104 |
| 0.1179 | 53.0 | 11024 | 0.4729 | 0.9125 |
| 0.1216 | 54.0 | 11232 | 0.5199 | 0.9114 |
| 0.1234 | 55.0 | 11440 | 0.4769 | 0.9135 |
| 0.1125 | 56.0 | 11648 | 0.4871 | 0.9118 |
| 0.1234 | 57.0 | 11856 | 0.4667 | 0.9146 |
| 0.1103 | 58.0 | 12064 | 0.4741 | 0.9119 |
| 0.1103 | 59.0 | 12272 | 0.4864 | 0.9129 |
| 0.1222 | 60.0 | 12480 | 0.4550 | 0.9143 |
| 0.127 | 61.0 | 12688 | 0.4919 | 0.9135 |
| 0.1117 | 62.0 | 12896 | 0.4946 | 0.9139 |
| 0.1078 | 63.0 | 13104 | 0.5040 | 0.9133 |
| 0.1127 | 64.0 | 13312 | 0.4804 | 0.9126 |
| 0.1122 | 65.0 | 13520 | 0.4997 | 0.9136 |
| 0.1089 | 66.0 | 13728 | 0.5134 | 0.9139 |
| 0.1179 | 67.0 | 13936 | 0.5246 | 0.9155 |
| 0.0934 | 68.0 | 14144 | 0.5158 | 0.9126 |
| 0.1011 | 69.0 | 14352 | 0.5361 | 0.9140 |
| 0.1063 | 70.0 | 14560 | 0.5326 | 0.9135 |
| 0.1021 | 71.0 | 14768 | 0.5151 | 0.9143 |
| 0.1007 | 72.0 | 14976 | 0.5390 | 0.9143 |
| 0.0946 | 73.0 | 15184 | 0.5256 | 0.9114 |
| 0.097 | 74.0 | 15392 | 0.5247 | 0.9135 |
| 0.0967 | 75.0 | 15600 | 0.5154 | 0.9144 |
| 0.0985 | 76.0 | 15808 | 0.5412 | 0.9154 |
| 0.0856 | 77.0 | 16016 | 0.5335 | 0.9148 |
| 0.103 | 78.0 | 16224 | 0.5210 | 0.9162 |
| 0.1033 | 79.0 | 16432 | 0.5165 | 0.9156 |
| 0.109 | 80.0 | 16640 | 0.5303 | 0.9150 |
| 0.0999 | 81.0 | 16848 | 0.5299 | 0.9158 |
| 0.0966 | 82.0 | 17056 | 0.5324 | 0.9167 |
| 0.0952 | 83.0 | 17264 | 0.5229 | 0.9168 |
| 0.1071 | 84.0 | 17472 | 0.5303 | 0.9176 |
| 0.0899 | 85.0 | 17680 | 0.5228 | 0.9160 |
| 0.0868 | 86.0 | 17888 | 0.5297 | 0.9149 |
| 0.1011 | 87.0 | 18096 | 0.5370 | 0.9156 |
| 0.0867 | 88.0 | 18304 | 0.5430 | 0.9158 |
| 0.0936 | 89.0 | 18512 | 0.5346 | 0.9165 |
| 0.0929 | 90.0 | 18720 | 0.5387 | 0.9163 |
| 0.0792 | 91.0 | 18928 | 0.5459 | 0.9150 |
| 0.0918 | 92.0 | 19136 | 0.5257 | 0.9165 |
| 0.0853 | 93.0 | 19344 | 0.5426 | 0.9155 |
| 0.0908 | 94.0 | 19552 | 0.5429 | 0.9153 |
| 0.0981 | 95.0 | 19760 | 0.5394 | 0.9155 |
| 0.0825 | 96.0 | 19968 | 0.5345 | 0.9168 |
| 0.0849 | 97.0 | 20176 | 0.5388 | 0.9164 |
| 0.0992 | 98.0 | 20384 | 0.5357 | 0.9168 |
| 0.0909 | 99.0 | 20592 | 0.5375 | 0.9167 |
| 0.0861 | 100.0 | 20800 | 0.5372 | 0.9166 |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.2.0
- Tokenizers 0.19.1
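
To check that a local environment matches these versions, a quick Python snippet along the following lines should suffice (the `+cu121` suffix on the PyTorch version depends on your local CUDA build):

```python
import transformers, torch, datasets, tokenizers

# Expected versions from the list above.
print("Transformers:", transformers.__version__)  # 4.44.2
print("PyTorch:", torch.__version__)              # 2.4.1+cu121
print("Datasets:", datasets.__version__)          # 3.2.0
print("Tokenizers:", tokenizers.__version__)      # 0.19.1
```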