---
license: apache-2.0
base_model: microsoft/resnet-50
tags:
- generated_from_trainer
datasets:
- stanford-dogs
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: microsoft-resnet-50-batch32-lr0.005-standford-dogs
results:
- task:
name: Image Classification
type: image-classification
dataset:
name: stanford-dogs
type: stanford-dogs
config: default
split: full
args: default
metrics:
- name: Accuracy
type: accuracy
value: 0.82555879494655
- name: F1
type: f1
value: 0.8098053489000772
- name: Precision
type: precision
value: 0.8426096100022951
- name: Recall
type: recall
value: 0.817750070550628
---
# microsoft-resnet-50-batch32-lr0.005-standford-dogs
This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co./microsoft/resnet-50) on the stanford-dogs dataset.
It achieves the following results on the evaluation set:
- Loss: 1.1192
- Accuracy: 0.8256
- F1: 0.8098
- Precision: 0.8426
- Recall: 0.8178
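Once the checkpoint is on the Hub it can be used with the `transformers` image-classification pipeline, whose predictions come back as a list of `{"label": ..., "score": ...}` dicts. A minimal sketch (the repo id and the `top_breed` helper are assumptions, not part of this card):

```python
def top_breed(predictions):
    """Pick the highest-scoring label from an image-classification
    pipeline's output: a list of {"label": ..., "score": ...} dicts."""
    return max(predictions, key=lambda p: p["score"])["label"]

# Assumed Hub repo id for this checkpoint; adjust if it lives elsewhere.
MODEL_ID = "amaye15/microsoft-resnet-50-batch32-lr0.005-standford-dogs"

# Actual inference needs network access to the Hub, e.g.:
#   from transformers import pipeline
#   classifier = pipeline("image-classification", model=MODEL_ID)
#   print(top_breed(classifier("my_dog.jpg")))

# Offline demonstration with mocked pipeline output:
fake_preds = [
    {"label": "pug", "score": 0.91},
    {"label": "beagle", "score": 0.06},
]
print(top_breed(fake_preds))  # pug
```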
## Model description
This model is a [ResNet-50](https://huggingface.co./microsoft/resnet-50) convolutional image classifier fine-tuned for 120-way dog-breed classification on the Stanford Dogs dataset. Aside from the (presumably re-initialized) classification head, the architecture matches the base checkpoint.
## Intended uses & limitations
The model is intended for classifying photographs of dogs into the 120 Stanford Dogs breed categories. It was trained only on Stanford Dogs imagery, so accuracy on other domains (non-photographic images, breeds outside the 120 classes, or non-dog subjects) is not guaranteed. Note also that the metadata records the evaluation split as `full`, so the reported metrics may not reflect performance on strictly held-out data.
## Training and evaluation data
The model was trained and evaluated on the stanford-dogs dataset (Stanford Dogs: about 20,580 images covering 120 dog breeds). The exact train/evaluation partitioning used for this run is not recorded beyond the `full` split noted in the metadata.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 1000
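With gradient accumulation, optimizer updates happen once every `gradient_accumulation_steps` forward passes, so each update effectively sees `train_batch_size * gradient_accumulation_steps` examples. A quick check of the values above:

```python
# Effective (total) train batch size under gradient accumulation.
train_batch_size = 32
gradient_accumulation_steps = 4
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 128, matching the value reported above
```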
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------:|:------:|:----:|:---------------:|:--------:|:------:|:---------:|:------:|
| 4.7839 | 0.0777 | 10 | 4.7747 | 0.2556 | 0.2410 | 0.4479 | 0.2436 |
| 4.7731 | 0.1553 | 20 | 4.7576 | 0.3511 | 0.3282 | 0.6032 | 0.3338 |
| 4.7617 | 0.2330 | 30 | 4.7363 | 0.4184 | 0.3974 | 0.6668 | 0.3947 |
| 4.7445 | 0.3107 | 40 | 4.7115 | 0.5265 | 0.4927 | 0.7032 | 0.4993 |
| 4.7266 | 0.3883 | 50 | 4.6846 | 0.5561 | 0.5413 | 0.7422 | 0.5333 |
| 4.7081 | 0.4660 | 60 | 4.6547 | 0.6062 | 0.5767 | 0.7392 | 0.5828 |
| 4.6807 | 0.5437 | 70 | 4.6161 | 0.5909 | 0.5750 | 0.7740 | 0.5673 |
| 4.6572 | 0.6214 | 80 | 4.5761 | 0.6324 | 0.6162 | 0.8021 | 0.6102 |
| 4.6286 | 0.6990 | 90 | 4.5274 | 0.6297 | 0.6241 | 0.8188 | 0.6080 |
| 4.598 | 0.7767 | 100 | 4.4746 | 0.6569 | 0.6609 | 0.8380 | 0.6370 |
| 4.5578 | 0.8544 | 110 | 4.4193 | 0.6674 | 0.6713 | 0.8301 | 0.6486 |
| 4.521 | 0.9320 | 120 | 4.3553 | 0.6914 | 0.6868 | 0.8215 | 0.6729 |
| 4.4888 | 1.0097 | 130 | 4.2924 | 0.7082 | 0.7064 | 0.8415 | 0.6904 |
| 4.4312 | 1.0874 | 140 | 4.2125 | 0.7155 | 0.7076 | 0.8381 | 0.6980 |
| 4.3865 | 1.1650 | 150 | 4.1433 | 0.7145 | 0.7115 | 0.8315 | 0.6984 |
| 4.336 | 1.2427 | 160 | 4.0630 | 0.7082 | 0.7010 | 0.8353 | 0.6930 |
| 4.2903 | 1.3204 | 170 | 3.9781 | 0.7148 | 0.7024 | 0.8109 | 0.6982 |
| 4.2465 | 1.3981 | 180 | 3.8896 | 0.7376 | 0.7234 | 0.8328 | 0.7217 |
| 4.1924 | 1.4757 | 190 | 3.8117 | 0.7476 | 0.7310 | 0.8161 | 0.7322 |
| 4.1217 | 1.5534 | 200 | 3.7499 | 0.7510 | 0.7344 | 0.8105 | 0.7372 |
| 4.068 | 1.6311 | 210 | 3.6340 | 0.7551 | 0.7355 | 0.8183 | 0.7409 |
| 4.0148 | 1.7087 | 220 | 3.5678 | 0.7546 | 0.7358 | 0.8066 | 0.7413 |
| 3.9682 | 1.7864 | 230 | 3.4852 | 0.7663 | 0.7477 | 0.8145 | 0.7530 |
| 3.9196 | 1.8641 | 240 | 3.3841 | 0.7648 | 0.7464 | 0.8075 | 0.7520 |
| 3.8481 | 1.9417 | 250 | 3.3003 | 0.7626 | 0.7421 | 0.8056 | 0.7495 |
| 3.8017 | 2.0194 | 260 | 3.2395 | 0.7578 | 0.7370 | 0.8045 | 0.7461 |
| 3.7528 | 2.0971 | 270 | 3.1183 | 0.7578 | 0.7349 | 0.8007 | 0.7457 |
| 3.6614 | 2.1748 | 280 | 3.0364 | 0.7655 | 0.7435 | 0.8011 | 0.7531 |
| 3.6522 | 2.2524 | 290 | 2.9775 | 0.7629 | 0.7415 | 0.7990 | 0.7507 |
| 3.5922 | 2.3301 | 300 | 2.8995 | 0.7665 | 0.7466 | 0.8090 | 0.7551 |
| 3.519 | 2.4078 | 310 | 2.8049 | 0.7680 | 0.7488 | 0.8129 | 0.7566 |
| 3.4724 | 2.4854 | 320 | 2.7425 | 0.7704 | 0.7528 | 0.8170 | 0.7601 |
| 3.4333 | 2.5631 | 330 | 2.6444 | 0.7755 | 0.7560 | 0.8236 | 0.7648 |
| 3.4303 | 2.6408 | 340 | 2.5672 | 0.7687 | 0.7473 | 0.8178 | 0.7585 |
| 3.3287 | 2.7184 | 350 | 2.5194 | 0.7806 | 0.7599 | 0.8229 | 0.7712 |
| 3.2916 | 2.7961 | 360 | 2.4733 | 0.7796 | 0.7575 | 0.8223 | 0.7698 |
| 3.1999 | 2.8738 | 370 | 2.4098 | 0.7792 | 0.7565 | 0.8158 | 0.7692 |
| 3.211 | 2.9515 | 380 | 2.3081 | 0.7796 | 0.7571 | 0.8284 | 0.7692 |
| 3.1437 | 3.0291 | 390 | 2.2523 | 0.7830 | 0.7600 | 0.8212 | 0.7730 |
| 3.1036 | 3.1068 | 400 | 2.2000 | 0.7847 | 0.7619 | 0.8210 | 0.7740 |
| 3.0345 | 3.1845 | 410 | 2.1385 | 0.7833 | 0.7606 | 0.8261 | 0.7726 |
| 2.99 | 3.2621 | 420 | 2.1079 | 0.7799 | 0.7560 | 0.8199 | 0.7698 |
| 2.9386 | 3.3398 | 430 | 2.0585 | 0.7821 | 0.7584 | 0.8232 | 0.7716 |
| 2.9093 | 3.4175 | 440 | 2.0176 | 0.7823 | 0.7586 | 0.8225 | 0.7721 |
| 2.8868 | 3.4951 | 450 | 1.9702 | 0.7818 | 0.7585 | 0.8183 | 0.7720 |
| 2.8603 | 3.5728 | 460 | 1.8973 | 0.7864 | 0.7645 | 0.8241 | 0.7767 |
| 2.8232 | 3.6505 | 470 | 1.8814 | 0.7855 | 0.7616 | 0.8128 | 0.7758 |
| 2.7889 | 3.7282 | 480 | 1.8170 | 0.7886 | 0.7676 | 0.8214 | 0.7792 |
| 2.7561 | 3.8058 | 490 | 1.7750 | 0.7920 | 0.7721 | 0.8364 | 0.7828 |
| 2.7243 | 3.8835 | 500 | 1.7369 | 0.7906 | 0.7695 | 0.8295 | 0.7813 |
| 2.6619 | 3.9612 | 510 | 1.7225 | 0.7971 | 0.7766 | 0.8292 | 0.7884 |
| 2.7054 | 4.0388 | 520 | 1.6453 | 0.7983 | 0.7788 | 0.8346 | 0.7894 |
| 2.6069 | 4.1165 | 530 | 1.6340 | 0.8000 | 0.7807 | 0.8347 | 0.7910 |
| 2.5627 | 4.1942 | 540 | 1.6538 | 0.7971 | 0.7760 | 0.8337 | 0.7878 |
| 2.5555 | 4.2718 | 550 | 1.5779 | 0.7998 | 0.7785 | 0.8324 | 0.7906 |
| 2.5541 | 4.3495 | 560 | 1.5960 | 0.7945 | 0.7736 | 0.8329 | 0.7850 |
| 2.513 | 4.4272 | 570 | 1.5537 | 0.8025 | 0.7841 | 0.8368 | 0.7941 |
| 2.442 | 4.5049 | 580 | 1.5196 | 0.8034 | 0.7858 | 0.8380 | 0.7954 |
| 2.4763 | 4.5825 | 590 | 1.5009 | 0.8052 | 0.7870 | 0.8345 | 0.7965 |
| 2.4412 | 4.6602 | 600 | 1.4760 | 0.8098 | 0.7924 | 0.8391 | 0.8015 |
| 2.383 | 4.7379 | 610 | 1.4403 | 0.8088 | 0.7920 | 0.8395 | 0.8007 |
| 2.3731 | 4.8155 | 620 | 1.4123 | 0.8120 | 0.7956 | 0.8401 | 0.8039 |
| 2.3616 | 4.8932 | 630 | 1.4193 | 0.8105 | 0.7940 | 0.8369 | 0.8021 |
| 2.3311 | 4.9709 | 640 | 1.4220 | 0.8098 | 0.7934 | 0.8370 | 0.8016 |
| 2.3373 | 5.0485 | 650 | 1.3956 | 0.8081 | 0.7907 | 0.8367 | 0.7996 |
| 2.2879 | 5.1262 | 660 | 1.3375 | 0.8144 | 0.7976 | 0.8410 | 0.8062 |
| 2.299 | 5.2039 | 670 | 1.3431 | 0.8146 | 0.7967 | 0.8371 | 0.8061 |
| 2.2471 | 5.2816 | 680 | 1.3360 | 0.8151 | 0.7985 | 0.8389 | 0.8070 |
| 2.2419 | 5.3592 | 690 | 1.3139 | 0.8139 | 0.7977 | 0.8377 | 0.8058 |
| 2.2195 | 5.4369 | 700 | 1.3225 | 0.8151 | 0.7974 | 0.8395 | 0.8062 |
| 2.1901 | 5.5146 | 710 | 1.2797 | 0.8173 | 0.8001 | 0.8397 | 0.8087 |
| 2.1931 | 5.5922 | 720 | 1.2543 | 0.8192 | 0.8032 | 0.8423 | 0.8109 |
| 2.195 | 5.6699 | 730 | 1.2767 | 0.8209 | 0.8039 | 0.8405 | 0.8125 |
| 2.1413 | 5.7476 | 740 | 1.2735 | 0.8212 | 0.8053 | 0.8416 | 0.8132 |
| 2.1696 | 5.8252 | 750 | 1.2694 | 0.8149 | 0.7983 | 0.8358 | 0.8069 |
| 2.1387 | 5.9029 | 760 | 1.2532 | 0.8217 | 0.8062 | 0.8422 | 0.8136 |
| 2.1811 | 5.9806 | 770 | 1.2426 | 0.8197 | 0.8034 | 0.8417 | 0.8116 |
| 2.077 | 6.0583 | 780 | 1.2101 | 0.8243 | 0.8078 | 0.8464 | 0.8159 |
| 2.1099 | 6.1359 | 790 | 1.1947 | 0.8265 | 0.8108 | 0.8455 | 0.8186 |
| 2.0825 | 6.2136 | 800 | 1.1826 | 0.8241 | 0.8080 | 0.8455 | 0.8161 |
| 2.0933 | 6.2913 | 810 | 1.1934 | 0.8282 | 0.8128 | 0.8474 | 0.8207 |
| 2.0857 | 6.3689 | 820 | 1.1897 | 0.8258 | 0.8099 | 0.8465 | 0.8181 |
| 2.0881 | 6.4466 | 830 | 1.1666 | 0.8277 | 0.8124 | 0.8477 | 0.8199 |
| 2.074 | 6.5243 | 840 | 1.1815 | 0.8248 | 0.8081 | 0.8433 | 0.8167 |
| 2.0145 | 6.6019 | 850 | 1.1680 | 0.8292 | 0.8130 | 0.8473 | 0.8209 |
| 2.0778 | 6.6796 | 860 | 1.1565 | 0.8260 | 0.8094 | 0.8348 | 0.8178 |
| 1.9784 | 6.7573 | 870 | 1.1571 | 0.8345 | 0.8201 | 0.8529 | 0.8269 |
| 2.0595 | 6.8350 | 880 | 1.1554 | 0.8309 | 0.8165 | 0.8475 | 0.8234 |
| 2.0252 | 6.9126 | 890 | 1.1444 | 0.8282 | 0.8140 | 0.8476 | 0.8209 |
| 1.9708 | 6.9903 | 900 | 1.1478 | 0.8302 | 0.8158 | 0.8472 | 0.8224 |
| 2.0656 | 7.0680 | 910 | 1.1285 | 0.8324 | 0.8169 | 0.8485 | 0.8245 |
| 2.0086 | 7.1456 | 920 | 1.1289 | 0.8290 | 0.8148 | 0.8444 | 0.8219 |
| 2.0056 | 7.2233 | 930 | 1.1268 | 0.8280 | 0.8130 | 0.8470 | 0.8208 |
| 1.9498 | 7.3010 | 940 | 1.1246 | 0.8311 | 0.8158 | 0.8497 | 0.8234 |
| 2.0067 | 7.3786 | 950 | 1.1495 | 0.8285 | 0.8132 | 0.8440 | 0.8207 |
| 2.0171 | 7.4563 | 960 | 1.1168 | 0.8285 | 0.8138 | 0.8501 | 0.8209 |
| 1.9683 | 7.5340 | 970 | 1.1290 | 0.8314 | 0.8165 | 0.8500 | 0.8235 |
| 1.9771 | 7.6117 | 980 | 1.0982 | 0.8314 | 0.8153 | 0.8454 | 0.8233 |
| 2.0086 | 7.6893 | 990 | 1.1275 | 0.8294 | 0.8151 | 0.8491 | 0.8218 |
| 1.9854 | 7.7670 | 1000 | 1.1192 | 0.8256 | 0.8098 | 0.8426 | 0.8178 |
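The card does not show the metric function used, but a `compute_metrics` callback of the kind the Trainer typically calls to produce accuracy/F1/precision/recall columns like those above can be sketched as follows. This is an illustration, not the original code; in particular, macro averaging over the 120 classes is an assumption.

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    """Hypothetical Trainer metric callback; macro averaging is assumed."""
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }

# Toy example with 3 classes and perfect predictions:
logits = np.array([[2.0, 0.1, 0.3], [0.2, 1.5, 0.1], [0.1, 0.2, 0.9]])
labels = np.array([0, 1, 2])
metrics = compute_metrics((logits, labels))
print(metrics)  # all four metrics equal 1.0 here
```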
### Framework versions
- Transformers 4.40.2
- Pytorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.19.1