---
license: apache-2.0
base_model: microsoft/resnet-50
tags:
- generated_from_trainer
datasets:
- stanford-dogs
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: microsoft-resnet-50-batch32-lr0.0005-standford-dogs
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: stanford-dogs
      type: stanford-dogs
      config: default
      split: full
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.8386783284742468
    - name: F1
      type: f1
      value: 0.8259546998355447
    - name: Precision
      type: precision
      value: 0.8457483127517197
    - name: Recall
      type: recall
      value: 0.8314858626273427
---
# microsoft-resnet-50-batch32-lr0.0005-standford-dogs

This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on the stanford-dogs dataset. It achieves the following results on the evaluation set:
- Loss: 1.1545
- Accuracy: 0.8387
- F1: 0.8260
- Precision: 0.8457
- Recall: 0.8315
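
The fine-tuned checkpoint can be used like any `transformers` image-classification model. A minimal inference sketch follows; the repo id is assumed from this card's title and the image path is a placeholder, so substitute your own values:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Assumed repo id taken from this card's title; prepend the actual Hub
# namespace, e.g. "<user>/microsoft-resnet-50-batch32-lr0.0005-standford-dogs".
repo_id = "microsoft-resnet-50-batch32-lr0.0005-standford-dogs"

processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("dog.jpg")  # placeholder path: any RGB photo of a dog
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Pick the highest-scoring class and map it back to a breed name.
pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```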
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 1000
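
The hyperparameters above can be expressed as a `transformers.TrainingArguments` configuration. This is a sketch, not the exact training script: `output_dir` is an assumption, and the Adam betas/epsilon listed above match the `TrainingArguments` defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

# Sketch of the settings listed in this card; output_dir is an assumption.
args = TrainingArguments(
    output_dir="resnet50-stanford-dogs",  # hypothetical name
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 32 * 4 = 128
    lr_scheduler_type="linear",
    max_steps=1000,
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults.
)
```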
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|:-------------|:------|:-----|:----------------|:---------|:---|:----------|:-------|
4.7829 | 0.0777 | 10 | 4.7747 | 0.2119 | 0.1874 | 0.3919 | 0.1982 |
4.7714 | 0.1553 | 20 | 4.7572 | 0.2038 | 0.1842 | 0.4262 | 0.1836 |
4.7606 | 0.2330 | 30 | 4.7367 | 0.3586 | 0.3433 | 0.6517 | 0.3307 |
4.747 | 0.3107 | 40 | 4.7149 | 0.4303 | 0.4272 | 0.7734 | 0.4039 |
4.7253 | 0.3883 | 50 | 4.6846 | 0.4361 | 0.4678 | 0.7906 | 0.4160 |
4.7069 | 0.4660 | 60 | 4.6534 | 0.5330 | 0.5397 | 0.8048 | 0.5093 |
4.6857 | 0.5437 | 70 | 4.6177 | 0.5500 | 0.5511 | 0.7998 | 0.5264 |
4.6569 | 0.6214 | 80 | 4.5764 | 0.5739 | 0.5800 | 0.8208 | 0.5517 |
4.6293 | 0.6990 | 90 | 4.5359 | 0.6142 | 0.6149 | 0.8075 | 0.5926 |
4.5953 | 0.7767 | 100 | 4.4828 | 0.6207 | 0.6233 | 0.8109 | 0.6000 |
4.5651 | 0.8544 | 110 | 4.4257 | 0.6591 | 0.6585 | 0.8148 | 0.6393 |
4.5296 | 0.9320 | 120 | 4.3647 | 0.7063 | 0.7012 | 0.8284 | 0.6882 |
4.4911 | 1.0097 | 130 | 4.2998 | 0.7089 | 0.7074 | 0.8326 | 0.6924 |
4.4442 | 1.0874 | 140 | 4.2288 | 0.6939 | 0.6890 | 0.8302 | 0.6759 |
4.3912 | 1.1650 | 150 | 4.1527 | 0.6873 | 0.6863 | 0.8262 | 0.6703 |
4.3393 | 1.2427 | 160 | 4.0884 | 0.7250 | 0.7127 | 0.8251 | 0.7082 |
4.3019 | 1.3204 | 170 | 3.9946 | 0.7262 | 0.7152 | 0.8234 | 0.7098 |
4.2366 | 1.3981 | 180 | 3.9314 | 0.7301 | 0.7177 | 0.8230 | 0.7143 |
4.1966 | 1.4757 | 190 | 3.8398 | 0.7325 | 0.7196 | 0.8169 | 0.7175 |
4.1402 | 1.5534 | 200 | 3.7587 | 0.7381 | 0.7217 | 0.8149 | 0.7221 |
4.0771 | 1.6311 | 210 | 3.6745 | 0.7310 | 0.7149 | 0.8125 | 0.7160 |
4.0436 | 1.7087 | 220 | 3.5729 | 0.7364 | 0.7189 | 0.8121 | 0.7214 |
3.9697 | 1.7864 | 230 | 3.5030 | 0.7490 | 0.7339 | 0.8172 | 0.7358 |
3.9181 | 1.8641 | 240 | 3.4505 | 0.7541 | 0.7379 | 0.8123 | 0.7408 |
3.8573 | 1.9417 | 250 | 3.3529 | 0.7646 | 0.7453 | 0.8136 | 0.7521 |
3.8077 | 2.0194 | 260 | 3.2566 | 0.7660 | 0.7482 | 0.8093 | 0.7540 |
3.7449 | 2.0971 | 270 | 3.1869 | 0.7709 | 0.7510 | 0.8144 | 0.7588 |
3.682 | 2.1748 | 280 | 3.0898 | 0.7668 | 0.7440 | 0.8097 | 0.7548 |
3.6461 | 2.2524 | 290 | 3.0377 | 0.7641 | 0.7381 | 0.8100 | 0.7511 |
3.6004 | 2.3301 | 300 | 2.9001 | 0.7648 | 0.7384 | 0.8061 | 0.7522 |
3.5478 | 2.4078 | 310 | 2.8623 | 0.7653 | 0.7410 | 0.8060 | 0.7529 |
3.4971 | 2.4854 | 320 | 2.7961 | 0.7675 | 0.7447 | 0.8068 | 0.7558 |
3.4446 | 2.5631 | 330 | 2.6960 | 0.7690 | 0.7486 | 0.8128 | 0.7582 |
3.4093 | 2.6408 | 340 | 2.6480 | 0.7821 | 0.7652 | 0.8151 | 0.7718 |
3.3994 | 2.7184 | 350 | 2.5330 | 0.7847 | 0.7676 | 0.8156 | 0.7742 |
3.2963 | 2.7961 | 360 | 2.4866 | 0.7855 | 0.7681 | 0.8154 | 0.7752 |
3.2615 | 2.8738 | 370 | 2.4344 | 0.7891 | 0.7740 | 0.8172 | 0.7792 |
3.2024 | 2.9515 | 380 | 2.4011 | 0.7794 | 0.7638 | 0.8126 | 0.7694 |
3.1641 | 3.0291 | 390 | 2.3039 | 0.7835 | 0.7659 | 0.8100 | 0.7736 |
3.0719 | 3.1068 | 400 | 2.2471 | 0.7796 | 0.7608 | 0.8072 | 0.7691 |
3.0808 | 3.1845 | 410 | 2.2130 | 0.7896 | 0.7717 | 0.8137 | 0.7795 |
2.9916 | 3.2621 | 420 | 2.1387 | 0.7823 | 0.7652 | 0.8104 | 0.7718 |
2.9898 | 3.3398 | 430 | 2.0905 | 0.7981 | 0.7821 | 0.8250 | 0.7886 |
2.9597 | 3.4175 | 440 | 2.0260 | 0.7923 | 0.7769 | 0.8192 | 0.7826 |
2.9068 | 3.4951 | 450 | 1.9944 | 0.7976 | 0.7816 | 0.8233 | 0.7877 |
2.8423 | 3.5728 | 460 | 1.9643 | 0.7976 | 0.7805 | 0.8185 | 0.7876 |
2.8323 | 3.6505 | 470 | 1.8926 | 0.7935 | 0.7754 | 0.8136 | 0.7837 |
2.7814 | 3.7282 | 480 | 1.8676 | 0.8017 | 0.7856 | 0.8208 | 0.7917 |
2.7337 | 3.8058 | 490 | 1.8320 | 0.8052 | 0.7905 | 0.8246 | 0.7957 |
2.7215 | 3.8835 | 500 | 1.8003 | 0.7986 | 0.7834 | 0.8208 | 0.7890 |
2.6456 | 3.9612 | 510 | 1.7754 | 0.8005 | 0.7848 | 0.8230 | 0.7914 |
2.6494 | 4.0388 | 520 | 1.7083 | 0.8054 | 0.7895 | 0.8252 | 0.7967 |
2.5878 | 4.1165 | 530 | 1.6836 | 0.8054 | 0.7878 | 0.8239 | 0.7967 |
2.592 | 4.1942 | 540 | 1.6770 | 0.8005 | 0.7826 | 0.8220 | 0.7912 |
2.5698 | 4.2718 | 550 | 1.6184 | 0.8056 | 0.7881 | 0.8268 | 0.7970 |
2.52 | 4.3495 | 560 | 1.6368 | 0.8064 | 0.7898 | 0.8267 | 0.7975 |
2.5317 | 4.4272 | 570 | 1.5952 | 0.8059 | 0.7891 | 0.8289 | 0.7972 |
2.4199 | 4.5049 | 580 | 1.5518 | 0.8163 | 0.8002 | 0.8337 | 0.8082 |
2.4357 | 4.5825 | 590 | 1.5375 | 0.8095 | 0.7933 | 0.8263 | 0.8012 |
2.4217 | 4.6602 | 600 | 1.4994 | 0.8127 | 0.7964 | 0.8297 | 0.8042 |
2.428 | 4.7379 | 610 | 1.4671 | 0.8156 | 0.8003 | 0.8309 | 0.8074 |
2.3725 | 4.8155 | 620 | 1.4402 | 0.8141 | 0.7973 | 0.8295 | 0.8054 |
2.3594 | 4.8932 | 630 | 1.4566 | 0.8134 | 0.7976 | 0.8287 | 0.8049 |
2.3279 | 4.9709 | 640 | 1.4359 | 0.8183 | 0.8034 | 0.8314 | 0.8100 |
2.3166 | 5.0485 | 650 | 1.4067 | 0.8226 | 0.8086 | 0.8343 | 0.8149 |
2.3062 | 5.1262 | 660 | 1.3913 | 0.8212 | 0.8072 | 0.8340 | 0.8131 |
2.3096 | 5.2039 | 670 | 1.3577 | 0.8241 | 0.8107 | 0.8373 | 0.8159 |
2.2514 | 5.2816 | 680 | 1.3574 | 0.8270 | 0.8136 | 0.8371 | 0.8193 |
2.2053 | 5.3592 | 690 | 1.3450 | 0.8239 | 0.8101 | 0.8370 | 0.8164 |
2.2347 | 5.4369 | 700 | 1.3331 | 0.8270 | 0.8137 | 0.8388 | 0.8194 |
2.215 | 5.5146 | 710 | 1.2902 | 0.8294 | 0.8154 | 0.8419 | 0.8219 |
2.175 | 5.5922 | 720 | 1.2861 | 0.8256 | 0.8114 | 0.8388 | 0.8181 |
2.2212 | 5.6699 | 730 | 1.2637 | 0.8321 | 0.8180 | 0.8440 | 0.8241 |
2.1459 | 5.7476 | 740 | 1.2827 | 0.8302 | 0.8166 | 0.8396 | 0.8227 |
2.1615 | 5.8252 | 750 | 1.2800 | 0.8311 | 0.8184 | 0.8496 | 0.8239 |
2.0966 | 5.9029 | 760 | 1.2742 | 0.8326 | 0.8195 | 0.8418 | 0.8251 |
2.1314 | 5.9806 | 770 | 1.2464 | 0.8316 | 0.8184 | 0.8407 | 0.8238 |
2.0846 | 6.0583 | 780 | 1.2409 | 0.8326 | 0.8189 | 0.8414 | 0.8250 |
2.0522 | 6.1359 | 790 | 1.2023 | 0.8365 | 0.8233 | 0.8455 | 0.8292 |
2.0724 | 6.2136 | 800 | 1.2252 | 0.8309 | 0.8174 | 0.8396 | 0.8235 |
2.0848 | 6.2913 | 810 | 1.2025 | 0.8321 | 0.8186 | 0.8424 | 0.8248 |
2.0402 | 6.3689 | 820 | 1.2130 | 0.8333 | 0.8189 | 0.8428 | 0.8255 |
2.0778 | 6.4466 | 830 | 1.1809 | 0.8375 | 0.8249 | 0.8532 | 0.8302 |
2.0963 | 6.5243 | 840 | 1.1696 | 0.8365 | 0.8231 | 0.8527 | 0.8289 |
2.0576 | 6.6019 | 850 | 1.1866 | 0.8321 | 0.8181 | 0.8411 | 0.8245 |
2.0386 | 6.6796 | 860 | 1.1882 | 0.8302 | 0.8160 | 0.8389 | 0.8227 |
2.0084 | 6.7573 | 870 | 1.1696 | 0.8372 | 0.8244 | 0.8446 | 0.8301 |
2.0571 | 6.8350 | 880 | 1.1622 | 0.8353 | 0.8217 | 0.8437 | 0.8280 |
2.0264 | 6.9126 | 890 | 1.1640 | 0.8336 | 0.8204 | 0.8429 | 0.8263 |
2.0077 | 6.9903 | 900 | 1.1673 | 0.8367 | 0.8241 | 0.8447 | 0.8295 |
2.0492 | 7.0680 | 910 | 1.1455 | 0.8404 | 0.8269 | 0.8462 | 0.8330 |
1.9973 | 7.1456 | 920 | 1.1538 | 0.8379 | 0.8250 | 0.8455 | 0.8307 |
1.9961 | 7.2233 | 930 | 1.1502 | 0.8367 | 0.8236 | 0.8415 | 0.8295 |
1.9681 | 7.3010 | 940 | 1.1657 | 0.8384 | 0.8254 | 0.8463 | 0.8311 |
2.0188 | 7.3786 | 950 | 1.1309 | 0.8379 | 0.8252 | 0.8445 | 0.8310 |
2.0225 | 7.4563 | 960 | 1.1547 | 0.8367 | 0.8231 | 0.8446 | 0.8294 |
1.9562 | 7.5340 | 970 | 1.1474 | 0.8377 | 0.8243 | 0.8457 | 0.8305 |
2.0247 | 7.6117 | 980 | 1.1251 | 0.8365 | 0.8241 | 0.8449 | 0.8294 |
1.9355 | 7.6893 | 990 | 1.1349 | 0.8397 | 0.8276 | 0.8532 | 0.8329 |
1.9804 | 7.7670 | 1000 | 1.1545 | 0.8387 | 0.8260 | 0.8457 | 0.8315 |
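
One detail worth noting in the table: the final checkpoint (step 1000, accuracy 0.8387) is not the most accurate one; step 910 scores slightly higher (0.8404). A small sketch over a few rows excerpted from the table shows how a best-by-accuracy checkpoint would be selected:

```python
# A few (step, validation_loss, accuracy) rows excerpted from the table above.
checkpoints = [
    (10, 4.7747, 0.2119),
    (500, 1.8003, 0.7986),
    (910, 1.1455, 0.8404),
    (1000, 1.1545, 0.8387),
]

# Keep the checkpoint with the highest validation accuracy, not the last one.
best_step, best_loss, best_acc = max(checkpoints, key=lambda row: row[2])
print(best_step, best_acc)  # → 910 0.8404
```

In a real run this is what `load_best_model_at_end=True` with `metric_for_best_model="accuracy"` would do for you in the `Trainer`.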
### Framework versions

- Transformers 4.40.2
- Pytorch 2.3.0
- Datasets 2.19.1
- Tokenizers 0.19.1