resnet-18-feature-extraction

This model is a fine-tuned version of microsoft/resnet-18, trained on a custom image dataset loaded with the Datasets imagefolder loader. It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 0.1485
  • Accuracy: 0.95
  • Precision: 0.9653
  • Recall: 0.9789
  • F1: 0.9720
  • ROC AUC: 0.8505
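
The card does not include a usage snippet, so the following is a minimal inference sketch rather than the published recipe. It assumes the checkpoint carries an image-classification head (consistent with the accuracy, precision, and recall figures above), and the repo id shown is a placeholder. On older releases such as the Transformers 4.24 build listed under Framework versions, AutoFeatureExtractor plays the role of AutoImageProcessor.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# Hypothetical checkpoint location; substitute the actual Hub repo id or a local path.
checkpoint = "your-username/resnet-18-feature-extraction"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(checkpoint)

image = Image.open("example.jpg").convert("RGB")  # any input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```

The printed label name comes from the id2label mapping stored in the checkpoint's config.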

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reconstruction sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 256
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
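
The Adam betas and epsilon listed above are the transformers Trainer defaults, and the total train batch size of 256 is the per-device batch size of 64 times 4 gradient-accumulation steps. A hedged reconstruction of the corresponding TrainingArguments is sketched below; output_dir and the logging and evaluation settings are assumptions inferred from the results table, not part of the published card.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the training setup from the hyperparameters above;
# output_dir, logging and evaluation settings are assumptions.
training_args = TrainingArguments(
    output_dir="resnet-18-feature-extraction",
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=4,   # 64 * 4 = 256 effective train batch size
    num_train_epochs=50,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    evaluation_strategy="epoch",     # the log below evaluates once per epoch
    logging_steps=10,                # training loss appears every 10 steps in the log
)
```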

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Precision | Recall | F1 | ROC AUC |
|---|---|---|---|---|---|---|---|---|
| No log | 0.8 | 2 | 0.6232 | 0.75 | 0.9636 | 0.7465 | 0.8413 | 0.7621 |
| No log | 1.8 | 4 | 0.6971 | 0.4875 | 1.0 | 0.4225 | 0.5941 | 0.7113 |
| No log | 2.8 | 6 | 0.7915 | 0.2875 | 1.0 | 0.1972 | 0.3294 | 0.5986 |
| No log | 3.8 | 8 | 0.8480 | 0.2875 | 1.0 | 0.1972 | 0.3294 | 0.5986 |
| 0.8651 | 4.8 | 10 | 0.9094 | 0.2562 | 1.0 | 0.1620 | 0.2788 | 0.5810 |
| 0.8651 | 5.8 | 12 | 0.7470 | 0.5625 | 1.0 | 0.5070 | 0.6729 | 0.7535 |
| 0.8651 | 6.8 | 14 | 0.5915 | 0.85 | 1.0 | 0.8310 | 0.9077 | 0.9155 |
| 0.8651 | 7.8 | 16 | 0.4817 | 0.8875 | 0.9844 | 0.8873 | 0.9333 | 0.8881 |
| 0.8651 | 8.8 | 18 | 0.3455 | 0.9187 | 0.9778 | 0.9296 | 0.9531 | 0.8815 |
| 0.5349 | 9.8 | 20 | 0.2966 | 0.9187 | 0.9708 | 0.9366 | 0.9534 | 0.8572 |
| 0.5349 | 10.8 | 22 | 0.2347 | 0.95 | 0.9653 | 0.9789 | 0.9720 | 0.8505 |
| 0.5349 | 11.8 | 24 | 0.2468 | 0.9313 | 0.9645 | 0.9577 | 0.9611 | 0.8400 |
| 0.5349 | 12.8 | 26 | 0.2310 | 0.9563 | 0.9720 | 0.9789 | 0.9754 | 0.8783 |
| 0.5349 | 13.8 | 28 | 0.2083 | 0.9313 | 0.9580 | 0.9648 | 0.9614 | 0.8157 |
| 0.3593 | 14.8 | 30 | 0.1840 | 0.9375 | 0.9521 | 0.9789 | 0.9653 | 0.7950 |
| 0.3593 | 15.8 | 32 | 0.1947 | 0.9375 | 0.9648 | 0.9648 | 0.9648 | 0.8435 |
| 0.3593 | 16.8 | 34 | 0.1837 | 0.9313 | 0.9517 | 0.9718 | 0.9617 | 0.7915 |
| 0.3593 | 17.8 | 36 | 0.1819 | 0.9437 | 0.9524 | 0.9859 | 0.9689 | 0.7985 |
| 0.3593 | 18.8 | 38 | 0.1924 | 0.9437 | 0.9650 | 0.9718 | 0.9684 | 0.8470 |
| 0.2737 | 19.8 | 40 | 0.1990 | 0.95 | 0.9653 | 0.9789 | 0.9720 | 0.8505 |
| 0.2737 | 20.8 | 42 | 0.1759 | 0.95 | 0.9718 | 0.9718 | 0.9718 | 0.8748 |
| 0.2737 | 21.8 | 44 | 0.1804 | 0.9313 | 0.9517 | 0.9718 | 0.9617 | 0.7915 |
| 0.2737 | 22.8 | 46 | 0.1666 | 0.9313 | 0.9517 | 0.9718 | 0.9617 | 0.7915 |
| 0.2737 | 23.8 | 48 | 0.1534 | 0.9437 | 0.9524 | 0.9859 | 0.9689 | 0.7985 |
| 0.2278 | 24.8 | 50 | 0.1612 | 0.9375 | 0.9521 | 0.9789 | 0.9653 | 0.7950 |
| 0.2278 | 25.8 | 52 | 0.1535 | 0.9437 | 0.9586 | 0.9789 | 0.9686 | 0.8228 |
| 0.2278 | 26.8 | 54 | 0.1568 | 0.9437 | 0.9716 | 0.9648 | 0.9682 | 0.8713 |
| 0.2278 | 27.8 | 56 | 0.2107 | 0.9375 | 0.9714 | 0.9577 | 0.9645 | 0.8678 |
| 0.2278 | 28.8 | 58 | 0.1592 | 0.9313 | 0.9517 | 0.9718 | 0.9617 | 0.7915 |
| 0.2057 | 29.8 | 60 | 0.1557 | 0.9375 | 0.9648 | 0.9648 | 0.9648 | 0.8435 |
| 0.2057 | 30.8 | 62 | 0.1714 | 0.9437 | 0.9650 | 0.9718 | 0.9684 | 0.8470 |
| 0.2057 | 31.8 | 64 | 0.1571 | 0.95 | 0.9653 | 0.9789 | 0.9720 | 0.8505 |
| 0.2057 | 32.8 | 66 | 0.1574 | 0.9375 | 0.9583 | 0.9718 | 0.9650 | 0.8192 |
| 0.2057 | 33.8 | 68 | 0.1423 | 0.9563 | 0.9720 | 0.9789 | 0.9754 | 0.8783 |
| 0.2 | 34.8 | 70 | 0.1677 | 0.9437 | 0.9650 | 0.9718 | 0.9684 | 0.8470 |
| 0.2 | 35.8 | 72 | 0.1560 | 0.9375 | 0.9583 | 0.9718 | 0.9650 | 0.8192 |
| 0.2 | 36.8 | 74 | 0.1594 | 0.9375 | 0.9521 | 0.9789 | 0.9653 | 0.7950 |
| 0.2 | 37.8 | 76 | 0.1512 | 0.9437 | 0.9586 | 0.9789 | 0.9686 | 0.8228 |
| 0.2 | 38.8 | 78 | 0.1396 | 0.9563 | 0.9655 | 0.9859 | 0.9756 | 0.8541 |
| 0.1838 | 39.8 | 80 | 0.1509 | 0.9375 | 0.9583 | 0.9718 | 0.9650 | 0.8192 |
| 0.1838 | 40.8 | 82 | 0.1529 | 0.95 | 0.9718 | 0.9718 | 0.9718 | 0.8748 |
| 0.1838 | 41.8 | 84 | 0.1506 | 0.95 | 0.9653 | 0.9789 | 0.9720 | 0.8505 |
| 0.1838 | 42.8 | 86 | 0.1549 | 0.95 | 0.9653 | 0.9789 | 0.9720 | 0.8505 |
| 0.1838 | 43.8 | 88 | 0.1331 | 0.9563 | 0.9655 | 0.9859 | 0.9756 | 0.8541 |
| 0.1872 | 44.8 | 90 | 0.1409 | 0.9437 | 0.9524 | 0.9859 | 0.9689 | 0.7985 |
| 0.1872 | 45.8 | 92 | 0.1639 | 0.9375 | 0.9583 | 0.9718 | 0.9650 | 0.8192 |
| 0.1872 | 46.8 | 94 | 0.1391 | 0.95 | 0.9589 | 0.9859 | 0.9722 | 0.8263 |
| 0.1872 | 47.8 | 96 | 0.1436 | 0.9563 | 0.9655 | 0.9859 | 0.9756 | 0.8541 |
| 0.1872 | 48.8 | 98 | 0.1442 | 0.9437 | 0.9586 | 0.9789 | 0.9686 | 0.8228 |
| 0.185 | 49.8 | 100 | 0.1485 | 0.95 | 0.9653 | 0.9789 | 0.9720 | 0.8505 |
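
The accuracy, precision, recall, F1, and ROC AUC columns above are the kind of values produced by a Trainer compute_metrics callback evaluated once per epoch. The metric code used for this card is not published; the sketch below is a hypothetical scikit-learn implementation, assuming a binary classification task (inferred from the single ROC AUC value per row).

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support, roc_auc_score

# Hypothetical metric function; the card does not publish the one actually used.
def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="binary", zero_division=0
    )
    # Softmax over logits to get the positive-class probability for ROC AUC.
    shifted = logits - logits.max(axis=-1, keepdims=True)
    probs = np.exp(shifted) / np.exp(shifted).sum(axis=-1, keepdims=True)
    return {
        "accuracy": accuracy_score(labels, preds),
        "precision": precision,
        "recall": recall,
        "f1": f1,
        "roc_auc": roc_auc_score(labels, probs[:, 1]),
    }
```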

Framework versions

  • Transformers 4.24.0.dev0
  • Pytorch 1.11.0+cu102
  • Datasets 2.6.1
  • Tokenizers 0.13.1