
Cinnamon-Plant-50-Epochs-Model

This model is a fine-tuned version of google/vit-base-patch16-224-in21k, trained on an image dataset loaded with the Hugging Face imagefolder loader. It achieves the following results on the evaluation set (a minimal inference sketch follows below):

  • Loss: 0.3989
  • Accuracy: 0.8958
  • F1: 0.8960
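
The card does not yet include usage code, so the following is only a minimal inference sketch using the standard transformers image-classification pipeline, assuming the checkpoint is pulled from the Akshay0706/Cinnamon-Plant-50-Epochs-Model repository. The class labels are read from the checkpoint's config and are not documented here, and the image path is a placeholder.

```python
from transformers import pipeline

# Minimal sketch: load the fine-tuned ViT checkpoint as an image-classification pipeline.
# The repo id below matches this model card; the label names depend on the training data
# and are taken from the checkpoint's config.
classifier = pipeline(
    "image-classification",
    model="Akshay0706/Cinnamon-Plant-50-Epochs-Model",
)

# Run on a local image (the path is a placeholder).
predictions = classifier("path/to/leaf_image.jpg")
for p in predictions:
    print(f"{p['label']}: {p['score']:.4f}")
```

Assuming the image processor was saved alongside the checkpoint (as the Trainer normally does), the pipeline handles resizing and normalization, so no manual preprocessing is needed.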

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding TrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
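
The original training script is not included, so the block below is only a sketch of how these values map onto transformers TrainingArguments. The output directory and the per-epoch evaluation strategy are assumptions (the results table below reports one evaluation per epoch); the optimizer and scheduler rely on library defaults, which match the Adam betas, epsilon, and linear schedule listed above.

```python
from transformers import TrainingArguments

# Sketch: TrainingArguments mirroring the hyperparameters listed above.
# Defaults cover the optimizer (AdamW, betas=(0.9, 0.999), eps=1e-8) and the linear
# scheduler; output_dir and evaluation_strategy are assumptions, not documented values.
training_args = TrainingArguments(
    output_dir="cinnamon-plant-50-epochs-model",  # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="epoch",  # the results table shows one evaluation per epoch
)

# These arguments would then be passed to transformers.Trainer together with the ViT
# model, the imagefolder-loaded train/eval splits, and a compute_metrics function.
```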

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 0.0428        | 1.0   | 18   | 0.2528          | 0.9167   | 0.9167 |
| 0.0218        | 2.0   | 36   | 0.3322          | 0.875    | 0.8763 |
| 0.0149        | 3.0   | 54   | 0.2954          | 0.875    | 0.8763 |
| 0.0121        | 4.0   | 72   | 0.2941          | 0.8958   | 0.8965 |
| 0.0106        | 5.0   | 90   | 0.3093          | 0.875    | 0.8763 |
| 0.0096        | 6.0   | 108  | 0.3130          | 0.8958   | 0.8965 |
| 0.0088        | 7.0   | 126  | 0.3227          | 0.875    | 0.8763 |
| 0.0082        | 8.0   | 144  | 0.3197          | 0.9167   | 0.9170 |
| 0.0077        | 9.0   | 162  | 0.3323          | 0.8958   | 0.8965 |
| 0.0073        | 10.0  | 180  | 0.3310          | 0.9167   | 0.9170 |
| 0.0069        | 11.0  | 198  | 0.3378          | 0.9167   | 0.9170 |
| 0.0066        | 12.0  | 216  | 0.3427          | 0.8958   | 0.8965 |
| 0.0064        | 13.0  | 234  | 0.3437          | 0.9167   | 0.9170 |
| 0.0061        | 14.0  | 252  | 0.3483          | 0.9167   | 0.9170 |
| 0.0059        | 15.0  | 270  | 0.3504          | 0.9167   | 0.9170 |
| 0.0057        | 16.0  | 288  | 0.3539          | 0.9167   | 0.9170 |
| 0.0055        | 17.0  | 306  | 0.3597          | 0.8958   | 0.8965 |
| 0.0054        | 18.0  | 324  | 0.3623          | 0.8958   | 0.8965 |
| 0.0052        | 19.0  | 342  | 0.3638          | 0.8958   | 0.8965 |
| 0.0051        | 20.0  | 360  | 0.3670          | 0.8958   | 0.8965 |
| 0.0049        | 21.0  | 378  | 0.3672          | 0.9167   | 0.9170 |
| 0.0048        | 22.0  | 396  | 0.3690          | 0.9167   | 0.9170 |
| 0.0047        | 23.0  | 414  | 0.3704          | 0.9167   | 0.9170 |
| 0.0046        | 24.0  | 432  | 0.3735          | 0.9167   | 0.9170 |
| 0.0045        | 25.0  | 450  | 0.3748          | 0.8958   | 0.8960 |
| 0.0044        | 26.0  | 468  | 0.3775          | 0.9167   | 0.9170 |
| 0.0044        | 27.0  | 486  | 0.3779          | 0.8958   | 0.8960 |
| 0.0043        | 28.0  | 504  | 0.3797          | 0.8958   | 0.8960 |
| 0.0042        | 29.0  | 522  | 0.3818          | 0.8958   | 0.8960 |
| 0.0041        | 30.0  | 540  | 0.3840          | 0.8958   | 0.8960 |
| 0.0041        | 31.0  | 558  | 0.3845          | 0.8958   | 0.8960 |
| 0.004         | 32.0  | 576  | 0.3861          | 0.8958   | 0.8960 |
| 0.004         | 33.0  | 594  | 0.3877          | 0.8958   | 0.8960 |
| 0.0039        | 34.0  | 612  | 0.3892          | 0.8958   | 0.8960 |
| 0.0039        | 35.0  | 630  | 0.3901          | 0.8958   | 0.8960 |
| 0.0038        | 36.0  | 648  | 0.3912          | 0.8958   | 0.8960 |
| 0.0038        | 37.0  | 666  | 0.3921          | 0.8958   | 0.8960 |
| 0.0038        | 38.0  | 684  | 0.3929          | 0.8958   | 0.8960 |
| 0.0037        | 39.0  | 702  | 0.3935          | 0.8958   | 0.8960 |
| 0.0037        | 40.0  | 720  | 0.3940          | 0.8958   | 0.8960 |
| 0.0037        | 41.0  | 738  | 0.3951          | 0.8958   | 0.8960 |
| 0.0036        | 42.0  | 756  | 0.3958          | 0.8958   | 0.8960 |
| 0.0036        | 43.0  | 774  | 0.3964          | 0.8958   | 0.8960 |
| 0.0036        | 44.0  | 792  | 0.3973          | 0.8958   | 0.8960 |
| 0.0036        | 45.0  | 810  | 0.3978          | 0.8958   | 0.8960 |
| 0.0036        | 46.0  | 828  | 0.3982          | 0.8958   | 0.8960 |
| 0.0036        | 47.0  | 846  | 0.3985          | 0.8958   | 0.8960 |
| 0.0036        | 48.0  | 864  | 0.3987          | 0.8958   | 0.8960 |
| 0.0035        | 49.0  | 882  | 0.3989          | 0.8958   | 0.8960 |
| 0.0035        | 50.0  | 900  | 0.3989          | 0.8958   | 0.8960 |
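
How the Accuracy and F1 columns were computed is not stated on the card. A typical compute_metrics callback for this kind of Trainer run, using the evaluate library, would look roughly like the sketch below; the weighted F1 average is an assumption, since the averaging mode is not documented.

```python
import numpy as np
import evaluate

# Sketch of a compute_metrics callback producing the Accuracy and F1 columns above.
# The weighted F1 average is an assumption; the card does not state which average was used.
accuracy_metric = evaluate.load("accuracy")
f1_metric = evaluate.load("f1")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    acc = accuracy_metric.compute(predictions=preds, references=labels)["accuracy"]
    f1 = f1_metric.compute(predictions=preds, references=labels, average="weighted")["f1"]
    return {"accuracy": acc, "f1": f1}
```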

Framework versions

  • Transformers 4.35.0
  • PyTorch 2.1.0+cu118
  • Datasets 2.14.6
  • Tokenizers 0.14.1