---
library_name: transformers
license: other
base_model: apple/mobilevit-small
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: my_awesome_food_model
    results: []
---

# my_awesome_food_model

This model is a fine-tuned version of [apple/mobilevit-small](https://huggingface.co/apple/mobilevit-small) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.5132
- Accuracy: 0.8636

## Model description

More information needed

## Intended uses & limitations

More information needed
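
The original card leaves this section empty. As a non-authoritative illustration of the intended use (single-label image classification), the sketch below loads a checkpoint through the `image-classification` pipeline; the model path `my_awesome_food_model` and the image path are placeholders, not values from this card.

```python
from PIL import Image
from transformers import pipeline

# Hedged usage sketch: "my_awesome_food_model" is a hypothetical local
# checkpoint directory (or Hub repo id) holding this fine-tuned model.
classifier = pipeline("image-classification", model="my_awesome_food_model")

image = Image.open("example.jpg")  # placeholder path to any RGB image
for prediction in classifier(image):
    print(prediction["label"], round(prediction["score"], 4))
```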

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100
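
For context, these values map onto `transformers.TrainingArguments` roughly as sketched below. This is an assumption-laden reconstruction, not the original training script: the output directory is invented, and the train/eval batch sizes are read as per-device values.

```python
from transformers import TrainingArguments

# Hedged sketch of a TrainingArguments configuration matching the list
# above. "output_dir" is a placeholder, and the batch sizes are assumed
# to be per device (32 x 4 accumulation steps = 128 effective).
training_args = TrainingArguments(
    output_dir="my_awesome_food_model",  # hypothetical output directory
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the
    # Trainer's default optimizer settings, so no override is needed.
)
```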

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 4.614         | 0.9829  | 43   | 4.6167          | 0.01     |
| 4.5907        | 1.9886  | 87   | 4.5880          | 0.0214   |
| 4.5382        | 2.9943  | 131  | 4.5343          | 0.105    |
| 4.4551        | 4.0     | 175  | 4.4235          | 0.3043   |
| 4.287         | 4.9829  | 218  | 4.2081          | 0.5043   |
| 3.885         | 5.9886  | 262  | 3.8133          | 0.5307   |
| 3.4412        | 6.9943  | 306  | 3.3131          | 0.4907   |
| 2.8825        | 8.0     | 350  | 2.8045          | 0.4871   |
| 2.5521        | 8.9829  | 393  | 2.3834          | 0.4671   |
| 2.2911        | 9.9886  | 437  | 2.0563          | 0.5186   |
| 2.0126        | 10.9943 | 481  | 1.8029          | 0.5679   |
| 1.8294        | 12.0    | 525  | 1.6611          | 0.605    |
| 1.745         | 12.9829 | 568  | 1.4988          | 0.6257   |
| 1.6162        | 13.9886 | 612  | 1.3868          | 0.62     |
| 1.5084        | 14.9943 | 656  | 1.2984          | 0.6429   |
| 1.4441        | 16.0    | 700  | 1.2081          | 0.6457   |
| 1.3625        | 16.9829 | 743  | 1.1554          | 0.6814   |
| 1.2752        | 17.9886 | 787  | 1.0955          | 0.6929   |
| 1.224         | 18.9943 | 831  | 1.0373          | 0.7164   |
| 1.2096        | 20.0    | 875  | 1.0375          | 0.7164   |
| 1.1551        | 20.9829 | 918  | 0.9842          | 0.7414   |
| 1.1079        | 21.9886 | 962  | 0.9645          | 0.7571   |
| 1.0669        | 22.9943 | 1006 | 0.9150          | 0.77     |
| 1.0206        | 24.0    | 1050 | 0.8508          | 0.7836   |
| 0.9963        | 24.9829 | 1093 | 0.8458          | 0.7743   |
| 0.9132        | 25.9886 | 1137 | 0.7838          | 0.7971   |
| 0.863         | 26.9943 | 1181 | 0.7590          | 0.8057   |
| 0.8669        | 28.0    | 1225 | 0.7646          | 0.785    |
| 0.8776        | 28.9829 | 1268 | 0.7084          | 0.8157   |
| 0.793         | 29.9886 | 1312 | 0.6862          | 0.82     |
| 0.7941        | 30.9943 | 1356 | 0.6971          | 0.8143   |
| 0.7863        | 32.0    | 1400 | 0.6135          | 0.8314   |
| 0.7344        | 32.9829 | 1443 | 0.5961          | 0.8407   |
| 0.6888        | 33.9886 | 1487 | 0.6304          | 0.845    |
| 0.6693        | 34.9943 | 1531 | 0.6011          | 0.8364   |
| 0.6736        | 36.0    | 1575 | 0.5917          | 0.8364   |
| 0.6739        | 36.9829 | 1618 | 0.5933          | 0.8336   |
| 0.6595        | 37.9886 | 1662 | 0.5824          | 0.8357   |
| 0.641         | 38.9943 | 1706 | 0.5232          | 0.8579   |
| 0.576         | 40.0    | 1750 | 0.5700          | 0.8393   |
| 0.6097        | 40.9829 | 1793 | 0.5384          | 0.8471   |
| 0.6016        | 41.9886 | 1837 | 0.5824          | 0.8379   |
| 0.6017        | 42.9943 | 1881 | 0.5511          | 0.8443   |
| 0.5937        | 44.0    | 1925 | 0.5095          | 0.8621   |
| 0.5674        | 44.9829 | 1968 | 0.5299          | 0.8536   |
| 0.5575        | 45.9886 | 2012 | 0.5106          | 0.8507   |
| 0.5709        | 46.9943 | 2056 | 0.5445          | 0.8507   |
| 0.5046        | 48.0    | 2100 | 0.4848          | 0.855    |
| 0.5485        | 48.9829 | 2143 | 0.5097          | 0.8564   |
| 0.4865        | 49.9886 | 2187 | 0.5227          | 0.8471   |
| 0.5505        | 50.9943 | 2231 | 0.5127          | 0.8507   |
| 0.4827        | 52.0    | 2275 | 0.5253          | 0.8493   |
| 0.5121        | 52.9829 | 2318 | 0.5095          | 0.8636   |
| 0.4879        | 53.9886 | 2362 | 0.5053          | 0.8621   |
| 0.5008        | 54.9943 | 2406 | 0.5196          | 0.8521   |
| 0.489         | 56.0    | 2450 | 0.4834          | 0.8657   |
| 0.5019        | 56.9829 | 2493 | 0.4714          | 0.8614   |
| 0.4828        | 57.9886 | 2537 | 0.5019          | 0.8571   |
| 0.4373        | 58.9943 | 2581 | 0.4894          | 0.8679   |
| 0.4444        | 60.0    | 2625 | 0.5093          | 0.8657   |
| 0.4178        | 60.9829 | 2668 | 0.5058          | 0.8614   |
| 0.4081        | 61.9886 | 2712 | 0.4996          | 0.8586   |
| 0.4311        | 62.9943 | 2756 | 0.4973          | 0.8557   |
| 0.425         | 64.0    | 2800 | 0.4627          | 0.8743   |
| 0.4147        | 64.9829 | 2843 | 0.4875          | 0.865    |
| 0.4505        | 65.9886 | 2887 | 0.4918          | 0.8636   |
| 0.3621        | 66.9943 | 2931 | 0.4903          | 0.86     |
| 0.4072        | 68.0    | 2975 | 0.4983          | 0.8564   |
| 0.3883        | 68.9829 | 3018 | 0.4635          | 0.8743   |
| 0.4284        | 69.9886 | 3062 | 0.4582          | 0.8686   |
| 0.3891        | 70.9943 | 3106 | 0.4456          | 0.8793   |
| 0.4255        | 72.0    | 3150 | 0.4760          | 0.87     |
| 0.425         | 72.9829 | 3193 | 0.4905          | 0.8721   |
| 0.4301        | 73.9886 | 3237 | 0.4942          | 0.8643   |
| 0.3666        | 74.9943 | 3281 | 0.4824          | 0.8629   |
| 0.4275        | 76.0    | 3325 | 0.4638          | 0.8671   |
| 0.4161        | 76.9829 | 3368 | 0.4859          | 0.8621   |
| 0.3773        | 77.9886 | 3412 | 0.4918          | 0.8521   |
| 0.3591        | 78.9943 | 3456 | 0.4881          | 0.8729   |
| 0.4018        | 80.0    | 3500 | 0.4681          | 0.8707   |
| 0.404         | 80.9829 | 3543 | 0.4882          | 0.86     |
| 0.3987        | 81.9886 | 3587 | 0.4796          | 0.8657   |
| 0.3546        | 82.9943 | 3631 | 0.4945          | 0.8643   |
| 0.3795        | 84.0    | 3675 | 0.4638          | 0.8679   |
| 0.4007        | 84.9829 | 3718 | 0.4624          | 0.8729   |
| 0.3783        | 85.9886 | 3762 | 0.4693          | 0.8729   |
| 0.3498        | 86.9943 | 3806 | 0.4980          | 0.8621   |
| 0.3477        | 88.0    | 3850 | 0.4705          | 0.8671   |
| 0.4022        | 88.9829 | 3893 | 0.4817          | 0.86     |
| 0.3697        | 89.9886 | 3937 | 0.4763          | 0.8629   |
| 0.3828        | 90.9943 | 3981 | 0.4867          | 0.8671   |
| 0.3842        | 92.0    | 4025 | 0.4911          | 0.865    |
| 0.3562        | 92.9829 | 4068 | 0.4562          | 0.875    |
| 0.3343        | 93.9886 | 4112 | 0.4573          | 0.8786   |
| 0.3521        | 94.9943 | 4156 | 0.4481          | 0.8843   |
| 0.3788        | 96.0    | 4200 | 0.4793          | 0.8721   |
| 0.3518        | 96.9829 | 4243 | 0.4802          | 0.8693   |
| 0.3491        | 97.9886 | 4287 | 0.4740          | 0.8686   |
| 0.4063        | 98.2857 | 4300 | 0.5132          | 0.8636   |

### Framework versions

- Transformers 4.45.1
- PyTorch 2.4.0
- Datasets 3.0.1
- Tokenizers 0.20.0