---
library_name: transformers
license: apache-2.0
base_model: google/vit-base-patch16-224-in21k
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: finetuned-fake-food
    results: []
---

# finetuned-fake-food

This model is a fine-tuned version of [google/vit-base-patch16-224-in21k](https://huggingface.co/google/vit-base-patch16-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.1451
- Accuracy: 0.9628
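
Since the base model is a ViT image classifier, the checkpoint should work with the standard `transformers` image-classification pipeline. The sketch below is illustrative only: the Hub repo id `itsLeen/finetuned-fake-food` is inferred from this card, and the image path is a placeholder.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; the repo id is inferred from this card
# and may differ from the actual Hub location.
classifier = pipeline("image-classification", model="itsLeen/finetuned-fake-food")

# Classify a local image (placeholder path).
predictions = classifier("example_food_photo.jpg")
for pred in predictions:
    print(f"{pred['label']}: {pred['score']:.4f}")
```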

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 0.0002
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
- mixed_precision_training: Native AMP
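
These settings map directly onto `transformers.TrainingArguments`. A minimal sketch, not the original training script: argument names assume Transformers 4.44, the 100-step evaluation cadence is inferred from the results table below, and `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters (not the original training script).
training_args = TrainingArguments(
    output_dir="finetuned-fake-food",  # placeholder
    learning_rate=2e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                    # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,                         # Native AMP mixed precision
    eval_strategy="steps",             # inferred from the 100-step results table
    eval_steps=100,
    logging_steps=100,
)
```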

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy |
|:-------------:|:------:|:----:|:---------------:|:--------:|
| 0.3831        | 0.1010 | 100  | 0.2663          | 0.8912   |
| 0.3699        | 0.2020 | 200  | 0.2570          | 0.8990   |
| 0.2135        | 0.3030 | 300  | 0.6753          | 0.7837   |
| 0.214         | 0.4040 | 400  | 0.2690          | 0.8901   |
| 0.1947        | 0.5051 | 500  | 0.2533          | 0.9112   |
| 0.3618        | 0.6061 | 600  | 0.3738          | 0.8571   |
| 0.2065        | 0.7071 | 700  | 0.2919          | 0.8919   |
| 0.3103        | 0.8081 | 800  | 0.2165          | 0.9169   |
| 0.1479        | 0.9091 | 900  | 0.2135          | 0.9173   |
| 0.2421        | 1.0101 | 1000 | 0.2187          | 0.9184   |
| 0.2264        | 1.1111 | 1100 | 0.1888          | 0.9205   |
| 0.1664        | 1.2121 | 1200 | 0.2607          | 0.8876   |
| 0.2049        | 1.3131 | 1300 | 0.2502          | 0.9005   |
| 0.1503        | 1.4141 | 1400 | 0.2305          | 0.9177   |
| 0.1846        | 1.5152 | 1500 | 0.1881          | 0.9219   |
| 0.1571        | 1.6162 | 1600 | 0.1788          | 0.9284   |
| 0.4091        | 1.7172 | 1700 | 0.2228          | 0.9216   |
| 0.2954        | 1.8182 | 1800 | 0.1653          | 0.9366   |
| 0.1366        | 1.9192 | 1900 | 0.1529          | 0.9420   |
| 0.1657        | 2.0202 | 2000 | 0.1745          | 0.9255   |
| 0.2531        | 2.1212 | 2100 | 0.1744          | 0.9381   |
| 0.152         | 2.2222 | 2200 | 0.2513          | 0.8951   |
| 0.145         | 2.3232 | 2300 | 0.1718          | 0.9302   |
| 0.202         | 2.4242 | 2400 | 0.2436          | 0.9033   |
| 0.1346        | 2.5253 | 2500 | 0.1839          | 0.9234   |
| 0.1554        | 2.6263 | 2600 | 0.1447          | 0.9463   |
| 0.183         | 2.7273 | 2700 | 0.2474          | 0.8822   |
| 0.0972        | 2.8283 | 2800 | 0.2223          | 0.9205   |
| 0.1073        | 2.9293 | 2900 | 0.1860          | 0.9345   |
| 0.1824        | 3.0303 | 3000 | 0.2324          | 0.9194   |
| 0.1221        | 3.1313 | 3100 | 0.1475          | 0.9449   |
| 0.1039        | 3.2323 | 3200 | 0.1480          | 0.9427   |
| 0.276         | 3.3333 | 3300 | 0.1591          | 0.9402   |
| 0.2498        | 3.4343 | 3400 | 0.2447          | 0.9098   |
| 0.1453        | 3.5354 | 3500 | 0.1556          | 0.9416   |
| 0.1794        | 3.6364 | 3600 | 0.2272          | 0.9083   |
| 0.1467        | 3.7374 | 3700 | 0.1673          | 0.9413   |
| 0.1372        | 3.8384 | 3800 | 0.1763          | 0.9341   |
| 0.2283        | 3.9394 | 3900 | 0.1671          | 0.9373   |
| 0.164         | 4.0404 | 4000 | 0.1490          | 0.9477   |
| 0.1513        | 4.1414 | 4100 | 0.1547          | 0.9488   |
| 0.0991        | 4.2424 | 4200 | 0.1536          | 0.9431   |
| 0.1419        | 4.3434 | 4300 | 0.1568          | 0.9445   |
| 0.1452        | 4.4444 | 4400 | 0.2328          | 0.9320   |
| 0.1445        | 4.5455 | 4500 | 0.1351          | 0.9513   |
| 0.1366        | 4.6465 | 4600 | 0.1571          | 0.9416   |
| 0.097         | 4.7475 | 4700 | 0.1506          | 0.9424   |
| 0.0603        | 4.8485 | 4800 | 0.1435          | 0.9499   |
| 0.1179        | 4.9495 | 4900 | 0.1754          | 0.9363   |
| 0.1948        | 5.0505 | 5000 | 0.1609          | 0.9402   |
| 0.1021        | 5.1515 | 5100 | 0.1566          | 0.9459   |
| 0.0652        | 5.2525 | 5200 | 0.1564          | 0.9481   |
| 0.1029        | 5.3535 | 5300 | 0.1410          | 0.9492   |
| 0.1014        | 5.4545 | 5400 | 0.1490          | 0.9531   |
| 0.1338        | 5.5556 | 5500 | 0.1865          | 0.9406   |
| 0.0844        | 5.6566 | 5600 | 0.1631          | 0.9456   |
| 0.1059        | 5.7576 | 5700 | 0.1738          | 0.9409   |
| 0.0788        | 5.8586 | 5800 | 0.1801          | 0.9370   |
| 0.0941        | 5.9596 | 5900 | 0.1575          | 0.9495   |
| 0.112         | 6.0606 | 6000 | 0.1796          | 0.9470   |
| 0.0691        | 6.1616 | 6100 | 0.1697          | 0.9499   |
| 0.1385        | 6.2626 | 6200 | 0.1348          | 0.9563   |
| 0.1173        | 6.3636 | 6300 | 0.1522          | 0.9502   |
| 0.046         | 6.4646 | 6400 | 0.2114          | 0.9391   |
| 0.0319        | 6.5657 | 6500 | 0.1723          | 0.9477   |
| 0.0757        | 6.6667 | 6600 | 0.1561          | 0.9527   |
| 0.0744        | 6.7677 | 6700 | 0.1587          | 0.9567   |
| 0.0341        | 6.8687 | 6800 | 0.1458          | 0.9578   |
| 0.1512        | 6.9697 | 6900 | 0.1572          | 0.9531   |
| 0.0153        | 7.0707 | 7000 | 0.1402          | 0.9617   |
| 0.0711        | 7.1717 | 7100 | 0.1527          | 0.9610   |
| 0.0453        | 7.2727 | 7200 | 0.1512          | 0.9570   |
| 0.0052        | 7.3737 | 7300 | 0.1936          | 0.9520   |
| 0.0477        | 7.4747 | 7400 | 0.1699          | 0.9513   |
| 0.091         | 7.5758 | 7500 | 0.1628          | 0.9513   |
| 0.063         | 7.6768 | 7600 | 0.1474          | 0.9578   |
| 0.0497        | 7.7778 | 7700 | 0.1389          | 0.9613   |
| 0.0552        | 7.8788 | 7800 | 0.2587          | 0.9381   |
| 0.0364        | 7.9798 | 7900 | 0.1361          | 0.9603   |
| 0.0124        | 8.0808 | 8000 | 0.1438          | 0.9606   |
| 0.0703        | 8.1818 | 8100 | 0.1577          | 0.9585   |
| 0.025         | 8.2828 | 8200 | 0.1943          | 0.9484   |
| 0.0259        | 8.3838 | 8300 | 0.1590          | 0.9613   |
| 0.0049        | 8.4848 | 8400 | 0.1521          | 0.9581   |
| 0.0174        | 8.5859 | 8500 | 0.1522          | 0.9599   |
| 0.0194        | 8.6869 | 8600 | 0.1456          | 0.9606   |
| 0.0315        | 8.7879 | 8700 | 0.1411          | 0.9599   |
| 0.0419        | 8.8889 | 8800 | 0.1426          | 0.9592   |
| 0.0193        | 8.9899 | 8900 | 0.1375          | 0.9642   |
| 0.0027        | 9.0909 | 9000 | 0.1379          | 0.9635   |
| 0.0345        | 9.1919 | 9100 | 0.1444          | 0.9631   |
| 0.0291        | 9.2929 | 9200 | 0.1492          | 0.9624   |
| 0.017         | 9.3939 | 9300 | 0.1466          | 0.9635   |
| 0.0269        | 9.4949 | 9400 | 0.1523          | 0.9631   |
| 0.003         | 9.5960 | 9500 | 0.1445          | 0.9628   |
| 0.0471        | 9.6970 | 9600 | 0.1454          | 0.9617   |
| 0.0356        | 9.7980 | 9700 | 0.1452          | 0.9620   |
| 0.0034        | 9.8990 | 9800 | 0.1445          | 0.9624   |
| 0.0162        | 10.0   | 9900 | 0.1451          | 0.9628   |

### Framework versions

- Transformers 4.44.2
- PyTorch 2.4.1+cu121
- Datasets 3.0.0
- Tokenizers 0.19.1
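
When trying to reproduce the reported numbers, it can help to confirm the local environment matches these versions. A small sanity check, assuming all four packages are installed:

```python
import datasets
import tokenizers
import torch
import transformers

# Versions this card reports for training; a mismatch does not necessarily
# break inference, but can explain small numeric drift.
expected = {
    "transformers": "4.44.2",
    "torch": "2.4.1+cu121",
    "datasets": "3.0.0",
    "tokenizers": "0.19.1",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    have = installed[name]
    status = "OK" if have == want else f"differs (card: {want})"
    print(f"{name} {have}: {status}")
```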