---
license: apache-2.0
base_model: facebook/deit-tiny-patch16-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: hushem_1x_deit_tiny_sgd_00001_fold1
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: test
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.26666666666666666
---

hushem_1x_deit_tiny_sgd_00001_fold1

This model is a fine-tuned version of facebook/deit-tiny-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 1.6650
  • Accuracy: 0.2667
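
A minimal inference sketch using the transformers pipeline API is shown below. The repository id and the image path are assumptions based on the model name; substitute your own checkpoint location and input file.

```python
# Minimal inference sketch. The repo id below is an assumption based on the
# model name; replace it with the actual hub id or local checkpoint path.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="hkivancoral/hushem_1x_deit_tiny_sgd_00001_fold1",  # assumed repo id
)

# Any RGB image file works as input; the path here is a placeholder.
predictions = classifier("example.jpg")
print(predictions)  # list of {"label": ..., "score": ...} dicts
```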

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
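
The card does not document the data beyond the imagefolder dataset type declared in the metadata. As a generic sketch only (the data_dir path and class-per-subfolder layout are assumptions), such a dataset is typically loaded as:

```python
# Generic sketch for loading an imagefolder-style dataset with Hugging Face Datasets.
# The data_dir path and one-subfolder-per-class layout are assumptions.
from datasets import load_dataset

dataset = load_dataset("imagefolder", data_dir="path/to/data")
print(dataset)              # DatasetDict with the discovered splits
print(dataset["train"][0])  # {"image": PIL.Image, "label": int}
```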

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
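
The sketch below maps these values onto transformers TrainingArguments. The output_dir and evaluation_strategy settings are assumptions not stated in the card; the Adam beta and epsilon values listed above match the Trainer defaults, so they are not set explicitly.

```python
# Configuration sketch, assuming the standard transformers Trainer API.
# Only the values listed above come from the card; other names are illustrative.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hushem_1x_deit_tiny_sgd_00001_fold1",  # assumed output path
    learning_rate=1e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    # Adam defaults already match the card: betas=(0.9, 0.999), epsilon=1e-08
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: the results table shows one eval per epoch
)
```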

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 1.0 | 6 | 1.6807 | 0.2667 |
| 1.7256 | 2.0 | 12 | 1.6798 | 0.2667 |
| 1.7256 | 3.0 | 18 | 1.6790 | 0.2667 |
| 1.715 | 4.0 | 24 | 1.6783 | 0.2667 |
| 1.7579 | 5.0 | 30 | 1.6775 | 0.2667 |
| 1.7579 | 6.0 | 36 | 1.6768 | 0.2667 |
| 1.7037 | 7.0 | 42 | 1.6761 | 0.2667 |
| 1.7037 | 8.0 | 48 | 1.6755 | 0.2667 |
| 1.6916 | 9.0 | 54 | 1.6748 | 0.2667 |
| 1.7402 | 10.0 | 60 | 1.6742 | 0.2667 |
| 1.7402 | 11.0 | 66 | 1.6736 | 0.2667 |
| 1.7036 | 12.0 | 72 | 1.6730 | 0.2667 |
| 1.7036 | 13.0 | 78 | 1.6724 | 0.2667 |
| 1.8164 | 14.0 | 84 | 1.6718 | 0.2667 |
| 1.7198 | 15.0 | 90 | 1.6713 | 0.2667 |
| 1.7198 | 16.0 | 96 | 1.6708 | 0.2667 |
| 1.7047 | 17.0 | 102 | 1.6704 | 0.2667 |
| 1.7047 | 18.0 | 108 | 1.6699 | 0.2667 |
| 1.7105 | 19.0 | 114 | 1.6695 | 0.2667 |
| 1.6839 | 20.0 | 120 | 1.6691 | 0.2667 |
| 1.6839 | 21.0 | 126 | 1.6687 | 0.2667 |
| 1.6768 | 22.0 | 132 | 1.6683 | 0.2667 |
| 1.6768 | 23.0 | 138 | 1.6679 | 0.2667 |
| 1.7332 | 24.0 | 144 | 1.6676 | 0.2667 |
| 1.69 | 25.0 | 150 | 1.6673 | 0.2667 |
| 1.69 | 26.0 | 156 | 1.6670 | 0.2667 |
| 1.6919 | 27.0 | 162 | 1.6668 | 0.2667 |
| 1.6919 | 28.0 | 168 | 1.6665 | 0.2667 |
| 1.713 | 29.0 | 174 | 1.6663 | 0.2667 |
| 1.7082 | 30.0 | 180 | 1.6661 | 0.2667 |
| 1.7082 | 31.0 | 186 | 1.6659 | 0.2667 |
| 1.7547 | 32.0 | 192 | 1.6657 | 0.2667 |
| 1.7547 | 33.0 | 198 | 1.6656 | 0.2667 |
| 1.6513 | 34.0 | 204 | 1.6654 | 0.2667 |
| 1.7419 | 35.0 | 210 | 1.6653 | 0.2667 |
| 1.7419 | 36.0 | 216 | 1.6652 | 0.2667 |
| 1.7087 | 37.0 | 222 | 1.6652 | 0.2667 |
| 1.7087 | 38.0 | 228 | 1.6651 | 0.2667 |
| 1.6162 | 39.0 | 234 | 1.6651 | 0.2667 |
| 1.6974 | 40.0 | 240 | 1.6651 | 0.2667 |
| 1.6974 | 41.0 | 246 | 1.6650 | 0.2667 |
| 1.7234 | 42.0 | 252 | 1.6650 | 0.2667 |
| 1.7234 | 43.0 | 258 | 1.6650 | 0.2667 |
| 1.7326 | 44.0 | 264 | 1.6650 | 0.2667 |
| 1.6725 | 45.0 | 270 | 1.6650 | 0.2667 |
| 1.6725 | 46.0 | 276 | 1.6650 | 0.2667 |
| 1.6993 | 47.0 | 282 | 1.6650 | 0.2667 |
| 1.6993 | 48.0 | 288 | 1.6650 | 0.2667 |
| 1.6816 | 49.0 | 294 | 1.6650 | 0.2667 |
| 1.7255 | 50.0 | 300 | 1.6650 | 0.2667 |

Framework versions

  • Transformers 4.35.0
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.6
  • Tokenizers 0.14.1
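
As a quick environment sanity check (a sketch, not part of the original training setup), the installed library versions can be compared against those listed above:

```python
# Sketch: print installed library versions to compare against the list above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # card lists 4.35.0
print("PyTorch:", torch.__version__)              # card lists 2.1.0+cu118
print("Datasets:", datasets.__version__)          # card lists 2.14.6
print("Tokenizers:", tokenizers.__version__)      # card lists 0.14.1
```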