swin-tiny-patch4-window7-224-finetuned-eurosat

This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3768
  • Accuracy: 0.8989
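
For quick use, below is a minimal inference sketch. The repository id and image path are placeholders (assumptions, not taken from this card), and the actual label set depends on the imagefolder data used for fine-tuning.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

# NOTE: placeholder repo id and image path; substitute the actual
# fine-tuned checkpoint and an image of your own.
model_id = "your-username/swin-tiny-patch4-window7-224-finetuned-eurosat"

processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.jpg").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_idx = logits.argmax(-1).item()
print(model.config.id2label[predicted_idx])
```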

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 256
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 30
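
Below is a hedged reproduction sketch of this setup. Only the hyperparameter values come from the list above; the data directory, preprocessing, and the rest of the Trainer wiring (collator, transforms, metrics) are assumptions and are deliberately left incomplete.

```python
from datasets import load_dataset
from transformers import (
    AutoImageProcessor,
    AutoModelForImageClassification,
    Trainer,
    TrainingArguments,
)

# NOTE: the data path is a placeholder; only the hyperparameters below
# are taken from this card.
dataset = load_dataset("imagefolder", data_dir="path/to/data")

checkpoint = "microsoft/swin-tiny-patch4-window7-224"
processor = AutoImageProcessor.from_pretrained(checkpoint)
labels = dataset["train"].features["label"].names
model = AutoModelForImageClassification.from_pretrained(
    checkpoint,
    num_labels=len(labels),
    ignore_mismatched_sizes=True,  # replace the ImageNet classification head
)

training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-eurosat",
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=4,   # effective train batch size 256
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    optim="adamw_torch",             # betas and epsilon are the defaults listed above
    seed=42,
    eval_strategy="epoch",
)

# A full run would also pass train/eval datasets with pixel_values, a
# collate_fn, and a compute_metrics function; omitted here for brevity.
trainer = Trainer(model=model, args=training_args)
```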

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| No log        | 0.9231  | 3    | 1.8889          | 0.2472   |
| No log        | 1.8462  | 6    | 1.7625          | 0.3820   |
| No log        | 2.7692  | 9    | 1.5603          | 0.4494   |
| 1.7854        | 4.0     | 13   | 1.3005          | 0.5281   |
| 1.7854        | 4.9231  | 16   | 1.0408          | 0.6292   |
| 1.7854        | 5.8462  | 19   | 0.8925          | 0.6854   |
| 1.1431        | 6.7692  | 22   | 0.7614          | 0.7303   |
| 1.1431        | 8.0     | 26   | 0.6343          | 0.7753   |
| 1.1431        | 8.9231  | 29   | 0.5810          | 0.7978   |
| 0.7715        | 9.8462  | 32   | 0.5551          | 0.8427   |
| 0.7715        | 10.7692 | 35   | 0.5209          | 0.8539   |
| 0.7715        | 12.0    | 39   | 0.5690          | 0.8202   |
| 0.5645        | 12.9231 | 42   | 0.4431          | 0.8876   |
| 0.5645        | 13.8462 | 45   | 0.4922          | 0.8202   |
| 0.5645        | 14.7692 | 48   | 0.4914          | 0.8315   |
| 0.4999        | 16.0    | 52   | 0.3768          | 0.8989   |
| 0.4999        | 16.9231 | 55   | 0.4292          | 0.8539   |
| 0.4999        | 17.8462 | 58   | 0.3846          | 0.8652   |
| 0.4555        | 18.7692 | 61   | 0.3498          | 0.8876   |
| 0.4555        | 20.0    | 65   | 0.3523          | 0.8652   |
| 0.4555        | 20.9231 | 68   | 0.3541          | 0.8876   |
| 0.3941        | 21.8462 | 71   | 0.3240          | 0.8989   |
| 0.3941        | 22.7692 | 74   | 0.3169          | 0.8989   |
| 0.3941        | 24.0    | 78   | 0.3317          | 0.8764   |
| 0.361         | 24.9231 | 81   | 0.3251          | 0.8876   |
| 0.361         | 25.8462 | 84   | 0.3198          | 0.8764   |
| 0.361         | 26.7692 | 87   | 0.3117          | 0.8764   |
| 0.3485        | 27.6923 | 90   | 0.3101          | 0.8764   |

Framework versions

  • Transformers 4.46.3
  • PyTorch 2.4.0
  • Datasets 3.1.0
  • Tokenizers 0.20.3
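
To reproduce the environment, a quick check of the locally installed versions against this card can look like the sketch below (a convenience snippet, not part of the original training code).

```python
import importlib.metadata as md

# Versions listed on this card; flag any local package that differs.
expected = {
    "transformers": "4.46.3",
    "torch": "2.4.0",
    "datasets": "3.1.0",
    "tokenizers": "0.20.3",
}

for package, version in expected.items():
    installed = md.version(package)
    status = "OK" if installed == version else f"differs (card lists {version})"
    print(f"{package} {installed}: {status}")
```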