
swin-tiny-patch4-window7-224-finetuned-leukemia.v2.2

This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5507
  • Accuracy: 0.7638
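
Intended uses are not documented, so the following is only a minimal, hedged inference sketch using the standard `transformers` Auto classes; the input file name is a placeholder and the label set depends on the (unspecified) training dataset:

```python
# Minimal inference sketch (assumptions: single-label image classification,
# placeholder input file "cell_image.png").
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "DouglasBraga/swin-tiny-patch4-window7-224-finetuned-leukemia.v2.2"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("cell_image.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```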

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto `TrainingArguments` follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 30
  • mixed_precision_training: Native AMP
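
The settings above map onto Hugging Face `TrainingArguments` roughly as follows. This is a sketch only: the exact training script, dataset loading, and metric computation are not documented, and the output directory, evaluation/save strategy, and best-model selection are assumptions (per-epoch evaluation is inferred from the results table below):

```python
# Hedged sketch: TrainingArguments mirroring the listed hyperparameters.
# output_dir, eval/save strategy, and metric choice are assumptions,
# not documented settings.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-leukemia.v2.2",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,   # 32 * 4 = 128 total train batch size
    num_train_epochs=30,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    fp16=True,                       # Native AMP mixed-precision training
    eval_strategy="epoch",           # assumed: results are reported per epoch
    save_strategy="epoch",
    load_best_model_at_end=True,
    metric_for_best_model="accuracy",
)
```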

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 0.2349        | 0.9984  | 312  | 0.5575          | 0.7698   |
| 0.2191        | 2.0     | 625  | 0.5572          | 0.7618   |
| 0.2124        | 2.9984  | 937  | 0.5580          | 0.7690   |
| 0.2207        | 4.0     | 1250 | 0.5500          | 0.7630   |
| 0.2143        | 4.9984  | 1562 | 0.5575          | 0.7652   |
| 0.2191        | 6.0     | 1875 | 0.5486          | 0.7728   |
| 0.2063        | 6.9984  | 2187 | 0.5594          | 0.7615   |
| 0.2070        | 8.0     | 2500 | 0.5405          | 0.7695   |
| 0.2273        | 8.9984  | 2812 | 0.5568          | 0.7672   |
| 0.2136        | 10.0    | 3125 | 0.5483          | 0.7728   |
| 0.2184        | 10.9984 | 3437 | 0.5606          | 0.7665   |
| 0.2120        | 12.0    | 3750 | 0.5578          | 0.7610   |
| 0.1903        | 12.9984 | 4062 | 0.5371          | 0.7690   |
| 0.2487        | 14.0    | 4375 | 0.5582          | 0.7645   |
| 0.2025        | 14.9984 | 4687 | 0.5414          | 0.7778   |
| 0.2207        | 16.0    | 5000 | 0.5376          | 0.7685   |
| 0.2012        | 16.9984 | 5312 | 0.5489          | 0.7702   |
| 0.2198        | 18.0    | 5625 | 0.5560          | 0.7752   |
| 0.2171        | 18.9984 | 5937 | 0.5570          | 0.7725   |
| 0.2116        | 20.0    | 6250 | 0.5622          | 0.7625   |
| 0.2162        | 20.9984 | 6562 | 0.5587          | 0.7668   |
| 0.2240        | 22.0    | 6875 | 0.5456          | 0.7712   |
| 0.2120        | 22.9984 | 7187 | 0.5647          | 0.7652   |
| 0.2084        | 24.0    | 7500 | 0.5533          | 0.7672   |
| 0.2226        | 24.9984 | 7812 | 0.5434          | 0.7705   |
| 0.2173        | 26.0    | 8125 | 0.5738          | 0.7675   |
| 0.2216        | 26.9984 | 8437 | 0.5557          | 0.7672   |
| 0.1918        | 28.0    | 8750 | 0.5502          | 0.7705   |
| 0.1990        | 28.9984 | 9062 | 0.5456          | 0.7675   |
| 0.2100        | 29.9520 | 9360 | 0.5483          | 0.7715   |

Framework versions

  • Transformers 4.45.2
  • Pytorch 2.4.1+cu121
  • Datasets 3.0.1
  • Tokenizers 0.20.1
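
To confirm that a local environment matches these versions, a quick check (assuming all four packages are installed):

```python
# Print installed versions to compare against the pins listed above.
import datasets
import tokenizers
import torch
import transformers

for name, module in [
    ("Transformers", transformers),
    ("PyTorch", torch),
    ("Datasets", datasets),
    ("Tokenizers", tokenizers),
]:
    print(f"{name}: {module.__version__}")
```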
