
swiftformer-xs-dmae-va-U-80

This model is a fine-tuned version of MBZUAI/swiftformer-xs on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4325
  • Accuracy: 0.8257

Model description

More information needed

Intended uses & limitations

More information needed
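
The base checkpoint MBZUAI/swiftformer-xs is an ImageNet-style image classifier, so this fine-tune is presumably intended for image classification on the (unspecified) target dataset. Below is a minimal inference sketch, assuming the repository ships the usual image-processor config; the image path is a placeholder and the label names depend on the unknown training data.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo = "Augusto777/swiftformer-xs-dmae-va-U-80"
processor = AutoImageProcessor.from_pretrained(repo)
model = AutoModelForImageClassification.from_pretrained(repo)
model.eval()

image = Image.open("example.jpg").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted = logits.argmax(-1).item()
print(model.config.id2label[predicted])  # label names come from the fine-tuning dataset
```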

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 80
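
For reference, the list above maps onto Hugging Face TrainingArguments roughly as sketched below. This is not the exact training script: the output directory and the evaluation/save strategy are assumptions not stated in the card, and the Adam betas/epsilon listed above are the Trainer defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

# Sketch of the configuration implied by the hyperparameter list above.
training_args = TrainingArguments(
    output_dir="swiftformer-xs-dmae-va-U-80",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,     # 32 x 4 = 128 total train batch size
    num_train_epochs=80,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    evaluation_strategy="epoch",       # assumption: the results table logs one eval per epoch
    save_strategy="epoch",             # assumption
    load_best_model_at_end=True,       # assumption
    metric_for_best_model="accuracy",  # assumption
)
```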

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 0.9 | 7 | 1.3863 | 0.2844 |
| 1.4158 | 1.94 | 15 | 1.3760 | 0.3119 |
| 1.3853 | 2.97 | 23 | 1.3548 | 0.3853 |
| 1.3745 | 4.0 | 31 | 1.3327 | 0.3394 |
| 1.3745 | 4.9 | 38 | 1.2938 | 0.4220 |
| 1.3435 | 5.94 | 46 | 1.2450 | 0.4679 |
| 1.2681 | 6.97 | 54 | 1.1933 | 0.5596 |
| 1.1803 | 8.0 | 62 | 1.1410 | 0.4771 |
| 1.1803 | 8.9 | 69 | 1.1014 | 0.5046 |
| 1.1277 | 9.94 | 77 | 1.0785 | 0.5321 |
| 1.0674 | 10.97 | 85 | 1.0440 | 0.5596 |
| 1.0353 | 12.0 | 93 | 0.9962 | 0.5780 |
| 0.9859 | 12.9 | 100 | 0.9700 | 0.5872 |
| 0.9859 | 13.94 | 108 | 0.9402 | 0.6422 |
| 0.9397 | 14.97 | 116 | 0.9215 | 0.6239 |
| 0.8959 | 16.0 | 124 | 0.8745 | 0.6606 |
| 0.8663 | 16.9 | 131 | 0.8561 | 0.6697 |
| 0.8663 | 17.94 | 139 | 0.8182 | 0.6789 |
| 0.8405 | 18.97 | 147 | 0.8168 | 0.6514 |
| 0.8093 | 20.0 | 155 | 0.8039 | 0.6789 |
| 0.7396 | 20.9 | 162 | 0.7478 | 0.7064 |
| 0.7588 | 21.94 | 170 | 0.7237 | 0.6972 |
| 0.7588 | 22.97 | 178 | 0.7031 | 0.7156 |
| 0.7189 | 24.0 | 186 | 0.6956 | 0.6972 |
| 0.7111 | 24.9 | 193 | 0.6749 | 0.7248 |
| 0.6577 | 25.94 | 201 | 0.6758 | 0.6972 |
| 0.6577 | 26.97 | 209 | 0.6429 | 0.7339 |
| 0.6681 | 28.0 | 217 | 0.6451 | 0.7064 |
| 0.6238 | 28.9 | 224 | 0.6368 | 0.7339 |
| 0.6136 | 29.94 | 232 | 0.6233 | 0.7706 |
| 0.5934 | 30.97 | 240 | 0.6161 | 0.7706 |
| 0.5934 | 32.0 | 248 | 0.6268 | 0.7431 |
| 0.5807 | 32.9 | 255 | 0.5879 | 0.7982 |
| 0.575 | 33.94 | 263 | 0.5772 | 0.7706 |
| 0.5409 | 34.97 | 271 | 0.5703 | 0.7798 |
| 0.5409 | 36.0 | 279 | 0.5603 | 0.7890 |
| 0.553 | 36.9 | 286 | 0.5560 | 0.8073 |
| 0.515 | 37.94 | 294 | 0.5639 | 0.7706 |
| 0.5424 | 38.97 | 302 | 0.5483 | 0.7890 |
| 0.5193 | 40.0 | 310 | 0.5505 | 0.7798 |
| 0.5193 | 40.9 | 317 | 0.5323 | 0.8073 |
| 0.5123 | 41.94 | 325 | 0.5257 | 0.7982 |
| 0.4719 | 42.97 | 333 | 0.5270 | 0.7798 |
| 0.4583 | 44.0 | 341 | 0.5305 | 0.7706 |
| 0.4583 | 44.9 | 348 | 0.5282 | 0.7798 |
| 0.4568 | 45.94 | 356 | 0.5178 | 0.7890 |
| 0.4717 | 46.97 | 364 | 0.4945 | 0.7982 |
| 0.4587 | 48.0 | 372 | 0.4978 | 0.7982 |
| 0.4587 | 48.9 | 379 | 0.4888 | 0.7890 |
| 0.4314 | 49.94 | 387 | 0.4867 | 0.7982 |
| 0.4389 | 50.97 | 395 | 0.4739 | 0.7890 |
| 0.4115 | 52.0 | 403 | 0.4844 | 0.7982 |
| 0.4323 | 52.9 | 410 | 0.4819 | 0.7982 |
| 0.4323 | 53.94 | 418 | 0.4562 | 0.7982 |
| 0.3855 | 54.97 | 426 | 0.4640 | 0.8073 |
| 0.4113 | 56.0 | 434 | 0.4474 | 0.8165 |
| 0.4282 | 56.9 | 441 | 0.4540 | 0.7982 |
| 0.4282 | 57.94 | 449 | 0.4450 | 0.8165 |
| 0.4499 | 58.97 | 457 | 0.4497 | 0.8165 |
| 0.4179 | 60.0 | 465 | 0.4400 | 0.8073 |
| 0.4213 | 60.9 | 472 | 0.4392 | 0.8073 |
| 0.4176 | 61.94 | 480 | 0.4325 | 0.8257 |
| 0.4176 | 62.97 | 488 | 0.4296 | 0.8165 |
| 0.4083 | 64.0 | 496 | 0.4388 | 0.8165 |
| 0.3853 | 64.9 | 503 | 0.4392 | 0.8073 |
| 0.3647 | 65.94 | 511 | 0.4349 | 0.8073 |
| 0.3647 | 66.97 | 519 | 0.4344 | 0.8257 |
| 0.3927 | 68.0 | 527 | 0.4348 | 0.8073 |
| 0.3833 | 68.9 | 534 | 0.4352 | 0.8073 |
| 0.3932 | 69.94 | 542 | 0.4294 | 0.8165 |
| 0.4085 | 70.97 | 550 | 0.4276 | 0.8073 |
| 0.4085 | 72.0 | 558 | 0.4232 | 0.8073 |
| 0.4029 | 72.26 | 560 | 0.4359 | 0.8165 |
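
The accuracy values above were presumably produced by a compute_metrics callback passed to the Trainer. The card does not include the metric code, so the snippet below is only the conventional setup from the Transformers image-classification examples; it assumes the separate evaluate package, which is not listed under framework versions.

```python
import numpy as np
import evaluate  # assumption: not listed under the framework versions below

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    # eval_pred is a (logits, labels) pair supplied by the Trainer at evaluation time
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return accuracy.compute(predictions=predictions, references=labels)
```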

Framework versions

  • Transformers 4.35.2
  • PyTorch 2.1.0+cu121
  • Datasets 2.16.1
  • Tokenizers 0.15.1
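
A quick way to check a local environment against the versions listed above (the CUDA suffix on the PyTorch build will vary by platform):

```python
import datasets
import tokenizers
import torch
import transformers

# Expected per the card: 4.35.2, 2.1.0+cu121, 2.16.1, 0.15.1
print(transformers.__version__)
print(torch.__version__)
print(datasets.__version__)
print(tokenizers.__version__)
```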
