swinModel

This model is a fine-tuned version of facebook/convnext-tiny-224 on an unknown dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the results below):

  • Loss: 1.4645
  • Accuracy: 0.7823
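
A minimal inference sketch for loading this checkpoint with the Transformers image-classification classes is shown below. The repository id peter881122/roadnModel and the image path are assumptions; substitute the actual repo id and an image of your own.

```python
# Minimal inference sketch.
# Assumptions: the checkpoint lives at "peter881122/roadnModel" and
# "example.jpg" is a local image; adjust both to your setup.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "peter881122/roadnModel"  # assumed repository id
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("example.jpg").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```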

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 15
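
Below is a minimal sketch of these settings expressed as Hugging Face TrainingArguments. The output directory and evaluation strategy are assumptions, and the 100-step evaluation cadence is inferred from the results table rather than stated on the card.

```python
# Sketch of the hyperparameters above as Hugging Face TrainingArguments.
# output_dir and evaluation_strategy are assumptions; the other values
# mirror the list above (Adam betas/epsilon are the Trainer defaults).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swinModel",           # assumed output directory
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    evaluation_strategy="steps",      # assumption
    eval_steps=100,                   # matches the 100-step cadence in the results table
)
```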

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.5524 | 0.2278 | 100 | 0.3380 | 0.9845 |
| 0.4727 | 0.4556 | 200 | 0.3134 | 0.9439 |
| 0.3821 | 0.6834 | 300 | 0.3179 | 0.8939 |
| 0.2765 | 0.9112 | 400 | 0.3308 | 0.8603 |
| 0.1905 | 1.1390 | 500 | 0.4489 | 0.8069 |
| 0.1258 | 1.3667 | 600 | 0.5830 | 0.7731 |
| 0.0846 | 1.5945 | 700 | 0.4515 | 0.8439 |
| 0.064 | 1.8223 | 800 | 0.5274 | 0.8248 |
| 0.0494 | 2.0501 | 900 | 0.6575 | 0.7969 |
| 0.0378 | 2.2779 | 1000 | 0.6267 | 0.8261 |
| 0.0284 | 2.5057 | 1100 | 0.8875 | 0.7677 |
| 0.023 | 2.7335 | 1200 | 1.0218 | 0.7502 |
| 0.0225 | 2.9613 | 1300 | 0.8597 | 0.7930 |
| 0.0158 | 3.1891 | 1400 | 0.9559 | 0.7875 |
| 0.0134 | 3.4169 | 1500 | 0.7133 | 0.8378 |
| 0.0146 | 3.6446 | 1600 | 0.8297 | 0.8159 |
| 0.0116 | 3.8724 | 1700 | 0.9716 | 0.7930 |
| 0.0099 | 4.1002 | 1800 | 0.8118 | 0.8289 |
| 0.009 | 4.3280 | 1900 | 0.8361 | 0.8305 |
| 0.0059 | 4.5558 | 2000 | 0.9536 | 0.8127 |
| 0.009 | 4.7836 | 2100 | 1.0436 | 0.8003 |
| 0.0107 | 5.0114 | 2200 | 1.0988 | 0.7929 |
| 0.0077 | 5.2392 | 2300 | 0.9100 | 0.8344 |
| 0.007 | 5.4670 | 2400 | 0.9920 | 0.8186 |
| 0.0037 | 5.6948 | 2500 | 1.0256 | 0.8130 |
| 0.0073 | 5.9226 | 2600 | 1.5456 | 0.7387 |
| 0.0055 | 6.1503 | 2700 | 1.2020 | 0.7793 |
| 0.0039 | 6.3781 | 2800 | 1.1095 | 0.8048 |
| 0.0022 | 6.6059 | 2900 | 1.2638 | 0.7887 |
| 0.0042 | 6.8337 | 3000 | 1.0389 | 0.8263 |
| 0.005 | 7.0615 | 3100 | 1.3570 | 0.7763 |
| 0.0017 | 7.2893 | 3200 | 1.6866 | 0.7303 |
| 0.0024 | 7.5171 | 3300 | 1.4244 | 0.7679 |
| 0.0036 | 7.7449 | 3400 | 1.4379 | 0.7609 |
| 0.0032 | 7.9727 | 3500 | 1.1855 | 0.8006 |
| 0.0016 | 8.2005 | 3600 | 1.1089 | 0.8163 |
| 0.0023 | 8.4282 | 3700 | 0.9546 | 0.8441 |
| 0.0022 | 8.6560 | 3800 | 1.0083 | 0.8378 |
| 0.002 | 8.8838 | 3900 | 1.6526 | 0.7368 |
| 0.0032 | 9.1116 | 4000 | 1.5307 | 0.7619 |
| 0.0008 | 9.3394 | 4100 | 1.1384 | 0.8191 |
| 0.002 | 9.5672 | 4200 | 1.2104 | 0.8063 |
| 0.0031 | 9.7950 | 4300 | 1.5793 | 0.7564 |
| 0.0024 | 10.0228 | 4400 | 1.3544 | 0.7857 |
| 0.0035 | 10.2506 | 4500 | 1.5046 | 0.7667 |
| 0.0009 | 10.4784 | 4600 | 1.8010 | 0.7306 |
| 0.0007 | 10.7062 | 4700 | 1.2062 | 0.8115 |
| 0.0025 | 10.9339 | 4800 | 1.2110 | 0.8127 |
| 0.0016 | 11.1617 | 4900 | 1.3772 | 0.7875 |
| 0.001 | 11.3895 | 5000 | 1.3586 | 0.7947 |
| 0.0024 | 11.6173 | 5100 | 1.2359 | 0.8094 |
| 0.0012 | 11.8451 | 5200 | 0.8793 | 0.8679 |
| 0.0011 | 12.0729 | 5300 | 1.5563 | 0.7648 |
| 0.0021 | 12.3007 | 5400 | 1.3154 | 0.8003 |
| 0.0018 | 12.5285 | 5500 | 1.2115 | 0.8168 |
| 0.001 | 12.7563 | 5600 | 1.4905 | 0.7773 |
| 0.0012 | 12.9841 | 5700 | 1.4290 | 0.7868 |
| 0.0022 | 13.2118 | 5800 | 1.1928 | 0.8214 |
| 0.0023 | 13.4396 | 5900 | 1.2761 | 0.8077 |
| 0.0014 | 13.6674 | 6000 | 1.1804 | 0.8211 |
| 0.0021 | 13.8952 | 6100 | 1.3523 | 0.7965 |
| 0.0007 | 14.1230 | 6200 | 1.2330 | 0.8128 |
| 0.0008 | 14.3508 | 6300 | 1.3563 | 0.7955 |
| 0.0004 | 14.5786 | 6400 | 1.3969 | 0.7903 |
| 0.0011 | 14.8064 | 6500 | 1.4645 | 0.7823 |

Framework versions

  • Transformers 4.42.0.dev0
  • PyTorch 2.1.1
  • Datasets 2.19.2
  • Tokenizers 0.19.1
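
When reproducing or debugging results, it can help to confirm that the local stack matches these versions. A small check, assuming all four packages are installed:

```python
# Print installed versions to compare against the ones listed above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # card reports 4.42.0.dev0
print("PyTorch:", torch.__version__)              # 2.1.1
print("Datasets:", datasets.__version__)          # 2.19.2
print("Tokenizers:", tokenizers.__version__)      # 0.19.1
```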

Model size

27.8M parameters (Safetensors, F32)