---
license: other
tags:
  - generated_from_trainer
model-index:
  - name: segformer-b0-finetuned-segments-toolwear
    results: []
---

# segformer-b0-finetuned-segments-toolwear

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.0387
- Mean Iou: 0.4153
- Mean Accuracy: 0.8306
- Overall Accuracy: 0.8306
- Accuracy Unlabeled: nan
- Accuracy Tool: nan
- Accuracy Wear: 0.8306
- Iou Unlabeled: 0.0
- Iou Tool: nan
- Iou Wear: 0.8306
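
A minimal inference sketch, assuming the checkpoint is published on the Hub as `HorcruxNo13/segformer-b0-finetuned-segments-toolwear` (inferred from this repository's name; adjust if it is hosted elsewhere) and that `tool.png` stands in for a real input image:

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Assumed Hub id, taken from the repository name.
checkpoint = "HorcruxNo13/segformer-b0-finetuned-segments-toolwear"

processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("tool.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 of the input resolution, so upsample the
# logits back to the image size before taking the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]  # per-pixel class indices
```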

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
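
As referenced above, a sketch of `TrainingArguments` matching these values. The `output_dir`, `evaluation_strategy`, and `eval_steps` settings are assumptions (the results table below reports a validation run every 20 steps); the listed Adam betas and epsilon are the `Trainer` defaults and need no explicit arguments:

```python
from transformers import TrainingArguments

# Sketch only: the model, datasets, and compute_metrics must still be
# supplied to a Trainer separately.
training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-toolwear",  # assumed
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="steps",  # assumed from the eval log
    eval_steps=20,                # the results table logs every 20 steps
)
```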

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Tool | Accuracy Wear | Iou Unlabeled | Iou Tool | Iou Wear |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.8874 | 1.18 | 20 | 0.8947 | 0.4560 | 0.9119 | 0.9119 | nan | nan | 0.9119 | 0.0 | nan | 0.9119 |
| 0.5792 | 2.35 | 40 | 0.5701 | 0.2551 | 0.7653 | 0.7653 | nan | nan | 0.7653 | 0.0 | 0.0 | 0.7653 |
| 0.5031 | 3.53 | 60 | 0.4364 | 0.4652 | 0.9305 | 0.9305 | nan | nan | 0.9305 | 0.0 | nan | 0.9305 |
| 0.4025 | 4.71 | 80 | 0.4218 | 0.4529 | 0.9058 | 0.9058 | nan | nan | 0.9058 | 0.0 | nan | 0.9058 |
| 0.3212 | 5.88 | 100 | 0.3115 | 0.4447 | 0.8894 | 0.8894 | nan | nan | 0.8894 | 0.0 | nan | 0.8894 |
| 0.2797 | 7.06 | 120 | 0.2646 | 0.3291 | 0.6582 | 0.6582 | nan | nan | 0.6582 | 0.0 | nan | 0.6582 |
| 0.2143 | 8.24 | 140 | 0.2223 | 0.4177 | 0.8354 | 0.8354 | nan | nan | 0.8354 | 0.0 | nan | 0.8354 |
| 0.1951 | 9.41 | 160 | 0.1815 | 0.4313 | 0.8625 | 0.8625 | nan | nan | 0.8625 | 0.0 | nan | 0.8625 |
| 0.1475 | 10.59 | 180 | 0.1571 | 0.4014 | 0.8029 | 0.8029 | nan | nan | 0.8029 | 0.0 | nan | 0.8029 |
| 0.1523 | 11.76 | 200 | 0.1386 | 0.4242 | 0.8485 | 0.8485 | nan | nan | 0.8485 | 0.0 | nan | 0.8485 |
| 0.1324 | 12.94 | 220 | 0.1127 | 0.4429 | 0.8858 | 0.8858 | nan | nan | 0.8858 | 0.0 | nan | 0.8858 |
| 0.0977 | 14.12 | 240 | 0.1064 | 0.4458 | 0.8916 | 0.8916 | nan | nan | 0.8916 | 0.0 | nan | 0.8916 |
| 0.0858 | 15.29 | 260 | 0.0915 | 0.4561 | 0.9122 | 0.9122 | nan | nan | 0.9122 | 0.0 | nan | 0.9122 |
| 0.0782 | 16.47 | 280 | 0.0934 | 0.4611 | 0.9223 | 0.9223 | nan | nan | 0.9223 | 0.0 | nan | 0.9223 |
| 0.0763 | 17.65 | 300 | 0.0757 | 0.4542 | 0.9084 | 0.9084 | nan | nan | 0.9084 | 0.0 | nan | 0.9084 |
| 0.0665 | 18.82 | 320 | 0.0718 | 0.4259 | 0.8518 | 0.8518 | nan | nan | 0.8518 | 0.0 | nan | 0.8518 |
| 0.0658 | 20.0 | 340 | 0.0636 | 0.3842 | 0.7685 | 0.7685 | nan | nan | 0.7685 | 0.0 | nan | 0.7685 |
| 0.0672 | 21.18 | 360 | 0.0590 | 0.4212 | 0.8425 | 0.8425 | nan | nan | 0.8425 | 0.0 | nan | 0.8425 |
| 0.05 | 22.35 | 380 | 0.0586 | 0.4502 | 0.9005 | 0.9005 | nan | nan | 0.9005 | 0.0 | nan | 0.9005 |
| 0.0525 | 23.53 | 400 | 0.0546 | 0.3913 | 0.7827 | 0.7827 | nan | nan | 0.7827 | 0.0 | nan | 0.7827 |
| 0.0451 | 24.71 | 420 | 0.0528 | 0.4383 | 0.8767 | 0.8767 | nan | nan | 0.8767 | 0.0 | nan | 0.8767 |
| 0.0407 | 25.88 | 440 | 0.0494 | 0.4337 | 0.8675 | 0.8675 | nan | nan | 0.8675 | 0.0 | nan | 0.8675 |
| 0.0462 | 27.06 | 460 | 0.0510 | 0.3397 | 0.6795 | 0.6795 | nan | nan | 0.6795 | 0.0 | nan | 0.6795 |
| 0.0376 | 28.24 | 480 | 0.0451 | 0.4271 | 0.8541 | 0.8541 | nan | nan | 0.8541 | 0.0 | nan | 0.8541 |
| 0.0349 | 29.41 | 500 | 0.0456 | 0.4173 | 0.8346 | 0.8346 | nan | nan | 0.8346 | 0.0 | nan | 0.8346 |
| 0.0406 | 30.59 | 520 | 0.0449 | 0.3863 | 0.7726 | 0.7726 | nan | nan | 0.7726 | 0.0 | nan | 0.7726 |
| 0.0333 | 31.76 | 540 | 0.0438 | 0.4361 | 0.8721 | 0.8721 | nan | nan | 0.8721 | 0.0 | nan | 0.8721 |
| 0.0331 | 32.94 | 560 | 0.0480 | 0.3417 | 0.6834 | 0.6834 | nan | nan | 0.6834 | 0.0 | nan | 0.6834 |
| 0.0756 | 34.12 | 580 | 0.0420 | 0.4362 | 0.8723 | 0.8723 | nan | nan | 0.8723 | 0.0 | nan | 0.8723 |
| 0.0295 | 35.29 | 600 | 0.0437 | 0.3674 | 0.7349 | 0.7349 | nan | nan | 0.7349 | 0.0 | nan | 0.7349 |
| 0.0325 | 36.47 | 620 | 0.0409 | 0.4087 | 0.8174 | 0.8174 | nan | nan | 0.8174 | 0.0 | nan | 0.8174 |
| 0.0299 | 37.65 | 640 | 0.0405 | 0.4150 | 0.8299 | 0.8299 | nan | nan | 0.8299 | 0.0 | nan | 0.8299 |
| 0.0384 | 38.82 | 660 | 0.0416 | 0.3690 | 0.7380 | 0.7380 | nan | nan | 0.7380 | 0.0 | nan | 0.7380 |
| 0.0269 | 40.0 | 680 | 0.0393 | 0.4356 | 0.8713 | 0.8713 | nan | nan | 0.8713 | 0.0 | nan | 0.8713 |
| 0.025 | 41.18 | 700 | 0.0389 | 0.3976 | 0.7952 | 0.7952 | nan | nan | 0.7952 | 0.0 | nan | 0.7952 |
| 0.0256 | 42.35 | 720 | 0.0392 | 0.3729 | 0.7459 | 0.7459 | nan | nan | 0.7459 | 0.0 | nan | 0.7459 |
| 0.0303 | 43.53 | 740 | 0.0400 | 0.3869 | 0.7738 | 0.7738 | nan | nan | 0.7738 | 0.0 | nan | 0.7738 |
| 0.0244 | 44.71 | 760 | 0.0389 | 0.4022 | 0.8044 | 0.8044 | nan | nan | 0.8044 | 0.0 | nan | 0.8044 |
| 0.03 | 45.88 | 780 | 0.0387 | 0.4003 | 0.8006 | 0.8006 | nan | nan | 0.8006 | 0.0 | nan | 0.8006 |
| 0.0238 | 47.06 | 800 | 0.0384 | 0.4073 | 0.8147 | 0.8147 | nan | nan | 0.8147 | 0.0 | nan | 0.8147 |
| 0.0278 | 48.24 | 820 | 0.0394 | 0.4151 | 0.8302 | 0.8302 | nan | nan | 0.8302 | 0.0 | nan | 0.8302 |
| 0.0281 | 49.41 | 840 | 0.0387 | 0.4153 | 0.8306 | 0.8306 | nan | nan | 0.8306 | 0.0 | nan | 0.8306 |

### Framework versions

- Transformers 4.28.0
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.13.3
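
A quick sanity check for reproducing this setup, printing the locally installed versions for comparison against the list above:

```python
import datasets
import tokenizers
import torch
import transformers

# Compare the local install against the versions pinned in this card.
print("Transformers:", transformers.__version__)  # 4.28.0
print("PyTorch:", torch.__version__)              # 2.1.0+cu121
print("Datasets:", datasets.__version__)          # 2.16.0
print("Tokenizers:", tokenizers.__version__)      # 0.13.3
```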