---
license: other
tags:
  - generated_from_trainer
model-index:
  - name: segformer-b0-finetuned-segments-toolwear
    results: []
---

# segformer-b0-finetuned-segments-toolwear

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset. It achieves the following results on the evaluation set (a usage sketch follows the list):

- Loss: 0.0406
- Mean Iou: 0.3913
- Mean Accuracy: 0.7826
- Overall Accuracy: 0.7826
- Accuracy Unlabeled: nan
- Accuracy Tool: nan
- Accuracy Wear: 0.7826
- Iou Unlabeled: 0.0
- Iou Tool: nan
- Iou Wear: 0.7826

Note that Mean Iou is the average of the per-class IoUs that are defined: (0.0 + 0.7826) / 2 = 0.3913. The `nan` entries are per-class values that could not be computed, most likely because those classes never occur in the evaluation labels.
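
Since the card does not yet include a usage section, here is a minimal inference sketch. The checkpoint id `HorcruxNo13/segformer-b0-finetuned-segments-toolwear` is inferred from this repository's name and `tool.png` is a hypothetical input image; both are assumptions rather than part of the original card.

```python
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

# Checkpoint id is an assumption based on this repository's name.
checkpoint = "HorcruxNo13/segformer-b0-finetuned-segments-toolwear"
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("tool.png").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# SegFormer outputs logits at 1/4 resolution; upsample to the input size,
# then take the per-pixel argmax to get class ids.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation = upsampled.argmax(dim=1)[0]
```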

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training; an equivalent `TrainingArguments` sketch follows the list:

- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
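
As a rough reconstruction only, the listed hyperparameters map onto `transformers.TrainingArguments` as sketched below. The output directory and anything not stated above (evaluation/save cadence, weight decay, etc.) are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-toolwear",  # assumed
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    # Note: Trainer defaults to AdamW; the card lists Adam with these betas/epsilon.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=50,
)
```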

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Tool | Accuracy Wear | Iou Unlabeled | Iou Tool | Iou Wear |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-------------:|:-------------:|:-------------:|:--------:|:--------:|
| 0.7907 | 1.18 | 20 | 0.8970 | 0.3905 | 0.7810 | 0.7810 | nan | nan | 0.7810 | 0.0 | nan | 0.7810 |
| 0.515 | 2.35 | 40 | 0.4998 | 0.3753 | 0.7506 | 0.7506 | nan | nan | 0.7506 | 0.0 | nan | 0.7506 |
| 0.405 | 3.53 | 60 | 0.3773 | 0.4074 | 0.8148 | 0.8148 | nan | nan | 0.8148 | 0.0 | nan | 0.8148 |
| 0.3532 | 4.71 | 80 | 0.3191 | 0.4127 | 0.8255 | 0.8255 | nan | nan | 0.8255 | 0.0 | nan | 0.8255 |
| 0.2912 | 5.88 | 100 | 0.2693 | 0.4314 | 0.8628 | 0.8628 | nan | nan | 0.8628 | 0.0 | nan | 0.8628 |
| 0.2128 | 7.06 | 120 | 0.2297 | 0.4067 | 0.8133 | 0.8133 | nan | nan | 0.8133 | 0.0 | nan | 0.8133 |
| 0.1676 | 8.24 | 140 | 0.1849 | 0.4101 | 0.8203 | 0.8203 | nan | nan | 0.8203 | 0.0 | nan | 0.8203 |
| 0.1712 | 9.41 | 160 | 0.1446 | 0.3677 | 0.7354 | 0.7354 | nan | nan | 0.7354 | 0.0 | nan | 0.7354 |
| 0.1344 | 10.59 | 180 | 0.1265 | 0.3931 | 0.7861 | 0.7861 | nan | nan | 0.7861 | 0.0 | nan | 0.7861 |
| 0.1315 | 11.76 | 200 | 0.1023 | 0.3511 | 0.7022 | 0.7022 | nan | nan | 0.7022 | 0.0 | nan | 0.7022 |
| 0.109 | 12.94 | 220 | 0.1047 | 0.3986 | 0.7973 | 0.7973 | nan | nan | 0.7973 | 0.0 | nan | 0.7973 |
| 0.0985 | 14.12 | 240 | 0.0913 | 0.4042 | 0.8084 | 0.8084 | nan | nan | 0.8084 | 0.0 | nan | 0.8084 |
| 0.0711 | 15.29 | 260 | 0.0773 | 0.3192 | 0.6384 | 0.6384 | nan | nan | 0.6384 | 0.0 | nan | 0.6384 |
| 0.0636 | 16.47 | 280 | 0.0798 | 0.4138 | 0.8275 | 0.8275 | nan | nan | 0.8275 | 0.0 | nan | 0.8275 |
| 0.0619 | 17.65 | 300 | 0.0692 | 0.3770 | 0.7540 | 0.7540 | nan | nan | 0.7540 | 0.0 | nan | 0.7540 |
| 0.0573 | 18.82 | 320 | 0.0608 | 0.3386 | 0.6771 | 0.6771 | nan | nan | 0.6771 | 0.0 | nan | 0.6771 |
| 0.0579 | 20.0 | 340 | 0.0609 | 0.3882 | 0.7765 | 0.7765 | nan | nan | 0.7765 | 0.0 | nan | 0.7765 |
| 0.0505 | 21.18 | 360 | 0.0552 | 0.3748 | 0.7496 | 0.7496 | nan | nan | 0.7496 | 0.0 | nan | 0.7496 |
| 0.0514 | 22.35 | 380 | 0.0606 | 0.4208 | 0.8416 | 0.8416 | nan | nan | 0.8416 | 0.0 | nan | 0.8416 |
| 0.0475 | 23.53 | 400 | 0.0513 | 0.3796 | 0.7593 | 0.7593 | nan | nan | 0.7593 | 0.0 | nan | 0.7593 |
| 0.0442 | 24.71 | 420 | 0.0526 | 0.4185 | 0.8371 | 0.8371 | nan | nan | 0.8371 | 0.0 | nan | 0.8371 |
| 0.0408 | 25.88 | 440 | 0.0526 | 0.4044 | 0.8087 | 0.8087 | nan | nan | 0.8087 | 0.0 | nan | 0.8087 |
| 0.0337 | 27.06 | 460 | 0.0485 | 0.3932 | 0.7865 | 0.7865 | nan | nan | 0.7865 | 0.0 | nan | 0.7865 |
| 0.0384 | 28.24 | 480 | 0.0463 | 0.4049 | 0.8098 | 0.8098 | nan | nan | 0.8098 | 0.0 | nan | 0.8098 |
| 0.0469 | 29.41 | 500 | 0.0459 | 0.3687 | 0.7374 | 0.7374 | nan | nan | 0.7374 | 0.0 | nan | 0.7374 |
| 0.0305 | 30.59 | 520 | 0.0444 | 0.3610 | 0.7220 | 0.7220 | nan | nan | 0.7220 | 0.0 | nan | 0.7220 |
| 0.0364 | 31.76 | 540 | 0.0461 | 0.4147 | 0.8294 | 0.8294 | nan | nan | 0.8294 | 0.0 | nan | 0.8294 |
| 0.034 | 32.94 | 560 | 0.0434 | 0.3907 | 0.7813 | 0.7813 | nan | nan | 0.7813 | 0.0 | nan | 0.7813 |
| 0.0276 | 34.12 | 580 | 0.0431 | 0.3880 | 0.7759 | 0.7759 | nan | nan | 0.7759 | 0.0 | nan | 0.7759 |
| 0.0281 | 35.29 | 600 | 0.0424 | 0.3761 | 0.7522 | 0.7522 | nan | nan | 0.7522 | 0.0 | nan | 0.7522 |
| 0.0264 | 36.47 | 620 | 0.0438 | 0.4045 | 0.8090 | 0.8090 | nan | nan | 0.8090 | 0.0 | nan | 0.8090 |
| 0.0269 | 37.65 | 640 | 0.0430 | 0.4041 | 0.8082 | 0.8082 | nan | nan | 0.8082 | 0.0 | nan | 0.8082 |
| 0.0245 | 38.82 | 660 | 0.0409 | 0.3803 | 0.7607 | 0.7607 | nan | nan | 0.7607 | 0.0 | nan | 0.7607 |
| 0.0241 | 40.0 | 680 | 0.0436 | 0.4147 | 0.8295 | 0.8295 | nan | nan | 0.8295 | 0.0 | nan | 0.8295 |
| 0.027 | 41.18 | 700 | 0.0417 | 0.3901 | 0.7803 | 0.7803 | nan | nan | 0.7803 | 0.0 | nan | 0.7803 |
| 0.0227 | 42.35 | 720 | 0.0405 | 0.3914 | 0.7828 | 0.7828 | nan | nan | 0.7828 | 0.0 | nan | 0.7828 |
| 0.0269 | 43.53 | 740 | 0.0409 | 0.3907 | 0.7814 | 0.7814 | nan | nan | 0.7814 | 0.0 | nan | 0.7814 |
| 0.0223 | 44.71 | 760 | 0.0409 | 0.3938 | 0.7877 | 0.7877 | nan | nan | 0.7877 | 0.0 | nan | 0.7877 |
| 0.0268 | 45.88 | 780 | 0.0405 | 0.3888 | 0.7776 | 0.7776 | nan | nan | 0.7776 | 0.0 | nan | 0.7776 |
| 0.0228 | 47.06 | 800 | 0.0408 | 0.3908 | 0.7817 | 0.7817 | nan | nan | 0.7817 | 0.0 | nan | 0.7817 |
| 0.0218 | 48.24 | 820 | 0.0406 | 0.3868 | 0.7736 | 0.7736 | nan | nan | 0.7736 | 0.0 | nan | 0.7736 |
| 0.0221 | 49.41 | 840 | 0.0406 | 0.3913 | 0.7826 | 0.7826 | nan | nan | 0.7826 | 0.0 | nan | 0.7826 |
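
The columns above follow the output of a mean-IoU segmentation metric. As a sketch of how such numbers arise, and why `Iou Unlabeled` can be 0.0 while `Iou Tool` is `nan`, the following uses the `evaluate` library's `mean_iou` metric on toy masks; the label mapping (0 = unlabeled, 1 = tool, 2 = wear) is an assumption, since the card does not state it.

```python
import numpy as np
import evaluate

# "mean_iou" returns mean_iou, mean_accuracy, overall_accuracy and
# per-category arrays, matching the columns in the table above.
metric = evaluate.load("mean_iou")

# Toy 2x2 masks; label ids 0=unlabeled, 1=tool, 2=wear are assumed.
references = [np.array([[2, 2], [2, 2]])]   # ground truth: all "wear"
predictions = [np.array([[2, 2], [0, 2]])]  # one pixel predicted "unlabeled"

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=3,
    ignore_index=255,  # assumed; a common convention for void pixels
)

# per_category_iou: [0.0, nan, 0.75] -- "unlabeled" is predicted but never
# labeled (IoU 0.0), "tool" never occurs at all (nan), and mean_iou is the
# nan-ignoring average, mirroring the pattern in the table.
print(results["mean_iou"], results["per_category_iou"])
```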

### Framework versions

- Transformers 4.28.0
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.13.3