---
license: other
tags:
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-segments-toolwear
  results: []
---

# segformer-b0-finetuned-segments-toolwear

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.0354
- Mean Iou: 0.3022
- Mean Accuracy: 0.6045
- Overall Accuracy: 0.6045
- Accuracy Unlabeled: nan
- Accuracy Tool: nan
- Accuracy Wear: 0.6045
- Iou Unlabeled: 0.0
- Iou Tool: nan
- Iou Wear: 0.6045

## Model description

More information needed

## Intended uses & limitations

More information needed
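
Pending details from the author, below is a minimal inference sketch (an assumption, not code from the original card) that loads the checkpoint through the standard `transformers` semantic-segmentation API. The repo id and the input file name are placeholders inferred from the model name.

```python
# Minimal inference sketch; the repo id and image path are assumed placeholders.
from PIL import Image
import torch
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

repo_id = "HorcruxNo13/segformer-b0-finetuned-segments-toolwear"  # assumed hub id
processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)
model.eval()

image = Image.open("tool.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class ids
```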

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (mirrored in the sketch after this list):

- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
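
Since the card is tagged `generated_from_trainer`, the run was presumably driven by the Hugging Face `Trainer`. A hedged `TrainingArguments` sketch mapping the values above onto that API (the output directory and the steps-based evaluation schedule are assumptions; the 20-step cadence is read off the results table below):

```python
# Sketch of the hyperparameters above as TrainingArguments; not the author's script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-toolwear",  # assumed
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default optimizer.
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="steps",  # assumed: the table below logs eval every 20 steps
    eval_steps=20,
    logging_steps=20,             # assumed to match the eval cadence
)
```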

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Tool | Accuracy Wear | Iou Unlabeled | Iou Tool | Iou Wear |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-------------:|:-------------:|:-------------:|:--------:|:--------:|
| 0.8671        | 1.18  | 20   | 0.9263          | 0.4061   | 0.8122        | 0.8122           | nan                | nan           | 0.8122        | 0.0           | nan      | 0.8122   |
| 0.5691        | 2.35  | 40   | 0.5998          | 0.2895   | 0.5790        | 0.5790           | nan                | nan           | 0.5790        | 0.0           | nan      | 0.5790   |
| 0.4378        | 3.53  | 60   | 0.3948          | 0.3106   | 0.6213        | 0.6213           | nan                | nan           | 0.6213        | 0.0           | nan      | 0.6213   |
| 0.3842        | 4.71  | 80   | 0.3190          | 0.2679   | 0.5357        | 0.5357           | nan                | nan           | 0.5357        | 0.0           | nan      | 0.5357   |
| 0.3234        | 5.88  | 100  | 0.2883          | 0.3574   | 0.7148        | 0.7148           | nan                | nan           | 0.7148        | 0.0           | nan      | 0.7148   |
| 0.2731        | 7.06  | 120  | 0.2392          | 0.3456   | 0.6911        | 0.6911           | nan                | nan           | 0.6911        | 0.0           | nan      | 0.6911   |
| 0.2137        | 8.24  | 140  | 0.1850          | 0.1844   | 0.3688        | 0.3688           | nan                | nan           | 0.3688        | 0.0           | nan      | 0.3688   |
| 0.1798        | 9.41  | 160  | 0.1692          | 0.2757   | 0.5515        | 0.5515           | nan                | nan           | 0.5515        | 0.0           | nan      | 0.5515   |
| 0.1607        | 10.59 | 180  | 0.1338          | 0.2978   | 0.5956        | 0.5956           | nan                | nan           | 0.5956        | 0.0           | nan      | 0.5956   |
| 0.1399        | 11.76 | 200  | 0.1218          | 0.2906   | 0.5811        | 0.5811           | nan                | nan           | 0.5811        | 0.0           | nan      | 0.5811   |
| 0.1173        | 12.94 | 220  | 0.1030          | 0.2612   | 0.5224        | 0.5224           | nan                | nan           | 0.5224        | 0.0           | nan      | 0.5224   |
| 0.0922        | 14.12 | 240  | 0.0976          | 0.2817   | 0.5633        | 0.5633           | nan                | nan           | 0.5633        | 0.0           | nan      | 0.5633   |
| 0.081         | 15.29 | 260  | 0.0795          | 0.3154   | 0.6308        | 0.6308           | nan                | nan           | 0.6308        | 0.0           | nan      | 0.6308   |
| 0.0852        | 16.47 | 280  | 0.0716          | 0.2188   | 0.4377        | 0.4377           | nan                | nan           | 0.4377        | 0.0           | nan      | 0.4377   |
| 0.0709        | 17.65 | 300  | 0.0680          | 0.2691   | 0.5382        | 0.5382           | nan                | nan           | 0.5382        | 0.0           | nan      | 0.5382   |
| 0.073         | 18.82 | 320  | 0.0611          | 0.2830   | 0.5660        | 0.5660           | nan                | nan           | 0.5660        | 0.0           | nan      | 0.5660   |
| 0.0602        | 20.0  | 340  | 0.0592          | 0.2829   | 0.5657        | 0.5657           | nan                | nan           | 0.5657        | 0.0           | nan      | 0.5657   |
| 0.0547        | 21.18 | 360  | 0.0577          | 0.2842   | 0.5684        | 0.5684           | nan                | nan           | 0.5684        | 0.0           | nan      | 0.5684   |
| 0.0554        | 22.35 | 380  | 0.0537          | 0.2613   | 0.5226        | 0.5226           | nan                | nan           | 0.5226        | 0.0           | nan      | 0.5226   |
| 0.0515        | 23.53 | 400  | 0.0523          | 0.3076   | 0.6152        | 0.6152           | nan                | nan           | 0.6152        | 0.0           | nan      | 0.6152   |
| 0.0444        | 24.71 | 420  | 0.0487          | 0.3063   | 0.6126        | 0.6126           | nan                | nan           | 0.6126        | 0.0           | nan      | 0.6126   |
| 0.088         | 25.88 | 440  | 0.0467          | 0.3041   | 0.6082        | 0.6082           | nan                | nan           | 0.6082        | 0.0           | nan      | 0.6082   |
| 0.0472        | 27.06 | 460  | 0.0437          | 0.2623   | 0.5245        | 0.5245           | nan                | nan           | 0.5245        | 0.0           | nan      | 0.5245   |
| 0.0396        | 28.24 | 480  | 0.0474          | 0.3352   | 0.6704        | 0.6704           | nan                | nan           | 0.6704        | 0.0           | nan      | 0.6704   |
| 0.0351        | 29.41 | 500  | 0.0436          | 0.3060   | 0.6120        | 0.6120           | nan                | nan           | 0.6120        | 0.0           | nan      | 0.6120   |
| 0.0392        | 30.59 | 520  | 0.0428          | 0.2975   | 0.5951        | 0.5951           | nan                | nan           | 0.5951        | 0.0           | nan      | 0.5951   |
| 0.0317        | 31.76 | 540  | 0.0431          | 0.3253   | 0.6507        | 0.6507           | nan                | nan           | 0.6507        | 0.0           | nan      | 0.6507   |
| 0.0391        | 32.94 | 560  | 0.0404          | 0.2863   | 0.5726        | 0.5726           | nan                | nan           | 0.5726        | 0.0           | nan      | 0.5726   |
| 0.0309        | 34.12 | 580  | 0.0408          | 0.3215   | 0.6429        | 0.6429           | nan                | nan           | 0.6429        | 0.0           | nan      | 0.6429   |
| 0.0493        | 35.29 | 600  | 0.0381          | 0.2581   | 0.5162        | 0.5162           | nan                | nan           | 0.5162        | 0.0           | nan      | 0.5162   |
| 0.0321        | 36.47 | 620  | 0.0376          | 0.3147   | 0.6293        | 0.6293           | nan                | nan           | 0.6293        | 0.0           | nan      | 0.6293   |
| 0.0333        | 37.65 | 640  | 0.0372          | 0.3118   | 0.6236        | 0.6236           | nan                | nan           | 0.6236        | 0.0           | nan      | 0.6236   |
| 0.0295        | 38.82 | 660  | 0.0362          | 0.3036   | 0.6072        | 0.6072           | nan                | nan           | 0.6072        | 0.0           | nan      | 0.6072   |
| 0.0302        | 40.0  | 680  | 0.0365          | 0.3157   | 0.6314        | 0.6314           | nan                | nan           | 0.6314        | 0.0           | nan      | 0.6314   |
| 0.0272        | 41.18 | 700  | 0.0367          | 0.3012   | 0.6024        | 0.6024           | nan                | nan           | 0.6024        | 0.0           | nan      | 0.6024   |
| 0.0278        | 42.35 | 720  | 0.0353          | 0.2935   | 0.5870        | 0.5870           | nan                | nan           | 0.5870        | 0.0           | nan      | 0.5870   |
| 0.0283        | 43.53 | 740  | 0.0353          | 0.2970   | 0.5940        | 0.5940           | nan                | nan           | 0.5940        | 0.0           | nan      | 0.5940   |
| 0.0256        | 44.71 | 760  | 0.0355          | 0.3090   | 0.6181        | 0.6181           | nan                | nan           | 0.6181        | 0.0           | nan      | 0.6181   |
| 0.0365        | 45.88 | 780  | 0.0358          | 0.3008   | 0.6015        | 0.6015           | nan                | nan           | 0.6015        | 0.0           | nan      | 0.6015   |
| 0.0299        | 48.24 | 820  | 0.0361          | 0.3109   | 0.6219        | 0.6219           | nan                | nan           | 0.6219        | 0.0           | nan      | 0.6219   |
| 0.025         | 47.06 | 800  | 0.0353          | 0.2965   | 0.5930        | 0.5930           | nan                | nan           | 0.5930        | 0.0           | nan      | 0.5930   |
| 0.0239        | 49.41 | 840  | 0.0354          | 0.3022   | 0.6045        | 0.6045           | nan                | nan           | 0.6045        | 0.0           | nan      | 0.6045   |
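
The per-category columns above (accuracy and IoU per class, with `nan` for a class absent from both predictions and references) match the output format of the `mean_iou` metric from the `evaluate` library, which SegFormer fine-tuning scripts of this vintage typically use. A minimal sketch under that assumption, with toy masks and assumed label ids (0 = unlabeled, 1 = tool, 2 = wear):

```python
# Hedged sketch of computing the table's metrics with evaluate's mean_iou.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy 2x2 masks using assumed ids: 0 = unlabeled, 1 = tool, 2 = wear.
predictions = [np.array([[2, 2], [0, 2]])]
references = [np.array([[2, 2], [2, 2]])]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=3,      # assumed label count: unlabeled, tool, wear
    ignore_index=255,
)
# per_category_iou shows the same pattern as the table: 0.0 for a class that is
# only predicted, nan for a class absent everywhere, and a real IoU for "wear".
print(results["mean_iou"], results["per_category_iou"])
```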

### Framework versions

- Transformers 4.28.0
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.13.3