---
license: other
tags:
  - generated_from_trainer
model-index:
  - name: segformer-b0-finetuned-segments-toolwear
    results: []
---

# segformer-b0-finetuned-segments-toolwear

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset. It achieves the following results on the evaluation set (a minimal inference sketch follows the list):

- Loss: 0.1009
- Mean Iou: 0.2182
- Mean Accuracy: 0.4365
- Overall Accuracy: 0.4365
- Accuracy Unlabeled: nan
- Accuracy Tool: nan
- Accuracy Wear: 0.4365
- Iou Unlabeled: 0.0
- Iou Tool: nan
- Iou Wear: 0.4365
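
Since the card does not yet include usage instructions, here is a minimal inference sketch. The checkpoint id is an assumption inferred from this repo's name (adjust it if the actual path differs); the rest uses the standard `transformers` SegFormer classes:

```python
# Minimal inference sketch. The checkpoint id is an assumption inferred from
# this repo's name; swap in the actual path if it differs.
import torch
from PIL import Image
from transformers import SegformerForSemanticSegmentation, SegformerImageProcessor

checkpoint = "HorcruxNo13/segformer-b0-finetuned-segments-toolwear"  # assumed id
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("tool.png").convert("RGB")  # replace with your own image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) class-index mask
```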

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
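
The training script itself is not included in the card; the following is a minimal sketch of how these hyperparameters map onto `TrainingArguments` in the usual `Trainer` fine-tuning flow. The number of labels, the evaluation cadence, and the dataset variables are assumptions or placeholders, not the author's actual code:

```python
# Sketch only: maps the listed hyperparameters onto TrainingArguments.
# Dataset and metric wiring are placeholders, not the card author's code.
from transformers import (
    SegformerForSemanticSegmentation,
    Trainer,
    TrainingArguments,
)

model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",
    num_labels=3,  # assumption: unlabeled / tool / wear, per the metric names
)

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-toolwear",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",   # the Adam betas/epsilon above are the defaults
    evaluation_strategy="steps",  # assumption: the table evaluates every 20 steps
    eval_steps=20,
)

trainer = Trainer(
    model=model,
    args=training_args,
    train_dataset=train_ds,  # placeholder: the dataset is not described in the card
    eval_dataset=eval_ds,    # placeholder
)
trainer.train()
```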

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Tool | Accuracy Wear | Iou Unlabeled | Iou Tool | Iou Wear |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.8747 | 1.18 | 20 | 0.9764 | 0.1788 | 0.5363 | 0.5363 | nan | nan | 0.5363 | 0.0 | 0.0 | 0.5363 |
| 0.6206 | 2.35 | 40 | 0.6394 | 0.1860 | 0.3719 | 0.3719 | nan | nan | 0.3719 | 0.0 | nan | 0.3719 |
| 0.4963 | 3.53 | 60 | 0.4309 | 0.2230 | 0.4460 | 0.4460 | nan | nan | 0.4460 | 0.0 | nan | 0.4460 |
| 0.3978 | 4.71 | 80 | 0.3839 | 0.3231 | 0.6463 | 0.6463 | nan | nan | 0.6463 | 0.0 | nan | 0.6463 |
| 0.3171 | 5.88 | 100 | 0.3193 | 0.2653 | 0.5306 | 0.5306 | nan | nan | 0.5306 | 0.0 | nan | 0.5306 |
| 0.3046 | 7.06 | 120 | 0.2760 | 0.1372 | 0.2745 | 0.2745 | nan | nan | 0.2745 | 0.0 | nan | 0.2745 |
| 0.2558 | 8.24 | 140 | 0.2181 | 0.2549 | 0.5097 | 0.5097 | nan | nan | 0.5097 | 0.0 | nan | 0.5097 |
| 0.225 | 9.41 | 160 | 0.1933 | 0.2673 | 0.5345 | 0.5345 | nan | nan | 0.5345 | 0.0 | nan | 0.5345 |
| 0.1532 | 10.59 | 180 | 0.1735 | 0.2673 | 0.5346 | 0.5346 | nan | nan | 0.5346 | 0.0 | nan | 0.5346 |
| 0.1505 | 11.76 | 200 | 0.1660 | 0.1857 | 0.3715 | 0.3715 | nan | nan | 0.3715 | 0.0 | nan | 0.3715 |
| 0.1222 | 12.94 | 220 | 0.1641 | 0.1508 | 0.3016 | 0.3016 | nan | nan | 0.3016 | 0.0 | nan | 0.3016 |
| 0.0921 | 14.12 | 240 | 0.1363 | 0.2869 | 0.5738 | 0.5738 | nan | nan | 0.5738 | 0.0 | nan | 0.5738 |
| 0.0792 | 15.29 | 260 | 0.1300 | 0.2245 | 0.4491 | 0.4491 | nan | nan | 0.4491 | 0.0 | nan | 0.4491 |
| 0.0804 | 16.47 | 280 | 0.1338 | 0.1910 | 0.3820 | 0.3820 | nan | nan | 0.3820 | 0.0 | nan | 0.3820 |
| 0.0732 | 17.65 | 300 | 0.1118 | 0.2583 | 0.5166 | 0.5166 | nan | nan | 0.5166 | 0.0 | nan | 0.5166 |
| 0.062 | 18.82 | 320 | 0.1102 | 0.2432 | 0.4864 | 0.4864 | nan | nan | 0.4864 | 0.0 | nan | 0.4864 |
| 0.0582 | 20.0 | 340 | 0.1023 | 0.2547 | 0.5095 | 0.5095 | nan | nan | 0.5095 | 0.0 | nan | 0.5095 |
| 0.056 | 21.18 | 360 | 0.1151 | 0.2111 | 0.4222 | 0.4222 | nan | nan | 0.4222 | 0.0 | nan | 0.4222 |
| 0.0493 | 22.35 | 380 | 0.1126 | 0.2045 | 0.4089 | 0.4089 | nan | nan | 0.4089 | 0.0 | nan | 0.4089 |
| 0.0633 | 23.53 | 400 | 0.1065 | 0.2220 | 0.4440 | 0.4440 | nan | nan | 0.4440 | 0.0 | nan | 0.4440 |
| 0.0438 | 24.71 | 420 | 0.0987 | 0.2558 | 0.5116 | 0.5116 | nan | nan | 0.5116 | 0.0 | nan | 0.5116 |
| 0.0451 | 25.88 | 440 | 0.1060 | 0.2326 | 0.4652 | 0.4652 | nan | nan | 0.4652 | 0.0 | nan | 0.4652 |
| 0.0426 | 27.06 | 460 | 0.0981 | 0.2493 | 0.4986 | 0.4986 | nan | nan | 0.4986 | 0.0 | nan | 0.4986 |
| 0.0397 | 28.24 | 480 | 0.0955 | 0.2485 | 0.4970 | 0.4970 | nan | nan | 0.4970 | 0.0 | nan | 0.4970 |
| 0.0349 | 29.41 | 500 | 0.0991 | 0.2321 | 0.4641 | 0.4641 | nan | nan | 0.4641 | 0.0 | nan | 0.4641 |
| 0.0337 | 30.59 | 520 | 0.1048 | 0.2111 | 0.4222 | 0.4222 | nan | nan | 0.4222 | 0.0 | nan | 0.4222 |
| 0.0358 | 31.76 | 540 | 0.0870 | 0.2856 | 0.5712 | 0.5712 | nan | nan | 0.5712 | 0.0 | nan | 0.5712 |
| 0.0322 | 32.94 | 560 | 0.1061 | 0.2085 | 0.4170 | 0.4170 | nan | nan | 0.4170 | 0.0 | nan | 0.4170 |
| 0.028 | 34.12 | 580 | 0.0950 | 0.2399 | 0.4798 | 0.4798 | nan | nan | 0.4798 | 0.0 | nan | 0.4798 |
| 0.0282 | 35.29 | 600 | 0.0880 | 0.2667 | 0.5335 | 0.5335 | nan | nan | 0.5335 | 0.0 | nan | 0.5335 |
| 0.0266 | 36.47 | 620 | 0.0952 | 0.2457 | 0.4914 | 0.4914 | nan | nan | 0.4914 | 0.0 | nan | 0.4914 |
| 0.0276 | 37.65 | 640 | 0.0994 | 0.2329 | 0.4658 | 0.4658 | nan | nan | 0.4658 | 0.0 | nan | 0.4658 |
| 0.0306 | 38.82 | 660 | 0.0978 | 0.2314 | 0.4627 | 0.4627 | nan | nan | 0.4627 | 0.0 | nan | 0.4627 |
| 0.0337 | 40.0 | 680 | 0.0949 | 0.2404 | 0.4809 | 0.4809 | nan | nan | 0.4809 | 0.0 | nan | 0.4809 |
| 0.0243 | 41.18 | 700 | 0.0948 | 0.2382 | 0.4765 | 0.4765 | nan | nan | 0.4765 | 0.0 | nan | 0.4765 |
| 0.0278 | 42.35 | 720 | 0.0978 | 0.2328 | 0.4655 | 0.4655 | nan | nan | 0.4655 | 0.0 | nan | 0.4655 |
| 0.0317 | 43.53 | 740 | 0.0975 | 0.2337 | 0.4675 | 0.4675 | nan | nan | 0.4675 | 0.0 | nan | 0.4675 |
| 0.0321 | 44.71 | 760 | 0.0981 | 0.2331 | 0.4663 | 0.4663 | nan | nan | 0.4663 | 0.0 | nan | 0.4663 |
| 0.0318 | 45.88 | 780 | 0.0955 | 0.2374 | 0.4748 | 0.4748 | nan | nan | 0.4748 | 0.0 | nan | 0.4748 |
| 0.0268 | 47.06 | 800 | 0.0963 | 0.2358 | 0.4715 | 0.4715 | nan | nan | 0.4715 | 0.0 | nan | 0.4715 |
| 0.0268 | 48.24 | 820 | 0.1001 | 0.2229 | 0.4459 | 0.4459 | nan | nan | 0.4459 | 0.0 | nan | 0.4459 |
| 0.0314 | 49.41 | 840 | 0.1009 | 0.2182 | 0.4365 | 0.4365 | nan | nan | 0.4365 | 0.0 | nan | 0.4365 |
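
The per-category columns are consistent with the `evaluate` library's `mean_iou` metric, which the standard SegFormer fine-tuning examples use; this is an assumption about this card's setup, since the training code is not shown. Classes absent from both prediction and reference come out as `nan`, which would explain the nan entries above. A tiny sketch with dummy masks:

```python
# Sketch: how per-category IoU/accuracy columns like these are typically
# produced with the `evaluate` library's mean_iou metric (assumed setup).
import evaluate
import numpy as np

metric = evaluate.load("mean_iou")

pred = np.zeros((4, 4), dtype=np.int64)   # dummy predicted class mask
label = np.zeros((4, 4), dtype=np.int64)  # dummy ground-truth mask

results = metric.compute(
    predictions=[pred],
    references=[label],
    num_labels=3,       # assumption: unlabeled / tool / wear
    ignore_index=255,
    reduce_labels=False,
)
print(results["mean_iou"], results["per_category_iou"])  # nan for absent classes
```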

### Framework versions

- Transformers 4.28.0
- Pytorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.13.3