---
license: other
tags:
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-segments-toolwear
  results: []
---
# segformer-b0-finetuned-segments-toolwear

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co./nvidia/mit-b0) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0360
- Mean Iou: 0.3724
- Mean Accuracy: 0.7448
- Overall Accuracy: 0.7448
- Accuracy Unlabeled: nan
- Accuracy Tool: nan
- Accuracy Wear: 0.7448
- Iou Unlabeled: 0.0
- Iou Tool: nan
- Iou Wear: 0.7448

The `nan` entries are per-class scores that could not be computed, typically because the class never occurs in the evaluation annotations; `Iou Unlabeled: 0.0` indicates a class that was predicted but never appears in the reference masks.

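The sketch below shows how such per-class values arise, assuming the `evaluate` library's `mean_iou` metric and a label layout of 0 = unlabeled, 1 = tool, 2 = wear inferred from the metric names above; the toy arrays are illustrative stand-ins, not real model outputs.

```python
# Hedged sketch of the metric computation; label ids are assumptions.
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy 4x4 maps standing in for a predicted and a ground-truth segmentation.
prediction = np.array([
    [2, 2, 0, 0],
    [2, 2, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 0],
])
reference = np.full((4, 4), 2)  # every pixel labeled "wear"

results = metric.compute(
    predictions=[prediction],
    references=[reference],
    num_labels=3,        # unlabeled, tool, wear
    ignore_index=255,
    reduce_labels=False,
)

# "tool" occurs in neither map, so its scores come back as nan; "unlabeled"
# is predicted but absent from the reference, so its IoU is 0.0 -- the same
# pattern as the evaluation results above.
print(results["mean_iou"], results["per_category_iou"])
```
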
## Model description

A SegFormer semantic-segmentation model with an MiT-b0 backbone, fine-tuned to segment tool wear. The evaluation metrics above suggest a three-class label set (`unlabeled`, `tool`, `wear`). Further details are not recorded; more information needed.

## Intended uses & limitations

More information needed. A minimal inference sketch is given below.

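The following is a sketch only, not a confirmed usage recipe; the Hub repo id and the image path are placeholders, since the card does not record where this checkpoint is hosted.

```python
# Minimal inference sketch; the repo id and image path are placeholders.
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

checkpoint = "your-namespace/segformer-b0-finetuned-segments-toolwear"  # placeholder
processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("tool_image.png").convert("RGB")  # placeholder path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 of the input resolution; upsample before argmax.
upsampled = torch.nn.functional.interpolate(
    logits,
    size=image.size[::-1],  # PIL gives (W, H); interpolate expects (H, W)
    mode="bilinear",
    align_corners=False,
)
pred_seg = upsampled.argmax(dim=1)[0]  # (H, W) map of predicted class ids
```
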
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50

A sketch of how these settings map onto the `Trainer` API is shown below.

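This is a hedged reconstruction, not the original training script: the label mapping, `output_dir`, and dataset objects are assumptions or placeholders, and only the numeric hyperparameters are taken from the list above.

```python
# Hedged reconstruction of the training setup; id2label, output_dir, and
# the datasets are assumptions, while the numeric values mirror the card.
from transformers import (
    SegformerForSemanticSegmentation,
    Trainer,
    TrainingArguments,
)

id2label = {0: "unlabeled", 1: "tool", 2: "wear"}  # assumed label set
model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",
    num_labels=len(id2label),
    id2label=id2label,
    label2id={name: idx for idx, name in id2label.items()},
)

training_args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-toolwear",  # placeholder
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    adam_beta1=0.9,    # betas and epsilon exactly as listed above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

# The train/eval datasets are the unknown datasets mentioned in the card:
# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```
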
### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Tool | Accuracy Wear | Iou Unlabeled | Iou Tool | Iou Wear |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-------------:|:-------------:|:-------------:|:--------:|:--------:|
| 0.9316        | 1.18  | 20   | 0.9877          | 0.4510   | 0.9020        | 0.9020           | nan                | nan           | 0.9020        | 0.0           | nan      | 0.9020   |
| 0.6902        | 2.35  | 40   | 0.6556          | 0.3765   | 0.7531        | 0.7531           | nan                | nan           | 0.7531        | 0.0           | nan      | 0.7531   |
| 0.532         | 3.53  | 60   | 0.4585          | 0.3435   | 0.6871        | 0.6871           | nan                | nan           | 0.6871        | 0.0           | nan      | 0.6871   |
| 0.4296        | 4.71  | 80   | 0.3832          | 0.3900   | 0.7799        | 0.7799           | nan                | nan           | 0.7799        | 0.0           | nan      | 0.7799   |
| 0.3318        | 5.88  | 100  | 0.3255          | 0.3739   | 0.7478        | 0.7478           | nan                | nan           | 0.7478        | 0.0           | nan      | 0.7478   |
| 0.281         | 7.06  | 120  | 0.2480          | 0.3109   | 0.6219        | 0.6219           | nan                | nan           | 0.6219        | 0.0           | nan      | 0.6219   |
| 0.2405        | 8.24  | 140  | 0.2410          | 0.2029   | 0.4059        | 0.4059           | nan                | nan           | 0.4059        | 0.0           | nan      | 0.4059   |
| 0.1945        | 9.41  | 160  | 0.1969          | 0.3366   | 0.6733        | 0.6733           | nan                | nan           | 0.6733        | 0.0           | nan      | 0.6733   |
| 0.1612        | 10.59 | 180  | 0.1776          | 0.3469   | 0.6938        | 0.6938           | nan                | nan           | 0.6938        | 0.0           | nan      | 0.6938   |
| 0.1653        | 11.76 | 200  | 0.1455          | 0.3758   | 0.7515        | 0.7515           | nan                | nan           | 0.7515        | 0.0           | nan      | 0.7515   |
| 0.1562        | 12.94 | 220  | 0.1330          | 0.2652   | 0.5304        | 0.5304           | nan                | nan           | 0.5304        | 0.0           | nan      | 0.5304   |
| 0.1053        | 14.12 | 240  | 0.1145          | 0.3511   | 0.7022        | 0.7022           | nan                | nan           | 0.7022        | 0.0           | nan      | 0.7022   |
| 0.1017        | 15.29 | 260  | 0.0989          | 0.3879   | 0.7757        | 0.7757           | nan                | nan           | 0.7757        | 0.0           | nan      | 0.7757   |
| 0.0809        | 16.47 | 280  | 0.0859          | 0.2622   | 0.5243        | 0.5243           | nan                | nan           | 0.5243        | 0.0           | nan      | 0.5243   |
| 0.0861        | 17.65 | 300  | 0.0761          | 0.3688   | 0.7375        | 0.7375           | nan                | nan           | 0.7375        | 0.0           | nan      | 0.7375   |
| 0.0695        | 18.82 | 320  | 0.0720          | 0.3786   | 0.7572        | 0.7572           | nan                | nan           | 0.7572        | 0.0           | nan      | 0.7572   |
| 0.0689        | 20.0  | 340  | 0.0646          | 0.3964   | 0.7927        | 0.7927           | nan                | nan           | 0.7927        | 0.0           | nan      | 0.7927   |
| 0.0592        | 21.18 | 360  | 0.0657          | 0.3063   | 0.6126        | 0.6126           | nan                | nan           | 0.6126        | 0.0           | nan      | 0.6126   |
| 0.0635        | 22.35 | 380  | 0.0581          | 0.3615   | 0.7230        | 0.7230           | nan                | nan           | 0.7230        | 0.0           | nan      | 0.7230   |
| 0.0511        | 23.53 | 400  | 0.0526          | 0.3622   | 0.7245        | 0.7245           | nan                | nan           | 0.7245        | 0.0           | nan      | 0.7245   |
| 0.0518        | 24.71 | 420  | 0.0543          | 0.3270   | 0.6540        | 0.6540           | nan                | nan           | 0.6540        | 0.0           | nan      | 0.6540   |
| 0.0448        | 25.88 | 440  | 0.0522          | 0.4141   | 0.8282        | 0.8282           | nan                | nan           | 0.8282        | 0.0           | nan      | 0.8282   |
| 0.0395        | 27.06 | 460  | 0.0470          | 0.3519   | 0.7038        | 0.7038           | nan                | nan           | 0.7038        | 0.0           | nan      | 0.7038   |
| 0.04          | 28.24 | 480  | 0.0452          | 0.3870   | 0.7740        | 0.7740           | nan                | nan           | 0.7740        | 0.0           | nan      | 0.7740   |
| 0.0386        | 29.41 | 500  | 0.0439          | 0.3801   | 0.7603        | 0.7603           | nan                | nan           | 0.7603        | 0.0           | nan      | 0.7603   |
| 0.0421        | 30.59 | 520  | 0.0437          | 0.4047   | 0.8093        | 0.8093           | nan                | nan           | 0.8093        | 0.0           | nan      | 0.8093   |
| 0.0356        | 31.76 | 540  | 0.0427          | 0.3675   | 0.7349        | 0.7349           | nan                | nan           | 0.7349        | 0.0           | nan      | 0.7349   |
| 0.0368        | 32.94 | 560  | 0.0420          | 0.3604   | 0.7208        | 0.7208           | nan                | nan           | 0.7208        | 0.0           | nan      | 0.7208   |
| 0.0368        | 34.12 | 580  | 0.0408          | 0.3589   | 0.7179        | 0.7179           | nan                | nan           | 0.7179        | 0.0           | nan      | 0.7179   |
| 0.032         | 35.29 | 600  | 0.0395          | 0.3664   | 0.7329        | 0.7329           | nan                | nan           | 0.7329        | 0.0           | nan      | 0.7329   |
| 0.03          | 36.47 | 620  | 0.0394          | 0.3691   | 0.7382        | 0.7382           | nan                | nan           | 0.7382        | 0.0           | nan      | 0.7382   |
| 0.028         | 37.65 | 640  | 0.0383          | 0.3731   | 0.7462        | 0.7462           | nan                | nan           | 0.7462        | 0.0           | nan      | 0.7462   |
| 0.0304        | 38.82 | 660  | 0.0376          | 0.3796   | 0.7592        | 0.7592           | nan                | nan           | 0.7592        | 0.0           | nan      | 0.7592   |
| 0.0314        | 40.0  | 680  | 0.0382          | 0.3602   | 0.7204        | 0.7204           | nan                | nan           | 0.7204        | 0.0           | nan      | 0.7204   |
| 0.0266        | 41.18 | 700  | 0.0385          | 0.3602   | 0.7203        | 0.7203           | nan                | nan           | 0.7203        | 0.0           | nan      | 0.7203   |
| 0.0305        | 42.35 | 720  | 0.0375          | 0.3413   | 0.6827        | 0.6827           | nan                | nan           | 0.6827        | 0.0           | nan      | 0.6827   |
| 0.0334        | 43.53 | 740  | 0.0366          | 0.3632   | 0.7263        | 0.7263           | nan                | nan           | 0.7263        | 0.0           | nan      | 0.7263   |
| 0.0269        | 44.71 | 760  | 0.0359          | 0.3698   | 0.7396        | 0.7396           | nan                | nan           | 0.7396        | 0.0           | nan      | 0.7396   |
| 0.0352        | 45.88 | 780  | 0.0364          | 0.3679   | 0.7359        | 0.7359           | nan                | nan           | 0.7359        | 0.0           | nan      | 0.7359   |
| 0.0398        | 47.06 | 800  | 0.0366          | 0.3504   | 0.7008        | 0.7008           | nan                | nan           | 0.7008        | 0.0           | nan      | 0.7008   |
| 0.0261        | 48.24 | 820  | 0.0361          | 0.3789   | 0.7578        | 0.7578           | nan                | nan           | 0.7578        | 0.0           | nan      | 0.7578   |
| 0.0252        | 49.41 | 840  | 0.0360          | 0.3724   | 0.7448        | 0.7448           | nan                | nan           | 0.7448        | 0.0           | nan      | 0.7448   |

### Framework versions

- Transformers 4.28.0
- PyTorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.13.3