---
license: other
tags:
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-segments-toolwear
  results: []
---

# segformer-b0-finetuned-segments-toolwear

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co./nvidia/mit-b0) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0622
- Mean Iou: 0.3898
- Mean Accuracy: 0.7796
- Overall Accuracy: 0.7796
- Accuracy Unlabeled: nan
- Accuracy Tool: nan
- Accuracy Wear: 0.7796
- Iou Unlabeled: 0.0
- Iou Tool: nan
- Iou Wear: 0.7796

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Tool | Accuracy Wear | Iou Unlabeled | Iou Tool | Iou Wear |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-------------:|:-------------:|:-------------:|:--------:|:--------:|
| 0.862         | 1.82  | 20   | 0.9317          | 0.4701   | 0.9401        | 0.9401           | nan                | nan           | 0.9401        | 0.0           | nan      | 0.9401   |
| 0.6008        | 3.64  | 40   | 0.6036          | 0.4683   | 0.9365        | 0.9365           | nan                | nan           | 0.9365        | 0.0           | nan      | 0.9365   |
| 0.5219        | 5.45  | 60   | 0.4657          | 0.4618   | 0.9236        | 0.9236           | nan                | nan           | 0.9236        | 0.0           | nan      | 0.9236   |
| 0.3939        | 7.27  | 80   | 0.4035          | 0.4462   | 0.8924        | 0.8924           | nan                | nan           | 0.8924        | 0.0           | nan      | 0.8924   |
| 0.3919        | 9.09  | 100  | 0.3229          | 0.3580   | 0.7159        | 0.7159           | nan                | nan           | 0.7159        | 0.0           | nan      | 0.7159   |
| 0.3267        | 10.91 | 120  | 0.2979          | 0.4378   | 0.8756        | 0.8756           | nan                | nan           | 0.8756        | 0.0           | nan      | 0.8756   |
| 0.2791        | 12.73 | 140  | 0.2356          | 0.3858   | 0.7715        | 0.7715           | nan                | nan           | 0.7715        | 0.0           | nan      | 0.7715   |
| 0.2466        | 14.55 | 160  | 0.2299          | 0.4279   | 0.8558        | 0.8558           | nan                | nan           | 0.8558        | 0.0           | nan      | 0.8558   |
| 0.1883        | 16.36 | 180  | 0.1804          | 0.4138   | 0.8276        | 0.8276           | nan                | nan           | 0.8276        | 0.0           | nan      | 0.8276   |
| 0.1745        | 18.18 | 200  | 0.1772          | 0.4006   | 0.8011        | 0.8011           | nan                | nan           | 0.8011        | 0.0           | nan      | 0.8011   |
| 0.144         | 20.0  | 220  | 0.1425          | 0.3985   | 0.7970        | 0.7970           | nan                | nan           | 0.7970        | 0.0           | nan      | 0.7970   |
| 0.1901        | 21.82 | 240  | 0.1243          | 0.3619   | 0.7239        | 0.7239           | nan                | nan           | 0.7239        | 0.0           | nan      | 0.7239   |
| 0.1248        | 23.64 | 260  | 0.1197          | 0.3573   | 0.7146        | 0.7146           | nan                | nan           | 0.7146        | 0.0           | nan      | 0.7146   |
| 0.1245        | 25.45 | 280  | 0.1059          | 0.3985   | 0.7970        | 0.7970           | nan                | nan           | 0.7970        | 0.0           | nan      | 0.7970   |
| 0.1189        | 27.27 | 300  | 0.0990          | 0.4031   | 0.8063        | 0.8063           | nan                | nan           | 0.8063        | 0.0           | nan      | 0.8063   |
| 0.0985        | 29.09 | 320  | 0.0915          | 0.4186   | 0.8371        | 0.8371           | nan                | nan           | 0.8371        | 0.0           | nan      | 0.8371   |
| 0.0884        | 30.91 | 340  | 0.0839          | 0.3677   | 0.7354        | 0.7354           | nan                | nan           | 0.7354        | 0.0           | nan      | 0.7354   |
| 0.0797        | 32.73 | 360  | 0.0813          | 0.3796   | 0.7592        | 0.7592           | nan                | nan           | 0.7592        | 0.0           | nan      | 0.7592   |
| 0.077         | 34.55 | 380  | 0.0748          | 0.3965   | 0.7931        | 0.7931           | nan                | nan           | 0.7931        | 0.0           | nan      | 0.7931   |
| 0.0735        | 36.36 | 400  | 0.0739          | 0.3880   | 0.7760        | 0.7760           | nan                | nan           | 0.7760        | 0.0           | nan      | 0.7760   |
| 0.072         | 38.18 | 420  | 0.0725          | 0.3980   | 0.7959        | 0.7959           | nan                | nan           | 0.7959        | 0.0           | nan      | 0.7959   |
| 0.0744        | 40.0  | 440  | 0.0672          | 0.3942   | 0.7884        | 0.7884           | nan                | nan           | 0.7884        | 0.0           | nan      | 0.7884   |
| 0.0602        | 41.82 | 460  | 0.0652          | 0.4077   | 0.8154        | 0.8154           | nan                | nan           | 0.8154        | 0.0           | nan      | 0.8154   |
| 0.0632        | 43.64 | 480  | 0.0660          | 0.3855   | 0.7711        | 0.7711           | nan                | nan           | 0.7711        | 0.0           | nan      | 0.7711   |
| 0.0768        | 45.45 | 500  | 0.0629          | 0.3911   | 0.7821        | 0.7821           | nan                | nan           | 0.7821        | 0.0           | nan      | 0.7821   |
| 0.0564        | 47.27 | 520  | 0.0619          | 0.3764   | 0.7529        | 0.7529           | nan                | nan           | 0.7529        | 0.0           | nan      | 0.7529   |
| 0.06          | 49.09 | 540  | 0.0622          | 0.3898   | 0.7796        | 0.7796           | nan                | nan           | 0.7796        | 0.0           | nan      | 0.7796   |

### Framework versions

- Transformers 4.28.0
- Pytorch 2.1.0+cu118
- Datasets 2.15.0
- Tokenizers 0.13.3
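### Usage sketch

The card above does not include a usage snippet, so here is a minimal, hedged sketch of the input/output contract of a SegFormer (MiT-B0) semantic-segmentation model like this one. In practice you would call `SegformerForSemanticSegmentation.from_pretrained(...)` with the actual checkpoint path (not stated in this card); below, an untrained model of the same size is built locally from a default `SegformerConfig`, and the choice of 3 labels (unlabeled, tool, wear) is an assumption inferred from the metric names in the evaluation results.

```python
import torch
from transformers import SegformerConfig, SegformerForSemanticSegmentation

# In practice, load the fine-tuned checkpoint instead, e.g.:
#   model = SegformerForSemanticSegmentation.from_pretrained("<namespace>/segformer-b0-finetuned-segments-toolwear")
# Here we build an untrained MiT-B0-sized model locally to show shapes only.
# Assumption: 3 label ids (unlabeled, tool, wear), inferred from the metrics above.
config = SegformerConfig(num_labels=3)  # defaults match the MiT-B0 backbone
model = SegformerForSemanticSegmentation(config)
model.eval()

pixel_values = torch.randn(1, 3, 512, 512)  # one normalized RGB image
with torch.no_grad():
    logits = model(pixel_values=pixel_values).logits

# SegFormer predicts logits at 1/4 of the input resolution (here 128x128);
# upsample back to the input size before taking the per-pixel argmax.
masks = torch.nn.functional.interpolate(
    logits, size=(512, 512), mode="bilinear", align_corners=False
).argmax(dim=1)
```

With the real checkpoint, `SegformerImageProcessor` (from the same Transformers release listed under framework versions) would handle the resizing and normalization that `torch.randn` stands in for here.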
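### Note on the reported Mean Iou

The Mean Iou values above are consistently half of Iou Wear because the metric averages over classes with a defined IoU: the "tool" class is `nan` (absent from both prediction and ground truth) and is skipped, while "unlabeled" contributes 0.0. A minimal sketch of that nan-ignoring average, using hypothetical helper names:

```python
import math

def mean_iou(per_class_iou):
    """Average IoU over classes, skipping nan entries (classes absent
    from both the predictions and the ground truth)."""
    valid = [v for v in per_class_iou.values() if not math.isnan(v)]
    return sum(valid) / len(valid)

# Final evaluation row of this card: (0.0 + 0.7796) / 2 = 0.3898
final = mean_iou({"unlabeled": 0.0, "tool": float("nan"), "wear": 0.7796})
```

This also shows why the reported Mean Iou understates wear-segmentation quality here: the all-zero "unlabeled" IoU drags the average down even though the wear class itself reaches 0.7796.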