---
license: other
tags:
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-segments-toolwear
  results: []
---

# segformer-b0-finetuned-segments-toolwear

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co./nvidia/mit-b0) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3547
- Mean Iou: 0.3725
- Mean Accuracy: 0.7265
- Overall Accuracy: 0.8226
- Accuracy Unlabeled: nan
- Accuracy Tool: 0.6195
- Accuracy Wear: 0.8334
- Iou Unlabeled: 0.0
- Iou Tool: 0.2973
- Iou Wear: 0.8202

## Model description

SegFormer pairs a hierarchical Transformer encoder (here the lightweight MiT-B0 backbone) with an all-MLP decode head for semantic segmentation. This checkpoint was fine-tuned to segment images of cutting tools into three classes: `unlabeled`, `tool`, and `wear`, matching the per-class metrics reported above.

## Intended uses & limitations

This checkpoint is intended for segmenting tool and wear regions in images of cutting tools; a usage sketch follows below. Its limitations are visible in the evaluation metrics: the `unlabeled` class scores an IoU of 0.0 (with nan accuracy, suggesting it does not occur in the ground truth), which depresses the mean IoU, and tool IoU is low (about 0.30), so predicted tool boundaries should be treated with caution. Wear segmentation (IoU about 0.82) is the model's strongest output.
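
A minimal inference sketch with the `transformers` API. The repo id and image path below are placeholders, not confirmed by the card; substitute the actual checkpoint location:

```python
from PIL import Image
import torch
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Placeholder repo id / local path; substitute the actual checkpoint location.
checkpoint = "segformer-b0-finetuned-segments-toolwear"

processor = SegformerImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)
model.eval()

image = Image.open("tool_image.png").convert("RGB")  # placeholder image path
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]  # (H, W) tensor of class ids
```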

## Training and evaluation data

The Trainer did not record the dataset (it is listed as "unknown" above). Judging from the label set, it is a tool-wear semantic-segmentation dataset annotated with `unlabeled`, `tool`, and `wear` masks; the validation metrics in the results table below were computed on its evaluation split.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
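
A sketch of the `TrainingArguments` these values imply. The output path, dataset objects, and the 20-step evaluation cadence (read off the results table below) are assumptions; the optimizer and scheduler defaults already match the card:

```python
from transformers import Trainer, TrainingArguments

args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-toolwear",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="steps",  # assumed: the table logs validation every 20 steps
    eval_steps=20,
    logging_steps=20,
)

trainer = Trainer(
    model=model,                      # SegformerForSemanticSegmentation instance
    args=args,
    train_dataset=train_ds,           # placeholder dataset objects
    eval_dataset=eval_ds,
    compute_metrics=compute_metrics,  # see the sketch after the results table
)
trainer.train()
```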

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Tool | Accuracy Wear | Iou Unlabeled | Iou Tool | Iou Wear |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-------------:|:-------------:|:-------------:|:--------:|:--------:|
| 0.7196        | 1.82  | 20   | 0.9873          | 0.2927   | 0.4996        | 0.6806           | nan                | 0.2982        | 0.7009        | 0.0           | 0.2025   | 0.6757   |
| 0.6004        | 3.64  | 40   | 0.7373          | 0.3312   | 0.6517        | 0.7107           | nan                | 0.5861        | 0.7173        | 0.0           | 0.2916   | 0.7019   |
| 0.5155        | 5.45  | 60   | 0.6634          | 0.3376   | 0.5621        | 0.6378           | nan                | 0.4778        | 0.6463        | 0.0           | 0.3840   | 0.6289   |
| 0.4228        | 7.27  | 80   | 0.5380          | 0.3612   | 0.6707        | 0.7661           | nan                | 0.5646        | 0.7768        | 0.0           | 0.3241   | 0.7595   |
| 0.3216        | 9.09  | 100  | 0.5102          | 0.3466   | 0.6845        | 0.7281           | nan                | 0.6361        | 0.7330        | 0.0           | 0.3188   | 0.7209   |
| 0.3752        | 10.91 | 120  | 0.4615          | 0.3902   | 0.7013        | 0.8268           | nan                | 0.5616        | 0.8409        | 0.0           | 0.3476   | 0.8229   |
| 0.3014        | 12.73 | 140  | 0.4504          | 0.4075   | 0.7007        | 0.8311           | nan                | 0.5558        | 0.8457        | 0.0           | 0.3949   | 0.8275   |
| 0.2183        | 14.55 | 160  | 0.4241          | 0.3708   | 0.7363        | 0.8002           | nan                | 0.6653        | 0.8073        | 0.0           | 0.3165   | 0.7959   |
| 0.1674        | 16.36 | 180  | 0.4173          | 0.4020   | 0.7433        | 0.8684           | nan                | 0.6041        | 0.8824        | 0.0           | 0.3397   | 0.8664   |
| 0.2385        | 18.18 | 200  | 0.4716          | 0.3450   | 0.6543        | 0.7462           | nan                | 0.5520        | 0.7566        | 0.0           | 0.2941   | 0.7410   |
| 0.1588        | 20.0  | 220  | 0.3742          | 0.3820   | 0.7108        | 0.8179           | nan                | 0.5917        | 0.8299        | 0.0           | 0.3311   | 0.8149   |
| 0.1553        | 21.82 | 240  | 0.3677          | 0.3811   | 0.7312        | 0.8313           | nan                | 0.6199        | 0.8426        | 0.0           | 0.3144   | 0.8291   |
| 0.1765        | 23.64 | 260  | 0.4131          | 0.3689   | 0.7032        | 0.8024           | nan                | 0.5929        | 0.8135        | 0.0           | 0.3082   | 0.7985   |
| 0.2516        | 25.45 | 280  | 0.3632          | 0.4142   | 0.7158        | 0.8856           | nan                | 0.5270        | 0.9047        | 0.0           | 0.3585   | 0.8841   |
| 0.1534        | 27.27 | 300  | 0.3979          | 0.3813   | 0.7191        | 0.8236           | nan                | 0.6029        | 0.8354        | 0.0           | 0.3231   | 0.8209   |
| 0.1104        | 29.09 | 320  | 0.3787          | 0.3640   | 0.7439        | 0.8044           | nan                | 0.6765        | 0.8112        | 0.0           | 0.2911   | 0.8007   |
| 0.1799        | 30.91 | 340  | 0.3654          | 0.3868   | 0.7217        | 0.8257           | nan                | 0.6060        | 0.8374        | 0.0           | 0.3378   | 0.8227   |
| 0.1069        | 32.73 | 360  | 0.3928          | 0.3524   | 0.7171        | 0.7606           | nan                | 0.6687        | 0.7655        | 0.0           | 0.3018   | 0.7554   |
| 0.1178        | 34.55 | 380  | 0.3703          | 0.3622   | 0.7259        | 0.8079           | nan                | 0.6345        | 0.8172        | 0.0           | 0.2814   | 0.8052   |
| 0.1191        | 36.36 | 400  | 0.3636          | 0.3766   | 0.7396        | 0.8264           | nan                | 0.6431        | 0.8361        | 0.0           | 0.3069   | 0.8230   |
| 0.2008        | 38.18 | 420  | 0.3836          | 0.3685   | 0.7249        | 0.7907           | nan                | 0.6516        | 0.7981        | 0.0           | 0.3194   | 0.7860   |
| 0.0846        | 40.0  | 440  | 0.3602          | 0.3738   | 0.7285        | 0.8244           | nan                | 0.6218        | 0.8352        | 0.0           | 0.2994   | 0.8219   |
| 0.1178        | 41.82 | 460  | 0.3631          | 0.3751   | 0.7224        | 0.8311           | nan                | 0.6015        | 0.8433        | 0.0           | 0.2964   | 0.8288   |
| 0.0806        | 43.64 | 480  | 0.3631          | 0.3678   | 0.7233        | 0.8074           | nan                | 0.6297        | 0.8169        | 0.0           | 0.2988   | 0.8045   |
| 0.1102        | 45.45 | 500  | 0.3731          | 0.3686   | 0.7113        | 0.8067           | nan                | 0.6053        | 0.8174        | 0.0           | 0.3025   | 0.8032   |
| 0.0751        | 47.27 | 520  | 0.3671          | 0.3682   | 0.7249        | 0.8117           | nan                | 0.6283        | 0.8215        | 0.0           | 0.2959   | 0.8085   |
| 0.1272        | 49.09 | 540  | 0.3547          | 0.3725   | 0.7265        | 0.8226           | nan                | 0.6195        | 0.8334        | 0.0           | 0.2973   | 0.8202   |
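
The metric columns above are what the `evaluate` library's `mean_iou` metric reports for a 3-class problem. A sketch of a `compute_metrics` function that would produce them; `num_labels=3` matches the unlabeled / tool / wear classes, and `ignore_index=255` is an assumption:

```python
import evaluate
import numpy as np
import torch

metric = evaluate.load("mean_iou")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    # Logits come out at 1/4 resolution; upsample to the label size first.
    upsampled = torch.nn.functional.interpolate(
        torch.from_numpy(logits),
        size=labels.shape[-2:],
        mode="bilinear",
        align_corners=False,
    )
    preds = upsampled.argmax(dim=1).numpy()
    results = metric.compute(
        predictions=preds,
        references=labels,
        num_labels=3,
        ignore_index=255,      # assumption: pixels with this id are skipped
        reduce_labels=False,
    )
    # per_category_iou / per_category_accuracy expand into the
    # per-class Accuracy/Iou columns of the table above.
    return results
```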


### Framework versions

- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3