---
license: other
tags:
- generated_from_trainer
model-index:
- name: segformer-b0-finetuned-segments-toolwear
  results: []
---


# segformer-b0-finetuned-segments-toolwear

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co./nvidia/mit-b0) on an unknown dataset.
It achieves the following results on the evaluation set (a sketch of how such metrics are typically computed follows the list):
- Loss: 0.1291
- Mean IoU: 0.4322
- Mean Accuracy: 0.8644
- Overall Accuracy: 0.8644
- Accuracy Unlabeled: nan
- Accuracy Tool: nan
- Accuracy Wear: 0.8644
- IoU Unlabeled: 0.0
- IoU Tool: nan
- IoU Wear: 0.8644
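The `nan` entries above most likely mean the corresponding class never occurs in the evaluation masks (per-category scores are undefined for absent classes), and `IoU Unlabeled: 0.0` suggests `unlabeled` predictions never overlap ground-truth pixels. Below is a minimal sketch of how these metrics are typically computed for SegFormer fine-tuning, assuming the `evaluate` library's `mean_iou` metric and a three-class label set (`unlabeled`, `tool`, `wear`) inferred from the metric names; neither assumption is confirmed by this card.

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Dummy prediction/reference label maps standing in for real model output
# and ground-truth masks (shape: batch x height x width, integer class ids).
preds = np.zeros((1, 64, 64), dtype=np.int64)
labels = np.zeros((1, 64, 64), dtype=np.int64)

results = metric.compute(
    predictions=preds,
    references=labels,
    num_labels=3,      # unlabeled, tool, wear -- assumed label set
    ignore_index=255,  # common convention for pixels excluded from scoring
    reduce_labels=False,
)
# Result keys mirror this card's metrics: mean_iou, mean_accuracy,
# overall_accuracy, per_category_iou, per_category_accuracy.
print(results["mean_iou"], results["per_category_iou"])
```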

## Model description

More information needed. From what the card records, this is a SegFormer semantic-segmentation model with the lightweight MiT-B0 backbone, fine-tuned from [nvidia/mit-b0](https://huggingface.co./nvidia/mit-b0); the metric names suggest a three-class label set: `unlabeled`, `tool`, and `wear`.

## Intended uses & limitations

More information needed. Judging by the model name and metrics, the intended use is semantic segmentation of wear regions on machining tools. Limitations visible in the card: the training dataset is not recorded, the `tool` class contributes no evaluation signal (its accuracy and IoU are `nan`), and `IoU Unlabeled` is 0.0, so the reported performance effectively reflects the `wear` class alone. A minimal, hedged inference sketch follows.
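This sketch assumes the checkpoint is available locally or on the Hub under this card's name (a hypothetical path) and that preprocessing matches the base checkpoint's defaults, since the card does not record an image-processor config.

```python
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

# Preprocessing config borrowed from the base checkpoint (assumption).
processor = SegformerImageProcessor.from_pretrained("nvidia/mit-b0")
# Hypothetical repo id / local path for this fine-tuned model.
model = SegformerForSemanticSegmentation.from_pretrained(
    "segformer-b0-finetuned-segments-toolwear"
)

image = Image.open("tool.png").convert("RGB")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before taking the argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)[0]  # per-pixel class ids
```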

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged reproduction sketch follows the list):
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
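The same configuration can be expressed with the `Trainer` API. The snippet below is a reproduction sketch under stated assumptions, not the author's actual script: the label mapping is inferred from the metric names, and the unrecorded datasets must be supplied before a `Trainer` can be built. The reported optimizer (Adam, betas=(0.9, 0.999), epsilon=1e-08) matches the Trainer's default AdamW settings.

```python
from transformers import SegformerForSemanticSegmentation, TrainingArguments

id2label = {0: "unlabeled", 1: "tool", 2: "wear"}  # assumed from metric names

model = SegformerForSemanticSegmentation.from_pretrained(
    "nvidia/mit-b0",
    num_labels=len(id2label),
    id2label=id2label,
    label2id={v: k for k, v in id2label.items()},  # decode head is freshly initialized
)

args = TrainingArguments(
    output_dir="segformer-b0-finetuned-segments-toolwear",
    learning_rate=6e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
)
# Trainer(model=model, args=args, train_dataset=..., eval_dataset=...) would
# complete the setup once the (unrecorded) datasets are available.
```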

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean IoU | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Tool | Accuracy Wear | IoU Unlabeled | IoU Tool | IoU Wear |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-------------:|:-------------:|:-------------:|:--------:|:--------:|
| 0.8371        | 1.82  | 20   | 0.9482          | 0.3285   | 0.9854        | 0.9854           | nan                | nan           | 0.9854        | 0.0           | 0.0      | 0.9854   |
| 0.6335        | 3.64  | 40   | 0.7489          | 0.4996   | 0.9992        | 0.9992           | nan                | nan           | 0.9992        | 0.0           | nan      | 0.9992   |
| 0.5053        | 5.45  | 60   | 0.5400          | 0.4975   | 0.9949        | 0.9949           | nan                | nan           | 0.9949        | 0.0           | nan      | 0.9949   |
| 0.3924        | 7.27  | 80   | 0.4544          | 0.4905   | 0.9810        | 0.9810           | nan                | nan           | 0.9810        | 0.0           | nan      | 0.9810   |
| 0.3419        | 9.09  | 100  | 0.3840          | 0.4727   | 0.9455        | 0.9455           | nan                | nan           | 0.9455        | 0.0           | nan      | 0.9455   |
| 0.3379        | 10.91 | 120  | 0.3407          | 0.4648   | 0.9296        | 0.9296           | nan                | nan           | 0.9296        | 0.0           | nan      | 0.9296   |
| 0.2639        | 12.73 | 140  | 0.3495          | 0.4780   | 0.9559        | 0.9559           | nan                | nan           | 0.9559        | 0.0           | nan      | 0.9559   |
| 0.224         | 14.55 | 160  | 0.2815          | 0.4541   | 0.9081        | 0.9081           | nan                | nan           | 0.9081        | 0.0           | nan      | 0.9081   |
| 0.1725        | 16.36 | 180  | 0.2896          | 0.4599   | 0.9199        | 0.9199           | nan                | nan           | 0.9199        | 0.0           | nan      | 0.9199   |
| 0.1623        | 18.18 | 200  | 0.2540          | 0.4679   | 0.9359        | 0.9359           | nan                | nan           | 0.9359        | 0.0           | nan      | 0.9359   |
| 0.1724        | 20.0  | 220  | 0.2567          | 0.4702   | 0.9404        | 0.9404           | nan                | nan           | 0.9404        | 0.0           | nan      | 0.9404   |
| 0.1503        | 21.82 | 240  | 0.1967          | 0.4459   | 0.8919        | 0.8919           | nan                | nan           | 0.8919        | 0.0           | nan      | 0.8919   |
| 0.1189        | 23.64 | 260  | 0.2153          | 0.4617   | 0.9234        | 0.9234           | nan                | nan           | 0.9234        | 0.0           | nan      | 0.9234   |
| 0.1007        | 25.45 | 280  | 0.1695          | 0.4324   | 0.8648        | 0.8648           | nan                | nan           | 0.8648        | 0.0           | nan      | 0.8648   |
| 0.0921        | 27.27 | 300  | 0.1540          | 0.4346   | 0.8691        | 0.8691           | nan                | nan           | 0.8691        | 0.0           | nan      | 0.8691   |
| 0.0897        | 29.09 | 320  | 0.1657          | 0.4538   | 0.9077        | 0.9077           | nan                | nan           | 0.9077        | 0.0           | nan      | 0.9077   |
| 0.0814        | 30.91 | 340  | 0.1519          | 0.4374   | 0.8749        | 0.8749           | nan                | nan           | 0.8749        | 0.0           | nan      | 0.8749   |
| 0.0729        | 32.73 | 360  | 0.1444          | 0.4430   | 0.8861        | 0.8861           | nan                | nan           | 0.8861        | 0.0           | nan      | 0.8861   |
| 0.0892        | 34.55 | 380  | 0.1283          | 0.4106   | 0.8213        | 0.8213           | nan                | nan           | 0.8213        | 0.0           | nan      | 0.8213   |
| 0.07          | 36.36 | 400  | 0.1442          | 0.4374   | 0.8748        | 0.8748           | nan                | nan           | 0.8748        | 0.0           | nan      | 0.8748   |
| 0.0619        | 38.18 | 420  | 0.1391          | 0.4296   | 0.8592        | 0.8592           | nan                | nan           | 0.8592        | 0.0           | nan      | 0.8592   |
| 0.0563        | 40.0  | 440  | 0.1283          | 0.4402   | 0.8804        | 0.8804           | nan                | nan           | 0.8804        | 0.0           | nan      | 0.8804   |
| 0.0582        | 41.82 | 460  | 0.1275          | 0.4297   | 0.8595        | 0.8595           | nan                | nan           | 0.8595        | 0.0           | nan      | 0.8595   |
| 0.0575        | 43.64 | 480  | 0.1341          | 0.4362   | 0.8724        | 0.8724           | nan                | nan           | 0.8724        | 0.0           | nan      | 0.8724   |
| 0.068         | 45.45 | 500  | 0.1132          | 0.4181   | 0.8362        | 0.8362           | nan                | nan           | 0.8362        | 0.0           | nan      | 0.8362   |
| 0.0595        | 47.27 | 520  | 0.1285          | 0.4316   | 0.8632        | 0.8632           | nan                | nan           | 0.8632        | 0.0           | nan      | 0.8632   |
| 0.0558        | 49.09 | 540  | 0.1291          | 0.4322   | 0.8644        | 0.8644           | nan                | nan           | 0.8644        | 0.0           | nan      | 0.8644   |


### Framework versions

- Transformers 4.28.0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3