---
license: other
base_model: nvidia/segformer-b5-finetuned-cityscapes-1024-1024
tags:
- vision
- image-segmentation
- generated_from_trainer
model-index:
- name: segformer-b5-cityscapes-finetuned-coastTrain
  results: []
---

# segformer-b5-cityscapes-finetuned-coastTrain

This model is a fine-tuned version of [nvidia/segformer-b5-finetuned-cityscapes-1024-1024](https://huggingface.co./nvidia/segformer-b5-finetuned-cityscapes-1024-1024) on the peldrak/coastTrain dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4253
- Mean Iou: 0.5585
- Mean Accuracy: 0.6197
- Overall Accuracy: 0.8740
- Accuracy Water: 0.9765
- Accuracy Whitewater: 0.0159
- Accuracy Sediment: 0.6122
- Accuracy Other Natural Terrain: 0.0
- Accuracy Vegetation: 0.9255
- Accuracy Development: 0.8619
- Accuracy Unknown: 0.9457
- Iou Water: 0.8021
- Iou Whitewater: 0.0158
- Iou Sediment: 0.5787
- Iou Other Natural Terrain: 0.0
- Iou Vegetation: 0.8069
- Iou Development: 0.7835
- Iou Unknown: 0.9224
- F1 Score: 0.8596

## Model description

More information needed

## Intended uses & limitations

More information needed
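
Pending fuller documentation, the sketch below shows one way to run inference with the `transformers` library. It is a minimal, illustrative example: the checkpoint id is assumed to match the model name above, and the image path is a placeholder.

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, SegformerForSemanticSegmentation

# Assumed repo id based on the model name above; adjust if the checkpoint lives elsewhere.
checkpoint = "segformer-b5-cityscapes-finetuned-coastTrain"

processor = AutoImageProcessor.from_pretrained(checkpoint)
model = SegformerForSemanticSegmentation.from_pretrained(checkpoint)

image = Image.open("coastal_scene.jpg")  # placeholder input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# SegFormer logits come out at a reduced resolution; upsample to the input size
# before taking the per-pixel argmax over the coastal classes.
logits = torch.nn.functional.interpolate(
    outputs.logits,
    size=image.size[::-1],  # PIL size is (width, height), interpolate wants (height, width)
    mode="bilinear",
    align_corners=False,
)
segmentation = logits.argmax(dim=1)[0]  # (H, W) tensor of predicted class indices
```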

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
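
As a rough illustration only, these settings correspond approximately to a `Trainer` configuration like the one below; this is an assumed sketch, not the original training script, and dataset loading, model setup, and metric computation are omitted.

```python
from transformers import TrainingArguments

# Approximate reconstruction of the hyperparameters listed above.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults.
training_args = TrainingArguments(
    output_dir="segformer-b5-cityscapes-finetuned-coastTrain",
    learning_rate=6e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
)
```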

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Water | Accuracy Whitewater | Accuracy Sediment | Accuracy Other Natural Terrain | Accuracy Vegetation | Accuracy Development | Accuracy Unknown | Iou Water | Iou Whitewater | Iou Sediment | Iou Other Natural Terrain | Iou Vegetation | Iou Development | Iou Unknown | F1 Score |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:--------------:|:-------------------:|:-----------------:|:------------------------------:|:-------------------:|:--------------------:|:----------------:|:---------:|:--------------:|:------------:|:-------------------------:|:--------------:|:---------------:|:-----------:|:--------:|
| 1.7218        | 0.16  | 20   | 1.5229          | 0.3639   | 0.4826        | 0.6959           | 0.6530         | 0.0033              | 0.6952            | 0.0067                         | 0.8127              | 0.2824               | 0.9249           | 0.5955    | 0.0030         | 0.3817       | 0.0063                    | 0.5740         | 0.2453          | 0.7415      | 0.6829   |
| 1.3084        | 0.31  | 40   | 1.1275          | 0.4131   | 0.5060        | 0.7566           | 0.7956         | 0.0011              | 0.5967            | 0.0                            | 0.9496              | 0.3131               | 0.8858           | 0.6821    | 0.0011         | 0.4697       | 0.0                       | 0.6092         | 0.2891          | 0.8406      | 0.7379   |
| 1.1538        | 0.47  | 60   | 0.8108          | 0.4722   | 0.5593        | 0.8123           | 0.8802         | 0.0001              | 0.6946            | 0.0                            | 0.9169              | 0.5078               | 0.9153           | 0.7833    | 0.0001         | 0.4768       | 0.0                       | 0.7188         | 0.4376          | 0.8889      | 0.8002   |
| 0.9791        | 0.62  | 80   | 0.6995          | 0.5264   | 0.6143        | 0.8518           | 0.9168         | 0.0000              | 0.7471            | 0.0                            | 0.8479              | 0.8453               | 0.9433           | 0.8301    | 0.0000         | 0.5727       | 0.0                       | 0.7406         | 0.6249          | 0.9164      | 0.8421   |
| 1.0426        | 0.78  | 100  | 0.5931          | 0.5280   | 0.6063        | 0.8523           | 0.8932         | 0.0003              | 0.6361            | 0.0                            | 0.9550              | 0.8282               | 0.9309           | 0.8097    | 0.0003         | 0.5481       | 0.0                       | 0.7440         | 0.6697          | 0.9243      | 0.8402   |
| 0.8008        | 0.93  | 120  | 0.4687          | 0.5485   | 0.6225        | 0.8706           | 0.9263         | 0.0                 | 0.7444            | 0.0                            | 0.9248              | 0.8212               | 0.9410           | 0.8404    | 0.0            | 0.5924       | 0.0                       | 0.7871         | 0.6857          | 0.9337      | 0.8595   |
| 1.0298        | 1.09  | 140  | 0.4732          | 0.5527   | 0.6244        | 0.8726           | 0.9421         | 0.0000              | 0.8164            | 0.0                            | 0.9047              | 0.7891               | 0.9185           | 0.8289    | 0.0000         | 0.6400       | 0.0                       | 0.7976         | 0.6991          | 0.9036      | 0.8617   |
| 0.4902        | 1.24  | 160  | 0.3911          | 0.5713   | 0.6310        | 0.8868           | 0.9694         | 0.0                 | 0.7543            | 0.0                            | 0.9348              | 0.8241               | 0.9344           | 0.8366    | 0.0            | 0.6816       | 0.0                       | 0.8102         | 0.7408          | 0.9295      | 0.8744   |
| 0.8204        | 1.4   | 180  | 0.4865          | 0.5210   | 0.5894        | 0.8522           | 0.9765         | 0.0                 | 0.4534            | 0.0                            | 0.9521              | 0.8103               | 0.9336           | 0.7833    | 0.0            | 0.4303       | 0.0                       | 0.7921         | 0.7097          | 0.9313      | 0.8322   |
| 1.1865        | 1.55  | 200  | 0.3980          | 0.5668   | 0.6352        | 0.8838           | 0.9644         | 0.0000              | 0.7632            | 0.0                            | 0.8985              | 0.8816               | 0.9385           | 0.8442    | 0.0000         | 0.6333       | 0.0                       | 0.8133         | 0.7431          | 0.9338      | 0.8722   |
| 0.5676        | 1.71  | 220  | 0.3955          | 0.5598   | 0.6352        | 0.8750           | 0.9299         | 0.0                 | 0.8440            | 0.0                            | 0.9085              | 0.8890               | 0.8747           | 0.8160    | 0.0            | 0.6601       | 0.0                       | 0.8209         | 0.7499          | 0.8721      | 0.8647   |
| 0.9343        | 1.86  | 240  | 0.3969          | 0.5809   | 0.6445        | 0.8944           | 0.9593         | 0.0001              | 0.8201            | 0.0                            | 0.9120              | 0.8658               | 0.9539           | 0.8589    | 0.0001         | 0.6678       | 0.0                       | 0.8327         | 0.7744          | 0.9326      | 0.8829   |
| 0.5811        | 2.02  | 260  | 0.4253          | 0.5585   | 0.6197        | 0.8740           | 0.9765         | 0.0159              | 0.6122            | 0.0                            | 0.9255              | 0.8619               | 0.9457           | 0.8021    | 0.0158         | 0.5787       | 0.0                       | 0.8069         | 0.7835          | 0.9224      | 0.8596   |


### Framework versions

- Transformers 4.37.0
- Pytorch 2.1.2
- Datasets 2.18.0
- Tokenizers 0.15.1