# segformer-b0-finetuned-batch1w5-15Dec
This model is a fine-tuned version of PushkarA07/segformer-b0-finetuned-batch2w5-15Dec on the PushkarA07/batch1-tiles_W5 dataset. It achieves the following results on the evaluation set:
- Loss: 0.0038
- Mean Iou: 0.9143
- Mean Accuracy: 0.9529
- Overall Accuracy: 0.9985
- Accuracy Abnormality: 0.9066
- Iou Abnormality: 0.8302
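The metrics above follow the usual semantic-segmentation definitions: per-class IoU and per-class accuracy (recall) computed from a pixel confusion matrix, with "Abnormality" being the positive class. A minimal sketch of those formulas, using an illustrative toy confusion matrix (the counts are not from this run):

```python
import numpy as np

def segmentation_metrics(conf):
    """Aggregate metrics from a pixel confusion matrix.

    conf[i, j] = number of pixels with true class i predicted as class j.
    Class 0 is background, class 1 is "abnormality" in this card's terms.
    """
    conf = np.asarray(conf, dtype=np.float64)
    tp = np.diag(conf)
    fn = conf.sum(axis=1) - tp          # pixels of class i predicted as something else
    fp = conf.sum(axis=0) - tp          # pixels wrongly predicted as class i
    iou = tp / (tp + fp + fn)           # per-class intersection over union
    acc = tp / (tp + fn)                # per-class accuracy (recall)
    return {
        "mean_iou": iou.mean(),
        "mean_accuracy": acc.mean(),
        "overall_accuracy": tp.sum() / conf.sum(),
        "accuracy_abnormality": acc[1],
        "iou_abnormality": iou[1],
    }

# Toy 2-class example: 1000 pixels total, the abnormality class is tiny,
# which is why overall accuracy stays near 1.0 while IoU Abnormality is lower.
m = segmentation_metrics([[990, 2],
                          [3, 5]])
```

The class imbalance visible here explains the gap in the reported numbers: Overall Accuracy (0.9985) is dominated by background pixels, while Iou Abnormality (0.8302) is the stricter, more informative figure for the rare class.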
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
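With the linear scheduler and these settings, the learning rate decays from 6e-05 to zero over the run (1200 optimizer steps at 100 epochs, matching the results table below). A minimal sketch of that schedule, assuming no warmup since the card lists none:

```python
def linear_lr(step, total_steps=1200, base_lr=6e-05, warmup_steps=0):
    """Linear LR schedule: ramp up over warmup_steps, then decay linearly to 0.

    warmup_steps=0 is an assumption; this card does not list a warmup setting.
    """
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

For example, the rate is 6e-05 at step 0, half that at step 600 (epoch 50), and 0 at step 1200.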
### Training results
| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Abnormality | Iou Abnormality |
|---|---|---|---|---|---|---|---|---|
0.0085 | 0.8333 | 10 | 0.0101 | 0.8170 | 0.8872 | 0.9963 | 0.7762 | 0.6376 |
0.0097 | 1.6667 | 20 | 0.0078 | 0.8448 | 0.8888 | 0.9971 | 0.7787 | 0.6926 |
0.0102 | 2.5 | 30 | 0.0071 | 0.8563 | 0.9028 | 0.9973 | 0.8068 | 0.7153 |
0.0062 | 3.3333 | 40 | 0.0066 | 0.8618 | 0.9005 | 0.9975 | 0.8018 | 0.7262 |
0.006 | 4.1667 | 50 | 0.0062 | 0.8693 | 0.9147 | 0.9976 | 0.8304 | 0.7410 |
0.0086 | 5.0 | 60 | 0.0060 | 0.8726 | 0.9194 | 0.9976 | 0.8398 | 0.7475 |
0.0056 | 5.8333 | 70 | 0.0056 | 0.8773 | 0.9128 | 0.9978 | 0.8264 | 0.7568 |
0.0044 | 6.6667 | 80 | 0.0056 | 0.8789 | 0.9270 | 0.9978 | 0.8550 | 0.7601 |
0.0045 | 7.5 | 90 | 0.0054 | 0.8818 | 0.9239 | 0.9978 | 0.8487 | 0.7658 |
0.0087 | 8.3333 | 100 | 0.0053 | 0.8850 | 0.9347 | 0.9979 | 0.8705 | 0.7721 |
0.0045 | 9.1667 | 110 | 0.0052 | 0.8835 | 0.9150 | 0.9979 | 0.8306 | 0.7692 |
0.0051 | 10.0 | 120 | 0.0051 | 0.8888 | 0.9360 | 0.9980 | 0.8730 | 0.7798 |
0.0045 | 10.8333 | 130 | 0.0049 | 0.8904 | 0.9270 | 0.9980 | 0.8547 | 0.7827 |
0.0068 | 11.6667 | 140 | 0.0048 | 0.8904 | 0.9290 | 0.9980 | 0.8589 | 0.7828 |
0.0029 | 12.5 | 150 | 0.0048 | 0.8924 | 0.9394 | 0.9980 | 0.8799 | 0.7867 |
0.0051 | 13.3333 | 160 | 0.0048 | 0.8943 | 0.9361 | 0.9981 | 0.8731 | 0.7906 |
0.0038 | 14.1667 | 170 | 0.0047 | 0.8953 | 0.9394 | 0.9981 | 0.8796 | 0.7926 |
0.0075 | 15.0 | 180 | 0.0047 | 0.8967 | 0.9416 | 0.9981 | 0.8841 | 0.7952 |
0.0054 | 15.8333 | 190 | 0.0047 | 0.8954 | 0.9315 | 0.9981 | 0.8637 | 0.7928 |
0.0031 | 16.6667 | 200 | 0.0046 | 0.8973 | 0.9373 | 0.9981 | 0.8755 | 0.7965 |
0.0049 | 17.5 | 210 | 0.0046 | 0.8970 | 0.9300 | 0.9982 | 0.8606 | 0.7958 |
0.0049 | 18.3333 | 220 | 0.0045 | 0.9001 | 0.9430 | 0.9982 | 0.8870 | 0.8019 |
0.0038 | 19.1667 | 230 | 0.0045 | 0.9002 | 0.9485 | 0.9982 | 0.8979 | 0.8022 |
0.0074 | 20.0 | 240 | 0.0045 | 0.9009 | 0.9424 | 0.9982 | 0.8856 | 0.8036 |
0.0048 | 20.8333 | 250 | 0.0045 | 0.9008 | 0.9473 | 0.9982 | 0.8955 | 0.8034 |
0.0058 | 21.6667 | 260 | 0.0045 | 0.9011 | 0.9464 | 0.9982 | 0.8938 | 0.8039 |
0.0051 | 22.5 | 270 | 0.0044 | 0.9029 | 0.9421 | 0.9983 | 0.8850 | 0.8075 |
0.0062 | 23.3333 | 280 | 0.0043 | 0.9026 | 0.9379 | 0.9983 | 0.8766 | 0.8070 |
0.0051 | 24.1667 | 290 | 0.0044 | 0.9027 | 0.9440 | 0.9982 | 0.8888 | 0.8071 |
0.0026 | 25.0 | 300 | 0.0043 | 0.9043 | 0.9443 | 0.9983 | 0.8894 | 0.8103 |
0.007 | 25.8333 | 310 | 0.0043 | 0.9042 | 0.9498 | 0.9983 | 0.9004 | 0.8102 |
0.0041 | 26.6667 | 320 | 0.0043 | 0.9046 | 0.9454 | 0.9983 | 0.8916 | 0.8110 |
0.0045 | 27.5 | 330 | 0.0043 | 0.9048 | 0.9427 | 0.9983 | 0.8862 | 0.8114 |
0.0041 | 28.3333 | 340 | 0.0043 | 0.9055 | 0.9490 | 0.9983 | 0.8988 | 0.8128 |
0.0024 | 29.1667 | 350 | 0.0042 | 0.9064 | 0.9485 | 0.9983 | 0.8979 | 0.8145 |
0.0035 | 30.0 | 360 | 0.0042 | 0.9061 | 0.9424 | 0.9983 | 0.8856 | 0.8139 |
0.003 | 30.8333 | 370 | 0.0042 | 0.9063 | 0.9523 | 0.9983 | 0.9056 | 0.8142 |
0.0054 | 31.6667 | 380 | 0.0042 | 0.9074 | 0.9447 | 0.9983 | 0.8902 | 0.8165 |
0.0054 | 32.5 | 390 | 0.0042 | 0.9064 | 0.9480 | 0.9983 | 0.8969 | 0.8144 |
0.0041 | 33.3333 | 400 | 0.0042 | 0.9053 | 0.9471 | 0.9983 | 0.8951 | 0.8123 |
0.0059 | 34.1667 | 410 | 0.0041 | 0.9075 | 0.9439 | 0.9983 | 0.8886 | 0.8166 |
0.0027 | 35.0 | 420 | 0.0042 | 0.9066 | 0.9452 | 0.9983 | 0.8912 | 0.8149 |
0.0052 | 35.8333 | 430 | 0.0042 | 0.9074 | 0.9474 | 0.9983 | 0.8956 | 0.8165 |
0.0042 | 36.6667 | 440 | 0.0041 | 0.9070 | 0.9457 | 0.9983 | 0.8922 | 0.8156 |
0.0037 | 37.5 | 450 | 0.0041 | 0.9076 | 0.9457 | 0.9983 | 0.8922 | 0.8170 |
0.0033 | 38.3333 | 460 | 0.0041 | 0.9084 | 0.9481 | 0.9984 | 0.8970 | 0.8185 |
0.0031 | 39.1667 | 470 | 0.0041 | 0.9085 | 0.9471 | 0.9984 | 0.8949 | 0.8187 |
0.0037 | 40.0 | 480 | 0.0042 | 0.9071 | 0.9543 | 0.9983 | 0.9096 | 0.8159 |
0.0048 | 40.8333 | 490 | 0.0041 | 0.9088 | 0.9500 | 0.9984 | 0.9008 | 0.8192 |
0.0042 | 41.6667 | 500 | 0.0041 | 0.9086 | 0.9474 | 0.9984 | 0.8957 | 0.8188 |
0.0024 | 42.5 | 510 | 0.0040 | 0.9095 | 0.9470 | 0.9984 | 0.8948 | 0.8206 |
0.0047 | 43.3333 | 520 | 0.0040 | 0.9091 | 0.9511 | 0.9984 | 0.9031 | 0.8198 |
0.0054 | 44.1667 | 530 | 0.0041 | 0.9080 | 0.9438 | 0.9984 | 0.8884 | 0.8176 |
0.0053 | 45.0 | 540 | 0.0041 | 0.9084 | 0.9460 | 0.9984 | 0.8928 | 0.8185 |
0.0033 | 45.8333 | 550 | 0.0041 | 0.9094 | 0.9515 | 0.9984 | 0.9038 | 0.8205 |
0.0044 | 46.6667 | 560 | 0.0042 | 0.9076 | 0.9580 | 0.9983 | 0.9171 | 0.8169 |
0.0021 | 47.5 | 570 | 0.0040 | 0.9095 | 0.9501 | 0.9984 | 0.9010 | 0.8206 |
0.0035 | 48.3333 | 580 | 0.0040 | 0.9092 | 0.9529 | 0.9983 | 0.9067 | 0.8200 |
0.0038 | 49.1667 | 590 | 0.0040 | 0.9109 | 0.9505 | 0.9984 | 0.9019 | 0.8234 |
0.004 | 50.0 | 600 | 0.0041 | 0.9103 | 0.9563 | 0.9984 | 0.9134 | 0.8223 |
0.0044 | 50.8333 | 610 | 0.0040 | 0.9106 | 0.9464 | 0.9984 | 0.8936 | 0.8229 |
0.0026 | 51.6667 | 620 | 0.0040 | 0.9104 | 0.9554 | 0.9984 | 0.9116 | 0.8225 |
0.0062 | 52.5 | 630 | 0.0040 | 0.9114 | 0.9510 | 0.9984 | 0.9027 | 0.8244 |
0.0023 | 53.3333 | 640 | 0.0040 | 0.9114 | 0.9470 | 0.9984 | 0.8948 | 0.8244 |
0.0029 | 54.1667 | 650 | 0.0040 | 0.9113 | 0.9508 | 0.9984 | 0.9024 | 0.8242 |
0.0042 | 55.0 | 660 | 0.0040 | 0.9116 | 0.9528 | 0.9984 | 0.9064 | 0.8248 |
0.0044 | 55.8333 | 670 | 0.0039 | 0.9121 | 0.9519 | 0.9984 | 0.9045 | 0.8258 |
0.0016 | 56.6667 | 680 | 0.0040 | 0.9116 | 0.9514 | 0.9984 | 0.9035 | 0.8248 |
0.0044 | 57.5 | 690 | 0.0039 | 0.9116 | 0.9533 | 0.9984 | 0.9075 | 0.8248 |
0.0031 | 58.3333 | 700 | 0.0039 | 0.9118 | 0.9497 | 0.9984 | 0.9002 | 0.8253 |
0.0038 | 59.1667 | 710 | 0.0039 | 0.9119 | 0.9509 | 0.9984 | 0.9025 | 0.8254 |
0.0042 | 60.0 | 720 | 0.0040 | 0.9117 | 0.9535 | 0.9984 | 0.9078 | 0.8250 |
0.0045 | 60.8333 | 730 | 0.0039 | 0.9119 | 0.9512 | 0.9984 | 0.9032 | 0.8254 |
0.0039 | 61.6667 | 740 | 0.0039 | 0.9122 | 0.9507 | 0.9984 | 0.9022 | 0.8260 |
0.0022 | 62.5 | 750 | 0.0040 | 0.9117 | 0.9562 | 0.9984 | 0.9134 | 0.8250 |
0.0039 | 63.3333 | 760 | 0.0039 | 0.9126 | 0.9502 | 0.9984 | 0.9012 | 0.8268 |
0.0031 | 64.1667 | 770 | 0.0039 | 0.9115 | 0.9507 | 0.9984 | 0.9021 | 0.8245 |
0.0037 | 65.0 | 780 | 0.0040 | 0.9118 | 0.9533 | 0.9984 | 0.9074 | 0.8252 |
0.0046 | 65.8333 | 790 | 0.0039 | 0.9123 | 0.9489 | 0.9984 | 0.8986 | 0.8261 |
0.0026 | 66.6667 | 800 | 0.0039 | 0.9127 | 0.9532 | 0.9984 | 0.9073 | 0.8269 |
0.0039 | 67.5 | 810 | 0.0039 | 0.9121 | 0.9469 | 0.9984 | 0.8946 | 0.8258 |
0.0025 | 68.3333 | 820 | 0.0039 | 0.9121 | 0.9541 | 0.9984 | 0.9091 | 0.8259 |
0.0044 | 69.1667 | 830 | 0.0039 | 0.9127 | 0.9531 | 0.9984 | 0.9069 | 0.8270 |
0.0049 | 70.0 | 840 | 0.0039 | 0.9123 | 0.9546 | 0.9984 | 0.9100 | 0.8263 |
0.0038 | 70.8333 | 850 | 0.0039 | 0.9129 | 0.9527 | 0.9984 | 0.9062 | 0.8273 |
0.0053 | 71.6667 | 860 | 0.0039 | 0.9131 | 0.9534 | 0.9984 | 0.9077 | 0.8278 |
0.0049 | 72.5 | 870 | 0.0039 | 0.9128 | 0.9538 | 0.9984 | 0.9083 | 0.8272 |
0.003 | 73.3333 | 880 | 0.0039 | 0.9130 | 0.9503 | 0.9984 | 0.9012 | 0.8276 |
0.0025 | 74.1667 | 890 | 0.0039 | 0.9124 | 0.9583 | 0.9984 | 0.9176 | 0.8264 |
0.0035 | 75.0 | 900 | 0.0039 | 0.9131 | 0.9509 | 0.9984 | 0.9026 | 0.8278 |
0.0028 | 75.8333 | 910 | 0.0039 | 0.9128 | 0.9559 | 0.9984 | 0.9127 | 0.8272 |
0.0027 | 76.6667 | 920 | 0.0039 | 0.9128 | 0.9528 | 0.9984 | 0.9064 | 0.8272 |
0.0033 | 77.5 | 930 | 0.0039 | 0.9133 | 0.9539 | 0.9984 | 0.9086 | 0.8282 |
0.0033 | 78.3333 | 940 | 0.0039 | 0.9135 | 0.9529 | 0.9984 | 0.9065 | 0.8285 |
0.0056 | 79.1667 | 950 | 0.0039 | 0.9134 | 0.9529 | 0.9984 | 0.9067 | 0.8283 |
0.0063 | 80.0 | 960 | 0.0039 | 0.9132 | 0.9495 | 0.9984 | 0.8996 | 0.8279 |
0.0057 | 80.8333 | 970 | 0.0039 | 0.9130 | 0.9563 | 0.9984 | 0.9134 | 0.8276 |
0.0021 | 81.6667 | 980 | 0.0039 | 0.9136 | 0.9511 | 0.9985 | 0.9029 | 0.8287 |
0.0043 | 82.5 | 990 | 0.0039 | 0.9130 | 0.9563 | 0.9984 | 0.9136 | 0.8275 |
0.0048 | 83.3333 | 1000 | 0.0039 | 0.9137 | 0.9525 | 0.9984 | 0.9057 | 0.8289 |
0.0043 | 84.1667 | 1010 | 0.0039 | 0.9133 | 0.9514 | 0.9984 | 0.9035 | 0.8282 |
0.0037 | 85.0 | 1020 | 0.0039 | 0.9137 | 0.9542 | 0.9984 | 0.9092 | 0.8289 |
0.0042 | 85.8333 | 1030 | 0.0038 | 0.9137 | 0.9501 | 0.9985 | 0.9010 | 0.8290 |
0.0039 | 86.6667 | 1040 | 0.0039 | 0.9138 | 0.9550 | 0.9984 | 0.9108 | 0.8292 |
0.0027 | 87.5 | 1050 | 0.0038 | 0.9139 | 0.9517 | 0.9985 | 0.9041 | 0.8294 |
0.0034 | 88.3333 | 1060 | 0.0038 | 0.9138 | 0.9526 | 0.9985 | 0.9060 | 0.8291 |
0.0037 | 89.1667 | 1070 | 0.0039 | 0.9137 | 0.9550 | 0.9984 | 0.9109 | 0.8289 |
0.0029 | 90.0 | 1080 | 0.0038 | 0.9141 | 0.9509 | 0.9985 | 0.9025 | 0.8297 |
0.0038 | 90.8333 | 1090 | 0.0038 | 0.9139 | 0.9535 | 0.9985 | 0.9078 | 0.8294 |
0.0066 | 91.6667 | 1100 | 0.0039 | 0.9138 | 0.9545 | 0.9984 | 0.9097 | 0.8292 |
0.0037 | 92.5 | 1110 | 0.0039 | 0.9138 | 0.9547 | 0.9984 | 0.9102 | 0.8292 |
0.0053 | 93.3333 | 1120 | 0.0038 | 0.9143 | 0.9518 | 0.9985 | 0.9044 | 0.8301 |
0.0039 | 94.1667 | 1130 | 0.0038 | 0.9141 | 0.9523 | 0.9985 | 0.9054 | 0.8298 |
0.0049 | 95.0 | 1140 | 0.0038 | 0.9143 | 0.9520 | 0.9985 | 0.9047 | 0.8302 |
0.004 | 95.8333 | 1150 | 0.0038 | 0.9142 | 0.9535 | 0.9985 | 0.9077 | 0.8300 |
0.0033 | 96.6667 | 1160 | 0.0038 | 0.9142 | 0.9536 | 0.9985 | 0.9080 | 0.8300 |
0.0037 | 97.5 | 1170 | 0.0038 | 0.9141 | 0.9536 | 0.9985 | 0.9079 | 0.8298 |
0.0036 | 98.3333 | 1180 | 0.0038 | 0.9140 | 0.9536 | 0.9985 | 0.9080 | 0.8295 |
0.0042 | 99.1667 | 1190 | 0.0038 | 0.9141 | 0.9538 | 0.9985 | 0.9085 | 0.8298 |
0.0035 | 100.0 | 1200 | 0.0038 | 0.9143 | 0.9529 | 0.9985 | 0.9066 | 0.8302 |
### Framework versions
- Transformers 4.46.3
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.20.3