HorcruxNo13 committed
Commit ac822b6
1 Parent(s): 5a0002a

update model card README.md

Files changed (1)
  1. README.md +52 -50
README.md CHANGED
@@ -14,14 +14,16 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 0.0223
- - Mean Iou: 0.4979
- - Mean Accuracy: 0.9957
- - Overall Accuracy: 0.9957
+ - Loss: 0.0360
+ - Mean Iou: 0.3724
+ - Mean Accuracy: 0.7448
+ - Overall Accuracy: 0.7448
  - Accuracy Unlabeled: nan
- - Accuracy Tool: 0.9957
+ - Accuracy Tool: nan
+ - Accuracy Wear: 0.7448
  - Iou Unlabeled: 0.0
- - Iou Tool: 0.9957
+ - Iou Tool: nan
+ - Iou Wear: 0.7448
 
  ## Model description
 
@@ -50,50 +52,50 @@ The following hyperparameters were used during training:
 
  ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Tool | Iou Unlabeled | Iou Tool |
- |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-------------:|:-------------:|:--------:|
- | 0.1534 | 1.18 | 20 | 0.3425 | 0.4977 | 0.9955 | 0.9955 | nan | 0.9955 | 0.0 | 0.9955 |
- | 0.091 | 2.35 | 40 | 0.1076 | 0.4948 | 0.9897 | 0.9897 | nan | 0.9897 | 0.0 | 0.9897 |
- | 0.0827 | 3.53 | 60 | 0.0828 | 0.4965 | 0.9931 | 0.9931 | nan | 0.9931 | 0.0 | 0.9931 |
- | 0.0729 | 4.71 | 80 | 0.0795 | 0.4967 | 0.9934 | 0.9934 | nan | 0.9934 | 0.0 | 0.9934 |
- | 0.0825 | 5.88 | 100 | 0.0606 | 0.4910 | 0.9819 | 0.9819 | nan | 0.9819 | 0.0 | 0.9819 |
- | 0.0604 | 7.06 | 120 | 0.0546 | 0.4910 | 0.9820 | 0.9820 | nan | 0.9820 | 0.0 | 0.9820 |
- | 0.0575 | 8.24 | 140 | 0.0460 | 0.4942 | 0.9884 | 0.9884 | nan | 0.9884 | 0.0 | 0.9884 |
- | 0.0592 | 9.41 | 160 | 0.0450 | 0.4906 | 0.9813 | 0.9813 | nan | 0.9813 | 0.0 | 0.9813 |
- | 0.0478 | 10.59 | 180 | 0.0400 | 0.4981 | 0.9962 | 0.9962 | nan | 0.9962 | 0.0 | 0.9962 |
- | 0.046 | 11.76 | 200 | 0.0403 | 0.4982 | 0.9964 | 0.9964 | nan | 0.9964 | 0.0 | 0.9964 |
- | 0.0535 | 12.94 | 220 | 0.0340 | 0.4971 | 0.9941 | 0.9941 | nan | 0.9941 | 0.0 | 0.9941 |
- | 0.0317 | 14.12 | 240 | 0.0332 | 0.4975 | 0.9949 | 0.9949 | nan | 0.9949 | 0.0 | 0.9949 |
- | 0.0352 | 15.29 | 260 | 0.0328 | 0.4982 | 0.9964 | 0.9964 | nan | 0.9964 | 0.0 | 0.9964 |
- | 0.0258 | 16.47 | 280 | 0.0295 | 0.4963 | 0.9926 | 0.9926 | nan | 0.9926 | 0.0 | 0.9926 |
- | 0.0218 | 17.65 | 300 | 0.0265 | 0.4968 | 0.9935 | 0.9935 | nan | 0.9935 | 0.0 | 0.9935 |
- | 0.026 | 18.82 | 320 | 0.0284 | 0.4979 | 0.9958 | 0.9958 | nan | 0.9958 | 0.0 | 0.9958 |
- | 0.026 | 20.0 | 340 | 0.0267 | 0.4971 | 0.9941 | 0.9941 | nan | 0.9941 | 0.0 | 0.9941 |
- | 0.02 | 21.18 | 360 | 0.0242 | 0.4967 | 0.9935 | 0.9935 | nan | 0.9935 | 0.0 | 0.9935 |
- | 0.0255 | 22.35 | 380 | 0.0270 | 0.4975 | 0.9949 | 0.9949 | nan | 0.9949 | 0.0 | 0.9949 |
- | 0.0282 | 23.53 | 400 | 0.0240 | 0.4973 | 0.9946 | 0.9946 | nan | 0.9946 | 0.0 | 0.9946 |
- | 0.0188 | 24.71 | 420 | 0.0244 | 0.4972 | 0.9944 | 0.9944 | nan | 0.9944 | 0.0 | 0.9944 |
- | 0.0196 | 25.88 | 440 | 0.0226 | 0.4961 | 0.9922 | 0.9922 | nan | 0.9922 | 0.0 | 0.9922 |
- | 0.0165 | 27.06 | 460 | 0.0235 | 0.4968 | 0.9937 | 0.9937 | nan | 0.9937 | 0.0 | 0.9937 |
- | 0.02 | 28.24 | 480 | 0.0245 | 0.4981 | 0.9962 | 0.9962 | nan | 0.9962 | 0.0 | 0.9962 |
- | 0.0213 | 29.41 | 500 | 0.0225 | 0.4972 | 0.9944 | 0.9944 | nan | 0.9944 | 0.0 | 0.9944 |
- | 0.0174 | 30.59 | 520 | 0.0221 | 0.4970 | 0.9940 | 0.9940 | nan | 0.9940 | 0.0 | 0.9940 |
- | 0.0163 | 31.76 | 540 | 0.0226 | 0.4975 | 0.9951 | 0.9951 | nan | 0.9951 | 0.0 | 0.9951 |
- | 0.0242 | 32.94 | 560 | 0.0236 | 0.4978 | 0.9956 | 0.9956 | nan | 0.9956 | 0.0 | 0.9956 |
- | 0.0195 | 34.12 | 580 | 0.0217 | 0.4976 | 0.9953 | 0.9953 | nan | 0.9953 | 0.0 | 0.9953 |
- | 0.0134 | 35.29 | 600 | 0.0220 | 0.4974 | 0.9948 | 0.9948 | nan | 0.9948 | 0.0 | 0.9948 |
- | 0.0192 | 36.47 | 620 | 0.0216 | 0.4974 | 0.9947 | 0.9947 | nan | 0.9947 | 0.0 | 0.9947 |
- | 0.0138 | 37.65 | 640 | 0.0219 | 0.4974 | 0.9948 | 0.9948 | nan | 0.9948 | 0.0 | 0.9948 |
- | 0.0147 | 38.82 | 660 | 0.0215 | 0.4973 | 0.9945 | 0.9945 | nan | 0.9945 | 0.0 | 0.9945 |
- | 0.0208 | 40.0 | 680 | 0.0219 | 0.4979 | 0.9958 | 0.9958 | nan | 0.9958 | 0.0 | 0.9958 |
- | 0.0152 | 41.18 | 700 | 0.0211 | 0.4974 | 0.9948 | 0.9948 | nan | 0.9948 | 0.0 | 0.9948 |
- | 0.0145 | 42.35 | 720 | 0.0214 | 0.4977 | 0.9954 | 0.9954 | nan | 0.9954 | 0.0 | 0.9954 |
- | 0.0138 | 43.53 | 740 | 0.0217 | 0.4977 | 0.9954 | 0.9954 | nan | 0.9954 | 0.0 | 0.9954 |
- | 0.0122 | 44.71 | 760 | 0.0218 | 0.4977 | 0.9954 | 0.9954 | nan | 0.9954 | 0.0 | 0.9954 |
- | 0.0201 | 45.88 | 780 | 0.0220 | 0.4976 | 0.9953 | 0.9953 | nan | 0.9953 | 0.0 | 0.9953 |
- | 0.0147 | 47.06 | 800 | 0.0219 | 0.4977 | 0.9954 | 0.9954 | nan | 0.9954 | 0.0 | 0.9954 |
- | 0.0131 | 48.24 | 820 | 0.0213 | 0.4975 | 0.9950 | 0.9950 | nan | 0.9950 | 0.0 | 0.9950 |
- | 0.016 | 49.41 | 840 | 0.0223 | 0.4979 | 0.9957 | 0.9957 | nan | 0.9957 | 0.0 | 0.9957 |
+ | Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Unlabeled | Accuracy Tool | Accuracy Wear | Iou Unlabeled | Iou Tool | Iou Wear |
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:------------------:|:-------------:|:-------------:|:-------------:|:--------:|:--------:|
+ | 0.9316 | 1.18 | 20 | 0.9877 | 0.4510 | 0.9020 | 0.9020 | nan | nan | 0.9020 | 0.0 | nan | 0.9020 |
+ | 0.6902 | 2.35 | 40 | 0.6556 | 0.3765 | 0.7531 | 0.7531 | nan | nan | 0.7531 | 0.0 | nan | 0.7531 |
+ | 0.532 | 3.53 | 60 | 0.4585 | 0.3435 | 0.6871 | 0.6871 | nan | nan | 0.6871 | 0.0 | nan | 0.6871 |
+ | 0.4296 | 4.71 | 80 | 0.3832 | 0.3900 | 0.7799 | 0.7799 | nan | nan | 0.7799 | 0.0 | nan | 0.7799 |
+ | 0.3318 | 5.88 | 100 | 0.3255 | 0.3739 | 0.7478 | 0.7478 | nan | nan | 0.7478 | 0.0 | nan | 0.7478 |
+ | 0.281 | 7.06 | 120 | 0.2480 | 0.3109 | 0.6219 | 0.6219 | nan | nan | 0.6219 | 0.0 | nan | 0.6219 |
+ | 0.2405 | 8.24 | 140 | 0.2410 | 0.2029 | 0.4059 | 0.4059 | nan | nan | 0.4059 | 0.0 | nan | 0.4059 |
+ | 0.1945 | 9.41 | 160 | 0.1969 | 0.3366 | 0.6733 | 0.6733 | nan | nan | 0.6733 | 0.0 | nan | 0.6733 |
+ | 0.1612 | 10.59 | 180 | 0.1776 | 0.3469 | 0.6938 | 0.6938 | nan | nan | 0.6938 | 0.0 | nan | 0.6938 |
+ | 0.1653 | 11.76 | 200 | 0.1455 | 0.3758 | 0.7515 | 0.7515 | nan | nan | 0.7515 | 0.0 | nan | 0.7515 |
+ | 0.1562 | 12.94 | 220 | 0.1330 | 0.2652 | 0.5304 | 0.5304 | nan | nan | 0.5304 | 0.0 | nan | 0.5304 |
+ | 0.1053 | 14.12 | 240 | 0.1145 | 0.3511 | 0.7022 | 0.7022 | nan | nan | 0.7022 | 0.0 | nan | 0.7022 |
+ | 0.1017 | 15.29 | 260 | 0.0989 | 0.3879 | 0.7757 | 0.7757 | nan | nan | 0.7757 | 0.0 | nan | 0.7757 |
+ | 0.0809 | 16.47 | 280 | 0.0859 | 0.2622 | 0.5243 | 0.5243 | nan | nan | 0.5243 | 0.0 | nan | 0.5243 |
+ | 0.0861 | 17.65 | 300 | 0.0761 | 0.3688 | 0.7375 | 0.7375 | nan | nan | 0.7375 | 0.0 | nan | 0.7375 |
+ | 0.0695 | 18.82 | 320 | 0.0720 | 0.3786 | 0.7572 | 0.7572 | nan | nan | 0.7572 | 0.0 | nan | 0.7572 |
+ | 0.0689 | 20.0 | 340 | 0.0646 | 0.3964 | 0.7927 | 0.7927 | nan | nan | 0.7927 | 0.0 | nan | 0.7927 |
+ | 0.0592 | 21.18 | 360 | 0.0657 | 0.3063 | 0.6126 | 0.6126 | nan | nan | 0.6126 | 0.0 | nan | 0.6126 |
+ | 0.0635 | 22.35 | 380 | 0.0581 | 0.3615 | 0.7230 | 0.7230 | nan | nan | 0.7230 | 0.0 | nan | 0.7230 |
+ | 0.0511 | 23.53 | 400 | 0.0526 | 0.3622 | 0.7245 | 0.7245 | nan | nan | 0.7245 | 0.0 | nan | 0.7245 |
+ | 0.0518 | 24.71 | 420 | 0.0543 | 0.3270 | 0.6540 | 0.6540 | nan | nan | 0.6540 | 0.0 | nan | 0.6540 |
+ | 0.0448 | 25.88 | 440 | 0.0522 | 0.4141 | 0.8282 | 0.8282 | nan | nan | 0.8282 | 0.0 | nan | 0.8282 |
+ | 0.0395 | 27.06 | 460 | 0.0470 | 0.3519 | 0.7038 | 0.7038 | nan | nan | 0.7038 | 0.0 | nan | 0.7038 |
+ | 0.04 | 28.24 | 480 | 0.0452 | 0.3870 | 0.7740 | 0.7740 | nan | nan | 0.7740 | 0.0 | nan | 0.7740 |
+ | 0.0386 | 29.41 | 500 | 0.0439 | 0.3801 | 0.7603 | 0.7603 | nan | nan | 0.7603 | 0.0 | nan | 0.7603 |
+ | 0.0421 | 30.59 | 520 | 0.0437 | 0.4047 | 0.8093 | 0.8093 | nan | nan | 0.8093 | 0.0 | nan | 0.8093 |
+ | 0.0356 | 31.76 | 540 | 0.0427 | 0.3675 | 0.7349 | 0.7349 | nan | nan | 0.7349 | 0.0 | nan | 0.7349 |
+ | 0.0368 | 32.94 | 560 | 0.0420 | 0.3604 | 0.7208 | 0.7208 | nan | nan | 0.7208 | 0.0 | nan | 0.7208 |
+ | 0.0368 | 34.12 | 580 | 0.0408 | 0.3589 | 0.7179 | 0.7179 | nan | nan | 0.7179 | 0.0 | nan | 0.7179 |
+ | 0.032 | 35.29 | 600 | 0.0395 | 0.3664 | 0.7329 | 0.7329 | nan | nan | 0.7329 | 0.0 | nan | 0.7329 |
+ | 0.03 | 36.47 | 620 | 0.0394 | 0.3691 | 0.7382 | 0.7382 | nan | nan | 0.7382 | 0.0 | nan | 0.7382 |
+ | 0.028 | 37.65 | 640 | 0.0383 | 0.3731 | 0.7462 | 0.7462 | nan | nan | 0.7462 | 0.0 | nan | 0.7462 |
+ | 0.0304 | 38.82 | 660 | 0.0376 | 0.3796 | 0.7592 | 0.7592 | nan | nan | 0.7592 | 0.0 | nan | 0.7592 |
+ | 0.0314 | 40.0 | 680 | 0.0382 | 0.3602 | 0.7204 | 0.7204 | nan | nan | 0.7204 | 0.0 | nan | 0.7204 |
+ | 0.0266 | 41.18 | 700 | 0.0385 | 0.3602 | 0.7203 | 0.7203 | nan | nan | 0.7203 | 0.0 | nan | 0.7203 |
+ | 0.0305 | 42.35 | 720 | 0.0375 | 0.3413 | 0.6827 | 0.6827 | nan | nan | 0.6827 | 0.0 | nan | 0.6827 |
+ | 0.0334 | 43.53 | 740 | 0.0366 | 0.3632 | 0.7263 | 0.7263 | nan | nan | 0.7263 | 0.0 | nan | 0.7263 |
+ | 0.0269 | 44.71 | 760 | 0.0359 | 0.3698 | 0.7396 | 0.7396 | nan | nan | 0.7396 | 0.0 | nan | 0.7396 |
+ | 0.0352 | 45.88 | 780 | 0.0364 | 0.3679 | 0.7359 | 0.7359 | nan | nan | 0.7359 | 0.0 | nan | 0.7359 |
+ | 0.0398 | 47.06 | 800 | 0.0366 | 0.3504 | 0.7008 | 0.7008 | nan | nan | 0.7008 | 0.0 | nan | 0.7008 |
+ | 0.0261 | 48.24 | 820 | 0.0361 | 0.3789 | 0.7578 | 0.7578 | nan | nan | 0.7578 | 0.0 | nan | 0.7578 |
+ | 0.0252 | 49.41 | 840 | 0.0360 | 0.3724 | 0.7448 | 0.7448 | nan | nan | 0.7448 | 0.0 | nan | 0.7448 |
 
 
  ### Framework versions
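The updated card describes a SegFormer (nvidia/mit-b0) checkpoint fine-tuned for semantic segmentation, now reporting metrics for `Tool` and `Wear` labels. For orientation only, here is a minimal inference sketch using the standard `transformers` SegFormer classes; the repository id is a placeholder (the commit does not name the model repo), and preprocessing is assumed to follow the stock SegFormer image processor.

```python
# Minimal inference sketch for a fine-tuned SegFormer (mit-b0) segmentation
# checkpoint like the one this card describes. The repo id is hypothetical --
# substitute the actual model repository.
import torch
from PIL import Image
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

model_id = "HorcruxNo13/segformer-b0-finetuned"  # placeholder repo id

processor = SegformerImageProcessor.from_pretrained(model_id)
model = SegformerForSemanticSegmentation.from_pretrained(model_id)
model.eval()

image = Image.open("example.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
pred_mask = upsampled.argmax(dim=1)[0]  # (H, W) map of predicted class indices
print(pred_mask.shape, pred_mask.unique())
```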