# MT-fallen-firebrand-113
This model is a fine-tuned version of toobiza/MT-smart-feather-100 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.2177
- Loss Ce: 0.0007
- Loss Bbox: 0.0252
- Cardinality Error: 1.0
- Giou: 95.5116
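The Giou value reported above is the generalized IoU between predicted and ground-truth boxes, apparently scaled to 0–100 rather than the usual [-1, 1] range. As a reference for readers, here is a minimal sketch of GIoU for a single pair of axis-aligned boxes; the `(x1, y1, x2, y2)` corner format is an assumption for illustration, not taken from the card:

```python
def giou(a, b):
    """Generalized IoU for two axis-aligned boxes given as (x1, y1, x2, y2).

    Assumes non-degenerate boxes (x2 > x1, y2 > y1). Returns a value in
    [-1, 1]: 1 for identical boxes, negative for distant disjoint boxes.
    """
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b

    # Intersection area (zero if the boxes do not overlap).
    inter_w = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    inter_h = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = inter_w * inter_h

    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    iou = inter / union

    # Smallest enclosing box C; the GIoU penalty grows with the empty
    # space in C that is covered by neither box.
    cx1, cy1 = min(ax1, bx1), min(ay1, by1)
    cx2, cy2 = max(ax2, bx2), max(ay2, by2)
    area_c = (cx2 - cx1) * (cy2 - cy1)

    return iou - (area_c - union) / area_c
```

For example, two identical unit boxes give a GIoU of 1.0 (95.51 on the card's 0–100 scale would correspond to 0.9551).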
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3
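With `lr_scheduler_type: linear` and no warmup steps listed, the learning rate presumably decays linearly from 1e-05 down to 0 over the run. A minimal sketch of that schedule follows; the total of 1410 steps (roughly 470 steps per epoch × 3 epochs, inferred from the results table below) and the zero-warmup assumption are mine, not stated in the card:

```python
def linear_lr(step, base_lr=1e-05, total_steps=1410, warmup_steps=0):
    """Linear schedule: optional linear warmup, then linear decay to 0."""
    if step < warmup_steps:
        # Ramp up from 0 to base_lr over the warmup phase.
        return base_lr * step / max(1, warmup_steps)
    # Decay linearly so the rate reaches 0 at total_steps.
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

This mirrors the behavior of `transformers.get_linear_schedule_with_warmup`: halfway through training the rate is half the base value, and it hits zero on the final step.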
### Training results

| Training Loss | Epoch | Step | Validation Loss | Loss Ce | Loss Bbox | Cardinality Error | Giou |
|---|---|---|---|---|---|---|---|
0.0433 | 0.02 | 10 | 0.2285 | 0.0005 | 0.0272 | 1.0 | 95.4320 |
0.1123 | 0.04 | 20 | 0.2205 | 0.0006 | 0.0264 | 1.0 | 95.6513 |
0.2672 | 0.06 | 30 | 0.2229 | 0.0007 | 0.0264 | 1.0 | 95.5263 |
0.1671 | 0.09 | 40 | 0.2266 | 0.0006 | 0.0267 | 1.0 | 95.4210 |
0.0761 | 0.11 | 50 | 0.2268 | 0.0006 | 0.0272 | 1.0 | 95.5399 |
0.0882 | 0.13 | 60 | 0.2234 | 0.0006 | 0.0269 | 1.0 | 95.6263 |
0.0525 | 0.15 | 70 | 0.2277 | 0.0006 | 0.0273 | 1.0 | 95.5031 |
0.0789 | 0.17 | 80 | 0.2277 | 0.0006 | 0.0271 | 1.0 | 95.4728 |
0.0661 | 0.19 | 90 | 0.2315 | 0.0006 | 0.0270 | 1.0 | 95.2434 |
0.0534 | 0.21 | 100 | 0.2304 | 0.0006 | 0.0270 | 1.0 | 95.2914 |
0.133 | 0.23 | 110 | 0.2318 | 0.0007 | 0.0268 | 1.0 | 95.2015 |
0.0908 | 0.26 | 120 | 0.2381 | 0.0006 | 0.0282 | 1.0 | 95.2005 |
0.0893 | 0.28 | 130 | 0.2388 | 0.0005 | 0.0279 | 1.0 | 95.1057 |
0.0516 | 0.3 | 140 | 0.2389 | 0.0005 | 0.0279 | 1.0 | 95.1262 |
0.2496 | 0.32 | 150 | 0.2378 | 0.0005 | 0.0277 | 1.0 | 95.1056 |
0.0914 | 0.34 | 160 | 0.2278 | 0.0005 | 0.0262 | 1.0 | 95.2652 |
0.1489 | 0.36 | 170 | 0.2206 | 0.0006 | 0.0261 | 1.0 | 95.5911 |
0.2304 | 0.38 | 180 | 0.2192 | 0.0007 | 0.0260 | 1.0 | 95.6501 |
0.058 | 0.41 | 190 | 0.2187 | 0.0007 | 0.0257 | 1.0 | 95.6025 |
0.2195 | 0.43 | 200 | 0.2179 | 0.0006 | 0.0257 | 1.0 | 95.6214 |
0.076 | 0.45 | 210 | 0.2188 | 0.0006 | 0.0256 | 1.0 | 95.5425 |
0.054 | 0.47 | 220 | 0.2206 | 0.0006 | 0.0256 | 1.0 | 95.4653 |
0.1074 | 0.49 | 230 | 0.2260 | 0.0006 | 0.0259 | 1.0 | 95.2627 |
0.0648 | 0.51 | 240 | 0.2270 | 0.0006 | 0.0268 | 1.0 | 95.4231 |
0.2838 | 0.53 | 250 | 0.2227 | 0.0006 | 0.0263 | 1.0 | 95.5340 |
0.1706 | 0.55 | 260 | 0.2238 | 0.0006 | 0.0260 | 1.0 | 95.4056 |
0.065 | 0.58 | 270 | 0.2260 | 0.0007 | 0.0258 | 1.0 | 95.2666 |
0.063 | 0.6 | 280 | 0.2238 | 0.0007 | 0.0262 | 1.0 | 95.4676 |
0.0667 | 0.62 | 290 | 0.2206 | 0.0006 | 0.0262 | 1.0 | 95.6224 |
0.087 | 0.64 | 300 | 0.2238 | 0.0006 | 0.0259 | 1.0 | 95.3894 |
0.1819 | 0.66 | 310 | 0.2290 | 0.0006 | 0.0261 | 1.0 | 95.1798 |
0.1198 | 0.68 | 320 | 0.2251 | 0.0007 | 0.0264 | 1.0 | 95.4342 |
0.0732 | 0.7 | 330 | 0.2203 | 0.0007 | 0.0259 | 1.0 | 95.5583 |
0.1283 | 0.72 | 340 | 0.2255 | 0.0008 | 0.0258 | 1.0 | 95.2776 |
0.1481 | 0.75 | 350 | 0.2243 | 0.0008 | 0.0259 | 1.0 | 95.3665 |
0.1448 | 0.77 | 360 | 0.2199 | 0.0008 | 0.0259 | 1.0 | 95.5715 |
0.0831 | 0.79 | 370 | 0.2228 | 0.0008 | 0.0261 | 1.0 | 95.5002 |
0.0936 | 0.81 | 380 | 0.2251 | 0.0007 | 0.0260 | 1.0 | 95.3486 |
0.0715 | 0.83 | 390 | 0.2204 | 0.0007 | 0.0257 | 1.0 | 95.5102 |
0.0672 | 0.85 | 400 | 0.2234 | 0.0007 | 0.0256 | 1.0 | 95.3418 |
0.0713 | 0.87 | 410 | 0.2233 | 0.0007 | 0.0259 | 1.0 | 95.3986 |
0.0776 | 0.9 | 420 | 0.2269 | 0.0007 | 0.0259 | 1.0 | 95.2251 |
0.129 | 0.92 | 430 | 0.2278 | 0.0007 | 0.0259 | 1.0 | 95.1863 |
0.1938 | 0.94 | 440 | 0.2315 | 0.0007 | 0.0262 | 1.0 | 95.0673 |
0.0841 | 0.96 | 450 | 0.2271 | 0.0007 | 0.0258 | 1.0 | 95.1982 |
0.1348 | 0.98 | 460 | 0.2233 | 0.0008 | 0.0258 | 1.0 | 95.4038 |
0.0668 | 1.0 | 470 | 0.2243 | 0.0008 | 0.0257 | 1.0 | 95.3235 |
0.3109 | 1.02 | 480 | 0.2278 | 0.0008 | 0.0258 | 1.0 | 95.1576 |
0.069 | 1.04 | 490 | 0.2312 | 0.0007 | 0.0257 | 1.0 | 94.9736 |
0.0627 | 1.07 | 500 | 0.2284 | 0.0007 | 0.0254 | 1.0 | 95.0170 |
0.0852 | 1.09 | 510 | 0.2291 | 0.0006 | 0.0252 | 1.0 | 94.9385 |
0.0679 | 1.11 | 520 | 0.2259 | 0.0006 | 0.0253 | 1.0 | 95.1204 |
0.0543 | 1.13 | 530 | 0.2274 | 0.0006 | 0.0251 | 1.0 | 94.9929 |
0.1877 | 1.15 | 540 | 0.2248 | 0.0007 | 0.0249 | 1.0 | 95.0927 |
0.1028 | 1.17 | 550 | 0.2177 | 0.0007 | 0.0251 | 1.0 | 95.4758 |
0.068 | 1.19 | 560 | 0.2142 | 0.0007 | 0.0249 | 1.0 | 95.6181 |
0.0953 | 1.22 | 570 | 0.2120 | 0.0007 | 0.0248 | 1.0 | 95.6968 |
0.0548 | 1.24 | 580 | 0.2160 | 0.0007 | 0.0249 | 1.0 | 95.5313 |
0.2291 | 1.26 | 590 | 0.2192 | 0.0007 | 0.0252 | 1.0 | 95.4303 |
0.0565 | 1.28 | 600 | 0.2148 | 0.0007 | 0.0246 | 1.0 | 95.5175 |
0.0643 | 1.3 | 610 | 0.2165 | 0.0007 | 0.0248 | 1.0 | 95.4779 |
0.0527 | 1.32 | 620 | 0.2134 | 0.0007 | 0.0244 | 1.0 | 95.5284 |
0.1215 | 1.34 | 630 | 0.2106 | 0.0007 | 0.0240 | 1.0 | 95.5701 |
0.0851 | 1.36 | 640 | 0.2148 | 0.0007 | 0.0247 | 1.0 | 95.5315 |
0.0748 | 1.39 | 650 | 0.2188 | 0.0007 | 0.0252 | 1.0 | 95.4600 |
0.0632 | 1.41 | 660 | 0.2215 | 0.0007 | 0.0254 | 1.0 | 95.3640 |
0.0634 | 1.43 | 670 | 0.2232 | 0.0007 | 0.0255 | 1.0 | 95.3053 |
0.0851 | 1.45 | 680 | 0.2235 | 0.0006 | 0.0256 | 1.0 | 95.3000 |
0.0597 | 1.47 | 690 | 0.2250 | 0.0006 | 0.0255 | 1.0 | 95.2242 |
0.0522 | 1.49 | 700 | 0.2252 | 0.0007 | 0.0259 | 1.0 | 95.3196 |
0.1491 | 1.51 | 710 | 0.2250 | 0.0007 | 0.0260 | 1.0 | 95.3516 |
0.0515 | 1.54 | 720 | 0.2205 | 0.0007 | 0.0254 | 1.0 | 95.4144 |
0.1826 | 1.56 | 730 | 0.2199 | 0.0007 | 0.0254 | 1.0 | 95.4645 |
0.0529 | 1.58 | 740 | 0.2230 | 0.0007 | 0.0254 | 1.0 | 95.2972 |
0.0587 | 1.6 | 750 | 0.2229 | 0.0007 | 0.0256 | 1.0 | 95.3546 |
0.0589 | 1.62 | 760 | 0.2216 | 0.0007 | 0.0256 | 1.0 | 95.4016 |
0.0715 | 1.64 | 770 | 0.2216 | 0.0006 | 0.0254 | 1.0 | 95.3633 |
0.1186 | 1.66 | 780 | 0.2196 | 0.0006 | 0.0254 | 1.0 | 95.4553 |
0.0498 | 1.68 | 790 | 0.2179 | 0.0006 | 0.0255 | 1.0 | 95.5713 |
0.1199 | 1.71 | 800 | 0.2192 | 0.0006 | 0.0257 | 1.0 | 95.5583 |
0.2151 | 1.73 | 810 | 0.2196 | 0.0006 | 0.0257 | 1.0 | 95.5410 |
0.1703 | 1.75 | 820 | 0.2211 | 0.0006 | 0.0259 | 1.0 | 95.4973 |
0.0782 | 1.77 | 830 | 0.2244 | 0.0006 | 0.0259 | 1.0 | 95.3385 |
0.0575 | 1.79 | 840 | 0.2258 | 0.0006 | 0.0262 | 1.0 | 95.3399 |
0.1211 | 1.81 | 850 | 0.2239 | 0.0006 | 0.0261 | 1.0 | 95.4278 |
0.067 | 1.83 | 860 | 0.2232 | 0.0006 | 0.0261 | 1.0 | 95.4541 |
0.0521 | 1.86 | 870 | 0.2214 | 0.0006 | 0.0258 | 1.0 | 95.4654 |
0.1619 | 1.88 | 880 | 0.2206 | 0.0006 | 0.0256 | 1.0 | 95.4628 |
0.0656 | 1.9 | 890 | 0.2217 | 0.0006 | 0.0257 | 1.0 | 95.4210 |
0.1299 | 1.92 | 900 | 0.2216 | 0.0006 | 0.0257 | 1.0 | 95.4406 |
0.1795 | 1.94 | 910 | 0.2258 | 0.0006 | 0.0260 | 1.0 | 95.2944 |
0.0684 | 1.96 | 920 | 0.2288 | 0.0006 | 0.0257 | 1.0 | 95.0962 |
0.0683 | 1.98 | 930 | 0.2287 | 0.0006 | 0.0254 | 1.0 | 95.0161 |
0.1629 | 2.0 | 940 | 0.2300 | 0.0006 | 0.0255 | 1.0 | 94.9721 |
0.2252 | 2.03 | 950 | 0.2283 | 0.0006 | 0.0256 | 1.0 | 95.0778 |
0.097 | 2.05 | 960 | 0.2246 | 0.0006 | 0.0255 | 1.0 | 95.2448 |
0.0582 | 2.07 | 970 | 0.2247 | 0.0007 | 0.0253 | 1.0 | 95.2004 |
0.0613 | 2.09 | 980 | 0.2189 | 0.0007 | 0.0251 | 1.0 | 95.4246 |
0.055 | 2.11 | 990 | 0.2192 | 0.0007 | 0.0252 | 1.0 | 95.4337 |
0.1202 | 2.13 | 1000 | 0.2233 | 0.0007 | 0.0254 | 1.0 | 95.2810 |
0.0426 | 2.15 | 1010 | 0.2266 | 0.0006 | 0.0255 | 1.0 | 95.1437 |
0.0542 | 2.17 | 1020 | 0.2264 | 0.0006 | 0.0255 | 1.0 | 95.1477 |
0.0693 | 2.2 | 1030 | 0.2275 | 0.0006 | 0.0256 | 1.0 | 95.1170 |
0.0617 | 2.22 | 1040 | 0.2235 | 0.0006 | 0.0253 | 1.0 | 95.2543 |
0.0703 | 2.24 | 1050 | 0.2223 | 0.0007 | 0.0252 | 1.0 | 95.2883 |
0.1624 | 2.26 | 1060 | 0.2231 | 0.0007 | 0.0252 | 1.0 | 95.2493 |
0.1168 | 2.28 | 1070 | 0.2235 | 0.0007 | 0.0252 | 1.0 | 95.2330 |
0.0784 | 2.3 | 1080 | 0.2207 | 0.0007 | 0.0252 | 1.0 | 95.3759 |
0.0627 | 2.32 | 1090 | 0.2205 | 0.0007 | 0.0253 | 1.0 | 95.4080 |
0.0856 | 2.35 | 1100 | 0.2202 | 0.0007 | 0.0254 | 1.0 | 95.4356 |
0.0587 | 2.37 | 1110 | 0.2198 | 0.0007 | 0.0254 | 1.0 | 95.4616 |
0.0591 | 2.39 | 1120 | 0.2181 | 0.0007 | 0.0253 | 1.0 | 95.5062 |
0.3139 | 2.41 | 1130 | 0.2149 | 0.0007 | 0.0247 | 1.0 | 95.5317 |
0.0535 | 2.43 | 1140 | 0.2124 | 0.0007 | 0.0244 | 1.0 | 95.5952 |
0.0502 | 2.45 | 1150 | 0.2135 | 0.0007 | 0.0245 | 1.0 | 95.5379 |
0.0574 | 2.47 | 1160 | 0.2142 | 0.0007 | 0.0247 | 1.0 | 95.5711 |
0.0415 | 2.49 | 1170 | 0.2143 | 0.0007 | 0.0249 | 1.0 | 95.6145 |
0.0394 | 2.52 | 1180 | 0.2141 | 0.0008 | 0.0249 | 1.0 | 95.6154 |
0.1392 | 2.54 | 1190 | 0.2128 | 0.0008 | 0.0249 | 1.0 | 95.6786 |
0.1257 | 2.56 | 1200 | 0.2130 | 0.0007 | 0.0249 | 1.0 | 95.6791 |
0.0493 | 2.58 | 1210 | 0.2149 | 0.0007 | 0.0249 | 1.0 | 95.5834 |
0.1574 | 2.6 | 1220 | 0.2173 | 0.0007 | 0.0250 | 1.0 | 95.4966 |
0.077 | 2.62 | 1230 | 0.2183 | 0.0008 | 0.0251 | 1.0 | 95.4721 |
0.0616 | 2.64 | 1240 | 0.2172 | 0.0008 | 0.0252 | 1.0 | 95.5329 |
0.0515 | 2.67 | 1250 | 0.2169 | 0.0008 | 0.0252 | 1.0 | 95.5528 |
0.0464 | 2.69 | 1260 | 0.2158 | 0.0008 | 0.0252 | 1.0 | 95.6043 |
0.1273 | 2.71 | 1270 | 0.2151 | 0.0008 | 0.0252 | 1.0 | 95.6373 |
0.0602 | 2.73 | 1280 | 0.2157 | 0.0008 | 0.0252 | 1.0 | 95.6047 |
0.1282 | 2.75 | 1290 | 0.2166 | 0.0008 | 0.0251 | 1.0 | 95.5521 |
0.0552 | 2.77 | 1300 | 0.2172 | 0.0008 | 0.0252 | 1.0 | 95.5391 |
0.1731 | 2.79 | 1310 | 0.2176 | 0.0007 | 0.0252 | 1.0 | 95.5261 |
0.0999 | 2.81 | 1320 | 0.2169 | 0.0007 | 0.0252 | 1.0 | 95.5514 |
0.1018 | 2.84 | 1330 | 0.2169 | 0.0007 | 0.0252 | 1.0 | 95.5517 |
0.1837 | 2.86 | 1340 | 0.2172 | 0.0007 | 0.0252 | 1.0 | 95.5448 |
0.0582 | 2.88 | 1350 | 0.2165 | 0.0007 | 0.0252 | 1.0 | 95.5744 |
0.0671 | 2.9 | 1360 | 0.2171 | 0.0008 | 0.0252 | 1.0 | 95.5425 |
0.0451 | 2.92 | 1370 | 0.2176 | 0.0007 | 0.0252 | 1.0 | 95.5208 |
0.0829 | 2.94 | 1380 | 0.2173 | 0.0007 | 0.0252 | 1.0 | 95.5369 |
0.0575 | 2.96 | 1390 | 0.2176 | 0.0007 | 0.0252 | 1.0 | 95.5150 |
0.0926 | 2.99 | 1400 | 0.2177 | 0.0007 | 0.0252 | 1.0 | 95.5116 |
### Framework versions
- Transformers 4.33.2
- Pytorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.13.3
## Model tree for AmineAllo/MT-fallen-firebrand-113

Base model: AmineAllo/MT-smart-feather-100