---
license: apache-2.0
base_model: microsoft/conditional-detr-resnet-50
tags:
  - generated_from_trainer
datasets:
  - dsi
model-index:
  - name: detr_finetunned_ocular
    results: []
---

# detr_finetunned_ocular

This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co/microsoft/conditional-detr-resnet-50) on the dsi dataset. It achieves the following results on the evaluation set:

- Loss: 1.0529
- Map: 0.3106
- Map 50: 0.5077
- Map 75: 0.3672
- Map Small: 0.3046
- Map Medium: 0.6881
- Map Large: -1.0
- Mar 1: 0.0994
- Mar 10: 0.3691
- Mar 100: 0.4215
- Mar Small: 0.4155
- Mar Medium: 0.7477
- Mar Large: -1.0
- Map Falciparum Trophozoite: 0.0214
- Mar 100 Falciparum Trophozoite: 0.1656
- Map Wbc: 0.5997
- Mar 100 Wbc: 0.6773
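For readers unfamiliar with the metric names: "Map 50" and "Map 75" are COCO-style average precision at IoU thresholds of 0.50 and 0.75, and the -1.0 entries for the "Large" buckets indicate the evaluation set contains no objects in that size range (the pycocotools convention for an empty bucket). A minimal, self-contained sketch of the IoU test behind those thresholds, with illustrative box coordinates:

```python
def box_iou(a, b):
    """Intersection-over-union of two boxes in [x_min, y_min, x_max, y_max] format."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# A prediction shifted 25 px off a 100x100 ground-truth box:
pred, gt = [25, 0, 125, 100], [0, 0, 100, 100]
iou = box_iou(pred, gt)     # 7500 / 12500 = 0.6
print(iou >= 0.50)          # a hit at the "Map 50" threshold
print(iou >= 0.75)          # a miss at the stricter "Map 75" threshold
```

So a detector can score well on Map 50 while losing points at Map 75 purely through loose localization, which matches the gap between the two columns above.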

## Model description

More information needed

## Intended uses & limitations

More information needed
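No usage example ships with this card, so the following is a hypothetical sketch of the post-processing a DETR-style detector's raw outputs need: per-query class scores (conditional DETR uses sigmoid scores rather than a softmax with a no-object class) and normalized (cx, cy, w, h) boxes converted to pixel corners. The logits, boxes, and the `postprocess` helper below are all illustrative, not this model's actual API; in practice the image processor's `post_process_object_detection` method in transformers does this for you.

```python
import math

# The two classes in the dsi dataset, per the metrics above.
ID2LABEL = {0: "Falciparum Trophozoite", 1: "Wbc"}

def postprocess(logits, boxes, img_w, img_h, threshold=0.5):
    """Keep queries whose best per-class sigmoid score clears `threshold`,
    converting normalized (cx, cy, w, h) boxes to pixel corner coordinates."""
    detections = []
    for per_class, (cx, cy, w, h) in zip(logits, boxes):
        scores = [1.0 / (1.0 + math.exp(-z)) for z in per_class]
        best = max(range(len(scores)), key=scores.__getitem__)
        if scores[best] < threshold:
            continue  # query predicts no confident object
        detections.append({
            "label": ID2LABEL[best],
            "score": scores[best],
            "box": [(cx - w / 2) * img_w, (cy - h / 2) * img_h,
                    (cx + w / 2) * img_w, (cy + h / 2) * img_h],
        })
    return detections

# Made-up outputs for two queries: the first is a confident class-0 hit,
# the second falls below the score threshold and is dropped.
dummy_logits = [[2.0, -3.0], [-4.0, -5.0]]
dummy_boxes = [(0.5, 0.5, 0.2, 0.2), (0.1, 0.1, 0.05, 0.05)]
print(postprocess(dummy_logits, dummy_boxes, img_w=640, img_h=480))
```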

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 30
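With `lr_scheduler_type: cosine` and no warmup steps reported, the learning rate presumably follows a standard cosine decay from 5e-05 toward zero over the 2,580 steps shown in the results table. A sketch of that shape (the `cosine_lr` helper is illustrative, not taken from the training script):

```python
import math

def cosine_lr(step, total_steps, base_lr=5e-5):
    """Cosine decay from base_lr at step 0 to 0 at total_steps,
    the curve a warmup-free cosine scheduler follows."""
    progress = step / total_steps
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * progress))

total = 2580  # 30 epochs x 86 steps/epoch, per the results table
print(cosine_lr(0, total))      # full 5e-05 at the start of training
print(cosine_lr(total, total))  # decayed to zero by the final step
```

The slow flattening of the curve near the end is consistent with the validation loss plateauing around 1.05 in the last few epochs.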

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Falciparum Trophozoite | Mar 100 Falciparum Trophozoite | Map Wbc | Mar 100 Wbc |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------------------:|:------------------------------:|:-------:|:-----------:|
| No log | 1.0 | 86 | 1.6663 | 0.1273 | 0.2547 | 0.1068 | 0.12 | 0.3504 | -1.0 | 0.0547 | 0.2421 | 0.3256 | 0.3213 | 0.6374 | -1.0 | 0.0008 | 0.065 | 0.2537 | 0.5862 |
| No log | 2.0 | 172 | 1.5286 | 0.1796 | 0.3765 | 0.1218 | 0.1738 | 0.4192 | -1.0 | 0.0651 | 0.247 | 0.3015 | 0.3051 | 0.4738 | -1.0 | 0.0012 | 0.0865 | 0.3581 | 0.5165 |
| No log | 3.0 | 258 | 1.3820 | 0.2265 | 0.4343 | 0.2063 | 0.2188 | 0.5554 | -1.0 | 0.0753 | 0.2934 | 0.3347 | 0.3267 | 0.6813 | -1.0 | 0.0016 | 0.0818 | 0.4513 | 0.5876 |
| No log | 4.0 | 344 | 1.3362 | 0.2148 | 0.4433 | 0.1633 | 0.206 | 0.5348 | -1.0 | 0.0719 | 0.2764 | 0.3262 | 0.3193 | 0.6262 | -1.0 | 0.004 | 0.1074 | 0.4256 | 0.5451 |
| No log | 5.0 | 430 | 1.2962 | 0.2558 | 0.4542 | 0.2703 | 0.2493 | 0.5953 | -1.0 | 0.0826 | 0.3145 | 0.3488 | 0.3417 | 0.6897 | -1.0 | 0.0029 | 0.092 | 0.5087 | 0.6055 |
| 1.6845 | 6.0 | 516 | 1.2470 | 0.2723 | 0.4597 | 0.3089 | 0.2662 | 0.6388 | -1.0 | 0.0882 | 0.3285 | 0.3762 | 0.3691 | 0.7234 | -1.0 | 0.0053 | 0.1119 | 0.5393 | 0.6405 |
| 1.6845 | 7.0 | 602 | 1.1917 | 0.2733 | 0.4681 | 0.3094 | 0.2677 | 0.6133 | -1.0 | 0.0882 | 0.3294 | 0.3767 | 0.3709 | 0.7019 | -1.0 | 0.0093 | 0.1204 | 0.5373 | 0.633 |
| 1.6845 | 8.0 | 688 | 1.2070 | 0.271 | 0.4732 | 0.2973 | 0.2641 | 0.6376 | -1.0 | 0.0913 | 0.3372 | 0.3799 | 0.3728 | 0.7112 | -1.0 | 0.0069 | 0.1333 | 0.5352 | 0.6265 |
| 1.6845 | 9.0 | 774 | 1.1872 | 0.2638 | 0.4778 | 0.2601 | 0.2566 | 0.649 | -1.0 | 0.0865 | 0.3286 | 0.3765 | 0.371 | 0.6907 | -1.0 | 0.0076 | 0.1272 | 0.52 | 0.6258 |
| 1.6845 | 10.0 | 860 | 1.1518 | 0.2767 | 0.4774 | 0.3047 | 0.2705 | 0.6371 | -1.0 | 0.0896 | 0.3389 | 0.3838 | 0.3791 | 0.6953 | -1.0 | 0.0067 | 0.1272 | 0.5467 | 0.6403 |
| 1.6845 | 11.0 | 946 | 1.1263 | 0.2846 | 0.4783 | 0.3251 | 0.2783 | 0.658 | -1.0 | 0.0915 | 0.3472 | 0.3884 | 0.3835 | 0.7131 | -1.0 | 0.006 | 0.1217 | 0.5632 | 0.6551 |
| 1.2411 | 12.0 | 1032 | 1.1376 | 0.2879 | 0.4859 | 0.323 | 0.2805 | 0.6685 | -1.0 | 0.0961 | 0.3494 | 0.3925 | 0.3858 | 0.7374 | -1.0 | 0.01 | 0.1266 | 0.5658 | 0.6583 |
| 1.2411 | 13.0 | 1118 | 1.1416 | 0.2834 | 0.4863 | 0.3144 | 0.2777 | 0.6417 | -1.0 | 0.0952 | 0.3447 | 0.3903 | 0.3865 | 0.6888 | -1.0 | 0.0113 | 0.137 | 0.5555 | 0.6436 |
| 1.2411 | 14.0 | 1204 | 1.1278 | 0.276 | 0.4831 | 0.3022 | 0.2711 | 0.6209 | -1.0 | 0.0905 | 0.3392 | 0.3877 | 0.3829 | 0.6879 | -1.0 | 0.0117 | 0.1436 | 0.5403 | 0.6318 |
| 1.2411 | 15.0 | 1290 | 1.1409 | 0.2707 | 0.495 | 0.2817 | 0.264 | 0.6219 | -1.0 | 0.0891 | 0.3353 | 0.3851 | 0.379 | 0.6991 | -1.0 | 0.0137 | 0.1442 | 0.5277 | 0.6261 |
| 1.2411 | 16.0 | 1376 | 1.0948 | 0.2904 | 0.4937 | 0.329 | 0.2846 | 0.6672 | -1.0 | 0.0965 | 0.3524 | 0.4003 | 0.3935 | 0.7346 | -1.0 | 0.0145 | 0.1468 | 0.5662 | 0.6539 |
| 1.2411 | 17.0 | 1462 | 1.1197 | 0.2851 | 0.4952 | 0.3222 | 0.2799 | 0.6439 | -1.0 | 0.0928 | 0.346 | 0.4017 | 0.3969 | 0.7047 | -1.0 | 0.0156 | 0.1554 | 0.5546 | 0.6479 |
| 1.1067 | 18.0 | 1548 | 1.0893 | 0.2967 | 0.4986 | 0.3418 | 0.2901 | 0.6747 | -1.0 | 0.0959 | 0.354 | 0.4052 | 0.3995 | 0.729 | -1.0 | 0.0149 | 0.1483 | 0.5786 | 0.6621 |
| 1.1067 | 19.0 | 1634 | 1.0634 | 0.2982 | 0.4997 | 0.3448 | 0.2918 | 0.6726 | -1.0 | 0.0981 | 0.3605 | 0.4124 | 0.4072 | 0.7271 | -1.0 | 0.0176 | 0.1595 | 0.5789 | 0.6654 |
| 1.1067 | 20.0 | 1720 | 1.0817 | 0.2984 | 0.5033 | 0.3386 | 0.2941 | 0.6425 | -1.0 | 0.0951 | 0.3557 | 0.4077 | 0.4028 | 0.7206 | -1.0 | 0.0172 | 0.1538 | 0.5796 | 0.6617 |
| 1.1067 | 21.0 | 1806 | 1.0688 | 0.3004 | 0.4941 | 0.3493 | 0.2959 | 0.6539 | -1.0 | 0.0979 | 0.3631 | 0.4109 | 0.4075 | 0.7084 | -1.0 | 0.0164 | 0.1534 | 0.5845 | 0.6685 |
| 1.1067 | 22.0 | 1892 | 1.0602 | 0.3052 | 0.5043 | 0.3517 | 0.3006 | 0.6705 | -1.0 | 0.0979 | 0.3651 | 0.4174 | 0.4127 | 0.7271 | -1.0 | 0.0209 | 0.163 | 0.5895 | 0.6719 |
| 1.1067 | 23.0 | 1978 | 1.0534 | 0.3061 | 0.501 | 0.3603 | 0.301 | 0.6674 | -1.0 | 0.0965 | 0.3646 | 0.4157 | 0.4106 | 0.7364 | -1.0 | 0.0199 | 0.1552 | 0.5923 | 0.6762 |
| 1.0314 | 24.0 | 2064 | 1.0490 | 0.3069 | 0.5057 | 0.3602 | 0.3012 | 0.6797 | -1.0 | 0.0983 | 0.3678 | 0.42 | 0.4142 | 0.7439 | -1.0 | 0.0188 | 0.1642 | 0.5951 | 0.6758 |
| 1.0314 | 25.0 | 2150 | 1.0600 | 0.3072 | 0.51 | 0.3643 | 0.3008 | 0.6851 | -1.0 | 0.0976 | 0.3673 | 0.4212 | 0.4157 | 0.7411 | -1.0 | 0.0203 | 0.1658 | 0.594 | 0.6765 |
| 1.0314 | 26.0 | 2236 | 1.0578 | 0.3092 | 0.5073 | 0.3689 | 0.3033 | 0.6886 | -1.0 | 0.099 | 0.3678 | 0.4228 | 0.4172 | 0.743 | -1.0 | 0.0215 | 0.1683 | 0.597 | 0.6773 |
| 1.0314 | 27.0 | 2322 | 1.0551 | 0.3102 | 0.5093 | 0.3679 | 0.3046 | 0.6894 | -1.0 | 0.0993 | 0.3685 | 0.4223 | 0.4162 | 0.7505 | -1.0 | 0.0215 | 0.1667 | 0.599 | 0.6779 |
| 1.0314 | 28.0 | 2408 | 1.0531 | 0.3107 | 0.5105 | 0.369 | 0.305 | 0.6888 | -1.0 | 0.0996 | 0.3685 | 0.4216 | 0.4154 | 0.7495 | -1.0 | 0.0222 | 0.1658 | 0.5992 | 0.6773 |
| 1.0314 | 29.0 | 2494 | 1.0528 | 0.3104 | 0.5077 | 0.3676 | 0.3045 | 0.6868 | -1.0 | 0.0992 | 0.3694 | 0.4211 | 0.4152 | 0.7467 | -1.0 | 0.0214 | 0.165 | 0.5994 | 0.6771 |
| 0.9801 | 30.0 | 2580 | 1.0529 | 0.3106 | 0.5077 | 0.3672 | 0.3046 | 0.6881 | -1.0 | 0.0994 | 0.3691 | 0.4215 | 0.4155 | 0.7477 | -1.0 | 0.0214 | 0.1656 | 0.5997 | 0.6773 |

### Framework versions

- Transformers 4.42.3
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1