---
license: apache-2.0
base_model: microsoft/conditional-detr-resnet-50
tags:
- generated_from_trainer
datasets:
- dsi
model-index:
- name: detr_finetunned_ocular
  results: []
---


# detr_finetunned_ocular

This model is a fine-tuned version of [microsoft/conditional-detr-resnet-50](https://huggingface.co./microsoft/conditional-detr-resnet-50) on the dsi dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0598
- Map: 0.3166
- Map 50: 0.5255
- Map 75: 0.3725
- Map Small: 0.3115
- Map Medium: 0.6744
- Map Large: -1.0
- Mar 1: 0.1043
- Mar 10: 0.3801
- Mar 100: 0.4224
- Mar Small: 0.4186
- Mar Medium: 0.7234
- Mar Large: -1.0
- Map Falciparum Trophozoite: 0.0341
- Mar 100 Falciparum Trophozoite: 0.1663
- Map Wbc: 0.599
- Mar 100 Wbc: 0.6785
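
The metrics above are COCO-style average precision and recall, where "Map 50" and "Map 75" score a predicted box as correct when its intersection-over-union (IoU) with a ground-truth box reaches 0.5 or 0.75 respectively. As a minimal sketch of the underlying IoU computation for axis-aligned boxes in `(xmin, ymin, xmax, ymax)` format:

```python
def box_iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (xmin, ymin, xmax, ymax)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# This pair overlaps by exactly half of the union area, so it would count as a
# match at the 0.5 threshold ("Map 50") but not at 0.75 ("Map 75").
print(box_iou((0, 0, 10, 10), (0, 0, 10, 5)))  # 0.5
```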

## Model description

This checkpoint fine-tunes Conditional DETR, a DETR variant whose conditional cross-attention speeds up convergence, with a ResNet-50 backbone on the dsi dataset. As reflected in the per-class evaluation metrics, it detects two object classes: falciparum trophozoites and white blood cells (WBC).

## Intended uses & limitations

No intended-use statement was provided. Note the large per-class gap in the evaluation results: the model detects WBCs reasonably well (AP 0.599) but performs poorly on falciparum trophozoites (AP 0.0341), so it should not be relied on for trophozoite detection without further work.
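
As a sketch of how a checkpoint like this could be used for inference with the `transformers` object-detection API (the repo id and image file name below are placeholders, not identifiers published by this card):

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

def detect(image_path: str, repo_id: str, threshold: float = 0.5):
    """Run the detector on one image and return (label, score, box) tuples."""
    processor = AutoImageProcessor.from_pretrained(repo_id)
    model = AutoModelForObjectDetection.from_pretrained(repo_id)

    image = Image.open(image_path).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)

    # Rescale the normalized predictions back to the original image size (h, w).
    target_sizes = torch.tensor([image.size[::-1]])
    results = processor.post_process_object_detection(
        outputs, threshold=threshold, target_sizes=target_sizes
    )[0]
    return [
        (model.config.id2label[label.item()], score.item(), box.tolist())
        for score, label, box in zip(
            results["scores"], results["labels"], results["boxes"]
        )
    ]

# Example call (placeholder repo id and file name):
# detections = detect("blood_smear.png", "your-username/detr_finetunned_ocular")
```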

## Training and evaluation data

The model was trained and evaluated on the dsi dataset. With a train batch size of 8 and 86 optimizer steps per epoch (see the results table below), the training split contains roughly 688 images. No further details about the dataset were provided.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- num_epochs: 30
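
With a cosine scheduler and no warmup (an assumption; the card does not list warmup steps), the learning rate decays from 5e-05 toward zero over the 2580 total optimizer steps shown in the table below (30 epochs × 86 steps). A minimal sketch of that decay curve:

```python
import math

def cosine_lr(step: int, total_steps: int, base_lr: float = 5e-5) -> float:
    """Cosine-annealed learning rate with no warmup: base_lr -> 0 over total_steps."""
    progress = step / total_steps
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

total = 2580  # 30 epochs x 86 optimizer steps per epoch, as in the table below
print(cosine_lr(0, total))           # full base rate (5e-05) at the start
print(cosine_lr(total // 2, total))  # half the base rate at the midpoint
print(cosine_lr(total, total))       # decays to 0 at the final step
```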

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map    | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1  | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Falciparum Trophozoite | Mar 100 Falciparum Trophozoite | Map Wbc | Mar 100 Wbc |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:--------------------------:|:------------------------------:|:-------:|:-----------:|
| No log        | 1.0   | 86   | 1.1493          | 0.2788 | 0.5002 | 0.3024 | 0.274     | 0.6184     | -1.0      | 0.0918 | 0.3473 | 0.386   | 0.3823    | 0.6785     | -1.0      | 0.0196                     | 0.1372                         | 0.5381  | 0.6348      |
| No log        | 2.0   | 172  | 1.1199          | 0.2924 | 0.5063 | 0.3371 | 0.2866    | 0.6545     | -1.0      | 0.0924 | 0.3509 | 0.3873  | 0.3805    | 0.729      | -1.0      | 0.0204                     | 0.1264                         | 0.5644  | 0.6483      |
| No log        | 3.0   | 258  | 1.1616          | 0.2802 | 0.4941 | 0.3138 | 0.2746    | 0.6231     | -1.0      | 0.0891 | 0.3378 | 0.377   | 0.374     | 0.6598     | -1.0      | 0.0167                     | 0.129                          | 0.5438  | 0.6249      |
| No log        | 4.0   | 344  | 1.1263          | 0.3014 | 0.517  | 0.3393 | 0.296     | 0.6609     | -1.0      | 0.0981 | 0.3588 | 0.3857  | 0.3806    | 0.7187     | -1.0      | 0.0258                     | 0.1135                         | 0.577   | 0.6579      |
| No log        | 5.0   | 430  | 1.1219          | 0.2801 | 0.5117 | 0.297  | 0.2734    | 0.6555     | -1.0      | 0.0905 | 0.3458 | 0.3795  | 0.3737    | 0.7028     | -1.0      | 0.0218                     | 0.1254                         | 0.5385  | 0.6337      |
| 1.0831        | 6.0   | 516  | 1.1299          | 0.2646 | 0.485  | 0.2705 | 0.2581    | 0.6103     | -1.0      | 0.0885 | 0.3294 | 0.371   | 0.3636    | 0.7        | -1.0      | 0.0117                     | 0.1288                         | 0.5175  | 0.6131      |
| 1.0831        | 7.0   | 602  | 1.1003          | 0.2934 | 0.5064 | 0.3254 | 0.286     | 0.6706     | -1.0      | 0.0933 | 0.357  | 0.3962  | 0.3905    | 0.7206     | -1.0      | 0.0207                     | 0.1397                         | 0.5661  | 0.6528      |
| 1.0831        | 8.0   | 688  | 1.1063          | 0.2945 | 0.5028 | 0.3407 | 0.2871    | 0.6606     | -1.0      | 0.0946 | 0.3568 | 0.3975  | 0.3938    | 0.6925     | -1.0      | 0.0243                     | 0.1454                         | 0.5647  | 0.6495      |
| 1.0831        | 9.0   | 774  | 1.1364          | 0.2824 | 0.4979 | 0.3114 | 0.2774    | 0.622      | -1.0      | 0.0928 | 0.3445 | 0.3844  | 0.3818    | 0.671      | -1.0      | 0.017                      | 0.1297                         | 0.5479  | 0.6392      |
| 1.0831        | 10.0  | 860  | 1.0997          | 0.2904 | 0.501  | 0.3299 | 0.2841    | 0.6483     | -1.0      | 0.0908 | 0.3515 | 0.3917  | 0.387     | 0.7065     | -1.0      | 0.02                       | 0.1329                         | 0.5609  | 0.6505      |
| 1.0831        | 11.0  | 946  | 1.1198          | 0.2826 | 0.496  | 0.3186 | 0.277     | 0.6299     | -1.0      | 0.0915 | 0.3426 | 0.3822  | 0.3778    | 0.6832     | -1.0      | 0.0225                     | 0.1342                         | 0.5427  | 0.6303      |
| 1.0585        | 12.0  | 1032 | 1.0999          | 0.2921 | 0.5038 | 0.3196 | 0.2867    | 0.6334     | -1.0      | 0.0953 | 0.3556 | 0.3954  | 0.3916    | 0.6897     | -1.0      | 0.0244                     | 0.1454                         | 0.5599  | 0.6453      |
| 1.0585        | 13.0  | 1118 | 1.1097          | 0.2966 | 0.5183 | 0.3365 | 0.29      | 0.6667     | -1.0      | 0.0995 | 0.3549 | 0.3983  | 0.3923    | 0.7178     | -1.0      | 0.0297                     | 0.1493                         | 0.5636  | 0.6472      |
| 1.0585        | 14.0  | 1204 | 1.0932          | 0.2964 | 0.5113 | 0.335  | 0.2913    | 0.6494     | -1.0      | 0.0969 | 0.3556 | 0.396   | 0.391     | 0.7037     | -1.0      | 0.0279                     | 0.1474                         | 0.565   | 0.6447      |
| 1.0585        | 15.0  | 1290 | 1.0951          | 0.2958 | 0.5173 | 0.3287 | 0.2915    | 0.6321     | -1.0      | 0.0969 | 0.3596 | 0.4018  | 0.3962    | 0.7187     | -1.0      | 0.0321                     | 0.1505                         | 0.5595  | 0.653       |
| 1.0585        | 16.0  | 1376 | 1.1036          | 0.3048 | 0.5215 | 0.3482 | 0.2997    | 0.6588     | -1.0      | 0.102  | 0.3633 | 0.4029  | 0.3974    | 0.7234     | -1.0      | 0.0321                     | 0.1481                         | 0.5775  | 0.6578      |
| 1.0585        | 17.0  | 1462 | 1.0973          | 0.2997 | 0.5169 | 0.3437 | 0.2943    | 0.6445     | -1.0      | 0.1006 | 0.3589 | 0.3994  | 0.3963    | 0.6907     | -1.0      | 0.0306                     | 0.1448                         | 0.5688  | 0.654       |
| 0.968         | 18.0  | 1548 | 1.1322          | 0.3029 | 0.5193 | 0.3482 | 0.2973    | 0.6525     | -1.0      | 0.0983 | 0.3636 | 0.4015  | 0.3977    | 0.6991     | -1.0      | 0.032                      | 0.1499                         | 0.5738  | 0.6532      |
| 0.968         | 19.0  | 1634 | 1.0698          | 0.3049 | 0.5114 | 0.3471 | 0.2989    | 0.6757     | -1.0      | 0.0992 | 0.3665 | 0.4138  | 0.4084    | 0.729      | -1.0      | 0.0288                     | 0.1634                         | 0.581   | 0.6643      |
| 0.968         | 20.0  | 1720 | 1.0780          | 0.3093 | 0.516  | 0.3556 | 0.3036    | 0.6647     | -1.0      | 0.101  | 0.3694 | 0.4145  | 0.4096    | 0.7252     | -1.0      | 0.0299                     | 0.1618                         | 0.5887  | 0.6673      |
| 0.968         | 21.0  | 1806 | 1.0825          | 0.3044 | 0.522  | 0.3357 | 0.2981    | 0.6642     | -1.0      | 0.0982 | 0.3653 | 0.4071  | 0.4029    | 0.7075     | -1.0      | 0.0319                     | 0.1564                         | 0.5768  | 0.6578      |
| 0.968         | 22.0  | 1892 | 1.0660          | 0.3142 | 0.5195 | 0.3691 | 0.3096    | 0.6679     | -1.0      | 0.1028 | 0.3764 | 0.4195  | 0.4158    | 0.7187     | -1.0      | 0.0352                     | 0.164                          | 0.5933  | 0.675       |
| 0.968         | 23.0  | 1978 | 1.0604          | 0.3145 | 0.5256 | 0.3633 | 0.3093    | 0.674      | -1.0      | 0.1031 | 0.3774 | 0.4199  | 0.4152    | 0.729      | -1.0      | 0.0368                     | 0.1669                         | 0.5922  | 0.6729      |
| 0.9092        | 24.0  | 2064 | 1.0607          | 0.3168 | 0.5266 | 0.3768 | 0.3114    | 0.6848     | -1.0      | 0.1039 | 0.3785 | 0.4233  | 0.4186    | 0.7374     | -1.0      | 0.034                      | 0.1654                         | 0.5996  | 0.6812      |
| 0.9092        | 25.0  | 2150 | 1.0681          | 0.3163 | 0.5283 | 0.3656 | 0.3113    | 0.6751     | -1.0      | 0.1053 | 0.3769 | 0.4185  | 0.4148    | 0.7196     | -1.0      | 0.0352                     | 0.1616                         | 0.5975  | 0.6755      |
| 0.9092        | 26.0  | 2236 | 1.0641          | 0.3158 | 0.5239 | 0.3708 | 0.3106    | 0.6715     | -1.0      | 0.1045 | 0.378  | 0.4217  | 0.4181    | 0.7196     | -1.0      | 0.0339                     | 0.1656                         | 0.5977  | 0.6777      |
| 0.9092        | 27.0  | 2322 | 1.0644          | 0.3162 | 0.526  | 0.3721 | 0.311     | 0.6785     | -1.0      | 0.1035 | 0.3775 | 0.42    | 0.4164    | 0.7206     | -1.0      | 0.0336                     | 0.1624                         | 0.5988  | 0.6777      |
| 0.9092        | 28.0  | 2408 | 1.0606          | 0.3165 | 0.5241 | 0.374  | 0.3114    | 0.6784     | -1.0      | 0.1052 | 0.3794 | 0.4223  | 0.4184    | 0.7252     | -1.0      | 0.0343                     | 0.1665                         | 0.5988  | 0.6782      |
| 0.9092        | 29.0  | 2494 | 1.0600          | 0.3161 | 0.5249 | 0.3728 | 0.311     | 0.6744     | -1.0      | 0.1043 | 0.3795 | 0.4219  | 0.418     | 0.7234     | -1.0      | 0.0341                     | 0.1661                         | 0.5981  | 0.6777      |
| 0.8509        | 30.0  | 2580 | 1.0598          | 0.3166 | 0.5255 | 0.3725 | 0.3115    | 0.6744     | -1.0      | 0.1043 | 0.3801 | 0.4224  | 0.4186    | 0.7234     | -1.0      | 0.0341                     | 0.1663                         | 0.599   | 0.6785      |


### Framework versions

- Transformers 4.42.3
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1