---
base_model: PekingU/rtdetr_r50vd_coco_o365
tags:
- generated_from_trainer
model-index:
- name: rtdetr-r50-cppe5-finetune
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# rtdetr-r50-cppe5-finetune

This model is a fine-tuned version of [PekingU/rtdetr_r50vd_coco_o365](https://huggingface.co./PekingU/rtdetr_r50vd_coco_o365) on the CPPE-5 dataset (the five protective-equipment categories listed below: coverall, face shield, gloves, goggles and mask).
It achieves the following results on the evaluation set (a sketch showing how metrics of this form can be computed follows the list):
- Loss: 9.7524
- Map: 0.5298
- Map 50: 0.7903
- Map 75: 0.5632
- Map Small: 0.5092
- Map Medium: 0.4212
- Map Large: 0.6655
- Mar 1: 0.4001
- Mar 10: 0.6526
- Mar 100: 0.711
- Mar Small: 0.6038
- Mar Medium: 0.5835
- Mar Large: 0.8378
- Map Coverall: 0.6271
- Mar 100 Coverall: 0.8308
- Map Face Shield: 0.4839
- Mar 100 Face Shield: 0.7706
- Map Gloves: 0.5775
- Mar 100 Gloves: 0.6492
- Map Goggles: 0.425
- Mar 100 Goggles: 0.6103
- Map Mask: 0.5354
- Mar 100 Mask: 0.6941
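
The Map / Mar values are COCO-style mean average precision and mean average recall (overall, at IoU 0.50 and 0.75, by object size, and per class). As a hedged illustration (not the authors' evaluation script), metrics with these names can be computed with `torchmetrics`, assuming predictions and ground truth are available as boxes in `(xmin, ymin, xmax, ymax)` pixel coordinates:

```python
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

# class_metrics=True also returns per-class mAP/mAR, matching the per-class rows above.
metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

# One dict per image; the boxes, scores and labels below are made-up examples.
preds = [{
    "boxes": torch.tensor([[30.0, 40.0, 120.0, 200.0]]),
    "scores": torch.tensor([0.87]),
    "labels": torch.tensor([2]),
}]
targets = [{
    "boxes": torch.tensor([[28.0, 38.0, 118.0, 205.0]]),
    "labels": torch.tensor([2]),
}]

metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["mar_100"])
```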

## Model description

More information needed

## Intended uses & limitations

More information needed
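
Pending details from the authors, the sketch below shows one way this checkpoint could be used for inference with the `transformers` object-detection API. The repo id is a placeholder, the test image and the 0.5 score threshold are arbitrary, and this is not an official usage example:

```python
import torch
import requests
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "<user>/rtdetr-r50-cppe5-finetune"  # placeholder, not a real repo id
image_processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)

inputs = image_processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw model outputs into thresholded (score, label, box) detections.
results = image_processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=torch.tensor([image.size[::-1]])
)[0]
for score, label, box in zip(results["scores"], results["labels"], results["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```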

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them to `TrainingArguments` follows the list):
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 300
- num_epochs: 10
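
As an illustration only, these settings could be expressed with `transformers.TrainingArguments` roughly as follows; `output_dir` and any option not in the list above are assumptions rather than the authors' configuration:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="rtdetr-r50-cppe5-finetune",  # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=300,
    num_train_epochs=10,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the Trainer's
    # default AdamW settings, so no extra optimizer arguments are needed here.
)
```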

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map    | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1  | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log        | 1.0   | 107  | 216.6647        | 0.0037 | 0.0089 | 0.0022 | 0.0032    | 0.0183     | 0.014     | 0.0242 | 0.1046 | 0.1966  | 0.0405    | 0.1831     | 0.4092    | 0.0056       | 0.2649           | 0.001           | 0.1962              | 0.0021     | 0.0719         | 0.0008      | 0.2215          | 0.0091   | 0.2284       |
| No log        | 2.0   | 214  | 96.4364         | 0.0294 | 0.0559 | 0.0257 | 0.0169    | 0.0297     | 0.0299    | 0.0707 | 0.1835 | 0.298   | 0.0948    | 0.2203     | 0.4591    | 0.0888       | 0.5527           | 0.001           | 0.3203              | 0.021      | 0.1259         | 0.0014      | 0.2154          | 0.0346   | 0.2756       |
| No log        | 3.0   | 321  | 28.5504         | 0.1576 | 0.294  | 0.1448 | 0.0752    | 0.0925     | 0.2629    | 0.1621 | 0.3534 | 0.4661  | 0.347     | 0.3964     | 0.6546    | 0.4399       | 0.6518           | 0.0021          | 0.3797              | 0.1282     | 0.3866         | 0.0045      | 0.4             | 0.2132   | 0.5124       |
| No log        | 4.0   | 428  | 17.1997         | 0.2324 | 0.408  | 0.2295 | 0.1228    | 0.1816     | 0.3288    | 0.2317 | 0.4133 | 0.5     | 0.3527    | 0.4438     | 0.6543    | 0.5101       | 0.6396           | 0.0093          | 0.4671              | 0.1827     | 0.4513         | 0.1553      | 0.4062          | 0.3045   | 0.5356       |
| 117.1144      | 5.0   | 535  | 14.8812         | 0.2495 | 0.4498 | 0.2479 | 0.1261    | 0.1962     | 0.4086    | 0.253  | 0.4388 | 0.5189  | 0.3485    | 0.4683     | 0.7111    | 0.5078       | 0.6752           | 0.0291          | 0.5013              | 0.2265     | 0.4491         | 0.1715      | 0.4246          | 0.3129   | 0.5444       |
| 117.1144      | 6.0   | 642  | 13.5348         | 0.2572 | 0.4698 | 0.2541 | 0.1377    | 0.1905     | 0.424     | 0.2532 | 0.4315 | 0.4895  | 0.314     | 0.4481     | 0.6649    | 0.5166       | 0.6716           | 0.026           | 0.4873              | 0.2391     | 0.3754         | 0.1866      | 0.3754          | 0.3178   | 0.5378       |
| 117.1144      | 7.0   | 749  | 12.7545         | 0.2812 | 0.5035 | 0.2612 | 0.1618    | 0.2143     | 0.4653    | 0.2595 | 0.4568 | 0.496   | 0.3394    | 0.4438     | 0.6648    | 0.5152       | 0.6815           | 0.0918          | 0.4949              | 0.2504     | 0.3759         | 0.208       | 0.3954          | 0.3405   | 0.5324       |
| 117.1144      | 8.0   | 856  | 12.5330         | 0.2909 | 0.5328 | 0.2687 | 0.1568    | 0.2262     | 0.4868    | 0.2831 | 0.4625 | 0.5035  | 0.3209    | 0.4428     | 0.686     | 0.5059       | 0.6838           | 0.1762          | 0.5038              | 0.2528     | 0.3978         | 0.1905      | 0.4062          | 0.3289   | 0.5258       |
| 117.1144      | 9.0   | 963  | 12.2873         | 0.3023 | 0.5355 | 0.2927 | 0.1621    | 0.2502     | 0.494     | 0.2851 | 0.4696 | 0.5064  | 0.3301    | 0.452      | 0.6736    | 0.5276       | 0.6932           | 0.1696          | 0.4899              | 0.2633     | 0.4085         | 0.2249      | 0.4154          | 0.326    | 0.5249       |
| 16.4463       | 10.0  | 1070 | 12.2585         | 0.3095 | 0.5506 | 0.3029 | 0.1738    | 0.2405     | 0.4996    | 0.2901 | 0.4721 | 0.5105  | 0.3271    | 0.4558     | 0.6864    | 0.5196       | 0.6892           | 0.2225          | 0.5241              | 0.264      | 0.4022         | 0.2102      | 0.4077          | 0.3309   | 0.5293       |


### Framework versions

- Transformers 4.42.0.dev0
- Pytorch 2.1.0+cu118
- Datasets 2.19.1
- Tokenizers 0.19.1