---
license: apache-2.0
base_model: GerMedBERT/medbert-512
tags:
- generated_from_trainer
metrics:
- precision
- recall
- accuracy
model-index:
- name: GerMedBert_NORMTOP50_V02_BRONCO
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# GerMedBert_NORMTOP50_V02_BRONCO

This model is a fine-tuned version of [GerMedBERT/medbert-512](https://huggingface.co./GerMedBERT/medbert-512) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0166
- F1 Score: 0.8624
- Precision: 0.8939
- Recall: 0.8329
- Accuracy: 0.8594
- Num Input Tokens Seen: 15165836

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
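
As a rough sketch, the hyperparameters above map onto a `transformers` `TrainingArguments` object like the one below. This is an illustrative reconstruction, not the card author's actual training script; the `output_dir` value is an assumed placeholder.

```python
from transformers import TrainingArguments

# Sketch of the reported configuration; output_dir is an assumed placeholder,
# not taken from this model card.
training_args = TrainingArguments(
    output_dir="GerMedBert_NORMTOP50_V02_BRONCO",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=20,
    lr_scheduler_type="linear",  # linear decay, as reported
    adam_beta1=0.9,              # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```

These arguments would then be passed to a `Trainer` together with the model, tokenizer, and datasets, which are not documented in this card.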

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Score | Precision | Recall | Accuracy | Input Tokens Seen |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:---------:|:------:|:--------:|:-----------------:|
| No log        | 0.25  | 81   | 0.1649          | 0.0      | 1.0       | 0.0    | 0.5920   | 190512            |
| 0.2195        | 0.5   | 162  | 0.0988          | 0.0      | 1.0       | 0.0    | 0.5920   | 380384            |
| 0.2195        | 0.75  | 243  | 0.0825          | 0.0      | 1.0       | 0.0    | 0.5920   | 570256            |
| 0.0855        | 1.0   | 324  | 0.0768          | 0.0      | 1.0       | 0.0    | 0.5920   | 759687            |
| 0.0855        | 1.25  | 405  | 0.0744          | 0.0      | 1.0       | 0.0    | 0.5920   | 948599            |
| 0.0737        | 1.5   | 486  | 0.0688          | 0.0      | 1.0       | 0.0    | 0.5920   | 1138791           |
| 0.0737        | 1.75  | 567  | 0.0652          | 0.1979   | 0.94      | 0.1106 | 0.6059   | 1327703           |
| 0.0666        | 2.0   | 648  | 0.0600          | 0.0982   | 0.9565    | 0.0518 | 0.5972   | 1516814           |
| 0.0666        | 2.25  | 729  | 0.0553          | 0.1231   | 0.9333    | 0.0659 | 0.5972   | 1706686           |
| 0.0572        | 2.5   | 810  | 0.0521          | 0.4093   | 0.8394    | 0.2706 | 0.6458   | 1896878           |
| 0.0572        | 2.75  | 891  | 0.0490          | 0.4326   | 0.8777    | 0.2871 | 0.6510   | 2086110           |
| 0.0499        | 3.0   | 972  | 0.0456          | 0.2897   | 1.0       | 0.1694 | 0.6476   | 2275221           |
| 0.0499        | 3.25  | 1053 | 0.0423          | 0.5060   | 0.925     | 0.3482 | 0.6667   | 2465413           |
| 0.0441        | 3.5   | 1134 | 0.0406          | 0.5386   | 0.8913    | 0.3859 | 0.6736   | 2654965           |
| 0.0441        | 3.75  | 1215 | 0.0396          | 0.5979   | 0.8419    | 0.4635 | 0.6858   | 2845157           |
| 0.0393        | 4.0   | 1296 | 0.0377          | 0.6517   | 0.9004    | 0.5106 | 0.7083   | 3034556           |
| 0.0393        | 4.25  | 1377 | 0.0357          | 0.6319   | 0.9075    | 0.4847 | 0.6997   | 3224428           |
| 0.0361        | 4.5   | 1458 | 0.0346          | 0.6154   | 0.9245    | 0.4612 | 0.7066   | 3414620           |
| 0.0361        | 4.75  | 1539 | 0.0334          | 0.6258   | 0.8987    | 0.48   | 0.7101   | 3604172           |
| 0.032         | 5.0   | 1620 | 0.0321          | 0.6775   | 0.9124    | 0.5388 | 0.7292   | 3793603           |
| 0.032         | 5.25  | 1701 | 0.0306          | 0.7081   | 0.9176    | 0.5765 | 0.7378   | 3983155           |
| 0.0293        | 5.5   | 1782 | 0.0302          | 0.6928   | 0.9019    | 0.5624 | 0.7361   | 4172387           |
| 0.0293        | 5.75  | 1863 | 0.0292          | 0.6657   | 0.9247    | 0.52   | 0.7240   | 4362579           |
| 0.0273        | 6.0   | 1944 | 0.0287          | 0.7365   | 0.9253    | 0.6118 | 0.7691   | 4552330           |
| 0.0273        | 6.25  | 2025 | 0.0275          | 0.7215   | 0.9328    | 0.5882 | 0.7552   | 4741882           |
| 0.0258        | 6.5   | 2106 | 0.0272          | 0.7275   | 0.9024    | 0.6094 | 0.7517   | 4930794           |
| 0.0258        | 6.75  | 2187 | 0.0260          | 0.7451   | 0.9204    | 0.6259 | 0.7726   | 5120026           |
| 0.0228        | 7.0   | 2268 | 0.0260          | 0.7247   | 0.9203    | 0.5976 | 0.7656   | 5309137           |
| 0.0228        | 7.25  | 2349 | 0.0249          | 0.7867   | 0.9077    | 0.6941 | 0.7969   | 5499649           |
| 0.0218        | 7.5   | 2430 | 0.0246          | 0.7572   | 0.9079    | 0.6494 | 0.7778   | 5688881           |
| 0.0218        | 7.75  | 2511 | 0.0239          | 0.7779   | 0.9088    | 0.68   | 0.7882   | 5878113           |
| 0.02          | 8.0   | 2592 | 0.0239          | 0.7835   | 0.8994    | 0.6941 | 0.7899   | 6067224           |
| 0.02          | 8.25  | 2673 | 0.0229          | 0.7711   | 0.9159    | 0.6659 | 0.7917   | 6256456           |
| 0.0184        | 8.5   | 2754 | 0.0227          | 0.7705   | 0.8969    | 0.6753 | 0.7917   | 6446328           |
| 0.0184        | 8.75  | 2835 | 0.0226          | 0.7782   | 0.8671    | 0.7059 | 0.7899   | 6636520           |
| 0.0182        | 9.0   | 2916 | 0.0224          | 0.7937   | 0.8988    | 0.7106 | 0.8003   | 6825951           |
| 0.0182        | 9.25  | 2997 | 0.0217          | 0.7815   | 0.8939    | 0.6941 | 0.7951   | 7015183           |
| 0.0172        | 9.5   | 3078 | 0.0213          | 0.8156   | 0.9101    | 0.7388 | 0.8212   | 7205375           |
| 0.0172        | 9.75  | 3159 | 0.0211          | 0.8063   | 0.9086    | 0.7247 | 0.8142   | 7394927           |
| 0.0154        | 10.0  | 3240 | 0.0216          | 0.8246   | 0.8820    | 0.7741 | 0.8212   | 7583366           |
| 0.0154        | 10.25 | 3321 | 0.0204          | 0.7831   | 0.8943    | 0.6965 | 0.8021   | 7772598           |
| 0.0145        | 10.5  | 3402 | 0.0201          | 0.8185   | 0.9034    | 0.7482 | 0.8229   | 7962470           |
| 0.0145        | 10.75 | 3483 | 0.0200          | 0.8261   | 0.9048    | 0.76   | 0.8264   | 8152662           |
| 0.0143        | 11.0  | 3564 | 0.0198          | 0.8238   | 0.8929    | 0.7647 | 0.8281   | 8341773           |
| 0.0143        | 11.25 | 3645 | 0.0196          | 0.8229   | 0.8972    | 0.76   | 0.8264   | 8531645           |
| 0.0131        | 11.5  | 3726 | 0.0193          | 0.8231   | 0.8817    | 0.7718 | 0.8212   | 8720877           |
| 0.0131        | 11.75 | 3807 | 0.0195          | 0.8152   | 0.8822    | 0.7576 | 0.8177   | 8910109           |
| 0.0129        | 12.0  | 3888 | 0.0192          | 0.8263   | 0.9119    | 0.7553 | 0.8299   | 9099860           |
| 0.0129        | 12.25 | 3969 | 0.0188          | 0.8229   | 0.8972    | 0.76   | 0.8212   | 9289412           |
| 0.0116        | 12.5  | 4050 | 0.0191          | 0.8123   | 0.8883    | 0.7482 | 0.8247   | 9479284           |
| 0.0116        | 12.75 | 4131 | 0.0181          | 0.8417   | 0.9030    | 0.7882 | 0.8472   | 9669156           |
| 0.0115        | 13.0  | 4212 | 0.0180          | 0.8398   | 0.8895    | 0.7953 | 0.8420   | 9857947           |
| 0.0115        | 13.25 | 4293 | 0.0177          | 0.8445   | 0.9126    | 0.7859 | 0.8455   | 10045899          |
| 0.0108        | 13.5  | 4374 | 0.0179          | 0.8426   | 0.8901    | 0.8    | 0.8438   | 10236091          |
| 0.0108        | 13.75 | 4455 | 0.0179          | 0.8519   | 0.8961    | 0.8118 | 0.8524   | 10426283          |
| 0.0103        | 14.0  | 4536 | 0.0177          | 0.8392   | 0.9003    | 0.7859 | 0.8420   | 10615394          |
| 0.0103        | 14.25 | 4617 | 0.0176          | 0.8603   | 0.9062    | 0.8188 | 0.8594   | 10805906          |
| 0.0097        | 14.5  | 4698 | 0.0173          | 0.8475   | 0.904     | 0.7976 | 0.8507   | 10995458          |
| 0.0097        | 14.75 | 4779 | 0.0175          | 0.8511   | 0.9003    | 0.8071 | 0.8524   | 11185650          |
| 0.0095        | 15.0  | 4860 | 0.0173          | 0.8501   | 0.8979    | 0.8071 | 0.8490   | 11375081          |
| 0.0095        | 15.25 | 4941 | 0.0175          | 0.8451   | 0.8927    | 0.8024 | 0.8472   | 11564633          |
| 0.009         | 15.5  | 5022 | 0.0174          | 0.8483   | 0.8912    | 0.8094 | 0.8490   | 11754185          |
| 0.009         | 15.75 | 5103 | 0.0172          | 0.8490   | 0.8956    | 0.8071 | 0.8438   | 11943417          |
| 0.009         | 16.0  | 5184 | 0.0174          | 0.8547   | 0.8883    | 0.8235 | 0.8490   | 12132528          |
| 0.009         | 16.25 | 5265 | 0.0171          | 0.8519   | 0.8961    | 0.8118 | 0.8490   | 12322400          |
| 0.0085        | 16.5  | 5346 | 0.0169          | 0.8537   | 0.8861    | 0.8235 | 0.8524   | 12511632          |
| 0.0085        | 16.75 | 5427 | 0.0169          | 0.8498   | 0.8915    | 0.8118 | 0.8524   | 12701184          |
| 0.0084        | 17.0  | 5508 | 0.0169          | 0.8487   | 0.8892    | 0.8118 | 0.8507   | 12889655          |
| 0.0084        | 17.25 | 5589 | 0.0167          | 0.8634   | 0.8962    | 0.8329 | 0.8524   | 13079527          |
| 0.0079        | 17.5  | 5670 | 0.0168          | 0.8504   | 0.8958    | 0.8094 | 0.8472   | 13269399          |
| 0.0079        | 17.75 | 5751 | 0.0168          | 0.8606   | 0.8957    | 0.8282 | 0.8594   | 13458631          |
| 0.0081        | 18.0  | 5832 | 0.0167          | 0.8564   | 0.8949    | 0.8212 | 0.8542   | 13648382          |
| 0.0081        | 18.25 | 5913 | 0.0166          | 0.8620   | 0.8959    | 0.8306 | 0.8576   | 13838254          |
| 0.0078        | 18.5  | 5994 | 0.0166          | 0.8533   | 0.8964    | 0.8141 | 0.8542   | 14028126          |
| 0.0078        | 18.75 | 6075 | 0.0167          | 0.8610   | 0.8937    | 0.8306 | 0.8611   | 14217678          |
| 0.0078        | 19.0  | 6156 | 0.0166          | 0.8610   | 0.8937    | 0.8306 | 0.8594   | 14407109          |
| 0.0078        | 19.25 | 6237 | 0.0166          | 0.8637   | 0.8942    | 0.8353 | 0.8594   | 14596341          |
| 0.0072        | 19.5  | 6318 | 0.0166          | 0.8620   | 0.8959    | 0.8306 | 0.8594   | 14786213          |
| 0.0072        | 19.75 | 6399 | 0.0166          | 0.8624   | 0.8939    | 0.8329 | 0.8594   | 14976085          |
| 0.008         | 20.0  | 6480 | 0.0166          | 0.8624   | 0.8939    | 0.8329 | 0.8594   | 15165836          |


### Framework versions

- Transformers 4.40.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.0
- Tokenizers 0.19.1