---
datasets:
- ncbi/pubmed
language:
- en
base_model:
- answerdotai/ModernBERT-base
---
# ModernBioBERT

A modern variant of BioBERT based on ModernBERT.
We continued pre-training with the masked language modeling objective for 1,000,000 steps on PubMed abstracts.

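For a quick sanity check of the masked language modeling behavior, the model can be loaded with a fill-mask pipeline. This is a minimal sketch: the Hub ID shown is a placeholder for wherever this checkpoint is published, not necessarily this repository's actual path.

```python
from transformers import pipeline

# Placeholder Hub ID for illustration; substitute this repository's actual model ID.
fill_mask = pipeline("fill-mask", model="ModernBioBERT")

# ModernBERT-based checkpoints use the BERT-style [MASK] token.
for prediction in fill_mask("Aspirin irreversibly inhibits [MASK] synthesis."):
    print(prediction["token_str"], prediction["score"])
```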
## Pre-Training Details
```
Batch Size: 512
Learning Rate: 1e-4
Warmup Steps: 500
Learning Rate Scheduler: Cosine Schedule
Max. Sequence Length: 512
Precision: bfloat16
```
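As a rough illustration, the configuration above maps onto the `transformers` Trainer roughly as follows. This is a sketch under stated assumptions, not the actual training script: the toy dataset stands in for the tokenized PubMed abstracts, the masking probability of 0.15 is an assumption (the standard BERT default, not stated above), and the per-device batch size shown is only one way to reach an effective batch size of 512.

```python
from datasets import Dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("answerdotai/ModernBERT-base")
model = AutoModelForMaskedLM.from_pretrained("answerdotai/ModernBERT-base")

# Toy stand-in for the tokenized PubMed abstracts used in the actual run.
texts = ["Aspirin irreversibly inhibits cyclooxygenase in platelets."]
train_dataset = Dataset.from_dict(dict(tokenizer(texts, truncation=True, max_length=512)))

# Dynamic masking; 0.15 is the standard BERT masking rate (assumed, not stated above).
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="modernbiobert-mlm",
    max_steps=1_000_000,               # 1,000,000 pre-training steps
    learning_rate=1e-4,
    lr_scheduler_type="cosine",
    warmup_steps=500,
    per_device_train_batch_size=64,    # e.g. 64 x 8 GPUs = effective batch size 512
    bf16=True,                         # bfloat16 precision
)

Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    data_collator=collator,
).train()
```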
