Gorka Urbizu Garmendia committed

Commit: 3ca4034
Parent(s): d5a5078

Update README.md

README.md CHANGED
```diff
@@ -25,7 +25,7 @@ ElhBERTeu is a base, uncased monolingual BERT model for Basque, with a vocab siz
 
 ElhBERTeu was trained following the design decisions for [BERTeus](https://huggingface.co/ixa-ehu/berteus-base-cased). The tokenizer and the hyper-parameter settings remained the same, with the only difference being that the full pre-training of the model (1M steps) was performed with a sequence length of 512 on a v3-8 TPU.
 
-The model has been evaluated on the recently created BasqueGLUE NLU benchmark:
+The model has been evaluated on the recently created [BasqueGLUE](https://github.com/Elhuyar/BasqueGLUE) NLU benchmark:
 
 | Model     | AVG | NERC | F_intent | F_slot | BHTC | BEC | Vaxx | QNLI | WiC | coref |
 |-----------|:---------:|:---------:|:---------:|:-------:|:-------:|:-------:|:-------:|:-------:|:-------:|:-------:|
```
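Since the README this commit edits describes a standard BERT checkpoint pre-trained with a 512-token sequence length, here is a minimal sketch of loading it for masked-LM inference with the `transformers` library. The repo id `orai-nlp/ElhBERTeu` and the example Basque sentence are assumptions, not taken from this commit; substitute the actual model path.

```python
# Minimal sketch: load ElhBERTeu and probe it with a masked-LM query.
# The repo id below is an assumption; replace it with the real model path.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "orai-nlp/ElhBERTeu"  # assumed Hugging Face repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# The model was pre-trained with sequence length 512, so inputs up to
# that length can be encoded directly.
text = "Euskara hizkuntza [MASK] da."  # hypothetical Basque probe sentence
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and print the top-5 predicted tokens.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
top_ids = logits[0, mask_pos].topk(5).indices.tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))
```

For the BasqueGLUE scores in the table above, the benchmark tasks would be run by fine-tuning this checkpoint per task; the benchmark repository linked in the changed line documents the datasets and evaluation setup.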