Fill-Mask
Transformers
PyTorch
Portuguese
deberta-v2
albertina-pt*
albertina-100m-portuguese-ptpt
albertina-100m-portuguese-ptbr
albertina-900m-portuguese-ptpt
albertina-900m-portuguese-ptbr
albertina-1b5-portuguese-ptpt
albertina-1b5-portuguese-ptbr
bert
deberta
portuguese
encoder
foundation model
Inference Endpoints
Update README.md
README.md CHANGED
@@ -71,8 +71,9 @@ For a fully detailed description, check the respective [publication](https://arx
       title={Fostering the Ecosystem of Open Neural Encoders
              for Portuguese with Albertina PT-* family},
       author={Rodrigo Santos and João Rodrigues and Luís Gomes
-              and João Silva and António Branco
-              and
+              and João Silva and António Branco
+              and Henrique Lopes Cardoso and Tomás Freitas Osório
+              and Bernardo Leite},
       year={2024},
       eprint={2403.01897},
       archivePrefix={arXiv},
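
For reference, a sketch of the README's citation block as it reads after this commit. The entry type, entry key, and closing brace are assumptions, since the hunk covers only lines 71-79 of the file; every field shown comes directly from the diff above.

% entry type and key below are assumed (not shown in the hunk)
@misc{albertina-pt-fostering,
      title={Fostering the Ecosystem of Open Neural Encoders
             for Portuguese with Albertina PT-* family},
      author={Rodrigo Santos and João Rodrigues and Luís Gomes
              and João Silva and António Branco
              and Henrique Lopes Cardoso and Tomás Freitas Osório
              and Bernardo Leite},
      year={2024},
      eprint={2403.01897},
      archivePrefix={arXiv},
}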