Tags: Fill-Mask · Transformers · PyTorch · Portuguese · deberta-v2 · albertina-pt* · albertina-100m-portuguese-ptpt · albertina-100m-portuguese-ptbr · albertina-900m-portuguese-ptpt · albertina-900m-portuguese-ptbr · albertina-1b5-portuguese-ptpt · albertina-1b5-portuguese-ptbr · bert · deberta · portuguese · encoder · foundation model · Inference Endpoints
Update README.md
README.md (changed)
@@ -206,8 +206,9 @@ When using or citing this model, kindly cite the following [publication](https:/
     title={Fostering the Ecosystem of Open Neural Encoders
            for Portuguese with Albertina PT-* family},
     author={Rodrigo Santos and João Rodrigues and Luís Gomes
-           and João Silva and António Branco
-           and
+           and João Silva and António Branco
+           and Henrique Lopes Cardoso and Tomás Freitas Osório
+           and Bernardo Leite},
     year={2024},
     eprint={2403.01897},
     archivePrefix={arXiv},