Fill-Mask
Transformers
PyTorch
Portuguese
deberta-v2
albertina-pt*
albertina-100m-portuguese-ptpt
albertina-100m-portuguese-ptbr
albertina-900m-portuguese-ptpt
albertina-900m-portuguese-ptbr
albertina-1b5-portuguese-ptpt
albertina-1b5-portuguese-ptbr
bert
deberta
portuguese
encoder
foundation model
Inference Endpoints
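The tags above describe a DeBERTa-style Portuguese encoder exposed through the Transformers fill-mask pipeline. As a minimal sketch of how such a checkpoint is queried (the repository id `PORTULAN/albertina-100m-portuguese-ptpt` is an assumption, not stated on this page; substitute the variant you want, e.g. the 900m or 1b5 checkpoints listed above):

```python
# Minimal sketch: querying an Albertina checkpoint with the fill-mask
# pipeline. The repository id below is an assumption; replace it with the
# actual checkpoint you intend to use.
from transformers import pipeline

def mask_sentence(sentence: str, mask_token: str, placeholder: str = "___") -> str:
    """Replace a human-readable placeholder with the model's mask token."""
    return sentence.replace(placeholder, mask_token)

if __name__ == "__main__":
    fill = pipeline("fill-mask", model="PORTULAN/albertina-100m-portuguese-ptpt")
    # DeBERTa-style tokenizers define their own mask token, so read it from
    # the pipeline rather than hard-coding "[MASK]".
    text = mask_sentence("A capital de Portugal é ___.", fill.tokenizer.mask_token)
    for pred in fill(text, top_k=3):
        print(pred["token_str"], round(pred["score"], 3))
```

Reading the mask token from the loaded tokenizer keeps the snippet portable across the Albertina variants, since mask conventions differ between tokenizer families.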
Update README.md
README.md CHANGED
@@ -64,17 +64,17 @@ It is distributed free of charge and under a most permissible license.
 
 
 **Albertina 1.5B PTPT** is developed by a joint team from the University of Lisbon and the University of Porto, Portugal.
-For a fully detailed description, check the respective [publication](https://arxiv.org/abs
+For a fully detailed description, check the respective [publication](https://arxiv.org/abs/2403.01897):
 
 ``` latex
 @misc{albertina-pt-fostering,
-   title={Fostering the Ecosystem of Open Neural Encoders
-
-   author={Rodrigo Santos and João Rodrigues and Luís Gomes
-   and António Branco and Henrique Lopes Cardoso
+   title={Fostering the Ecosystem of Open Neural Encoders
+   for Portuguese with Albertina PT-* family},
+   author={Rodrigo Santos and João Rodrigues and Luís Gomes
+   and João Silva and António Branco and Henrique Lopes Cardoso
    and Tomás Freitas Osório and Bernardo Leite},
    year={2024},
-   eprint={
+   eprint={2403.01897},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
 }

@@ -198,17 +198,17 @@ The model can be used by fine-tuning it for a specific task:
 
 # Citation
 
-When using or citing this model, kindly cite the following [publication](https://arxiv.org/abs
+When using or citing this model, kindly cite the following [publication](https://arxiv.org/abs/2403.01897):
 
 ``` latex
 @misc{albertina-pt-fostering,
-   title={Fostering the Ecosystem of Open Neural Encoders
-
-   author={Rodrigo Santos and João Rodrigues and Luís Gomes
-   and António Branco and Henrique Lopes Cardoso
+   title={Fostering the Ecosystem of Open Neural Encoders
+   for Portuguese with Albertina PT-* family},
+   author={Rodrigo Santos and João Rodrigues and Luís Gomes
+   and João Silva and António Branco and Henrique Lopes Cardoso
    and Tomás Freitas Osório and Bernardo Leite},
    year={2024},
-   eprint={
+   eprint={2403.01897},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
 }
|