# BERTIN

BERTIN is a series of RoBERTa-large models trained from scratch on the Spanish portion of mC4 using [Flax](https://github.com/google/flax), including training scripts.

This is part of the [Flax/Jax Community Week](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104), organised by [HuggingFace](https://huggingface.co/), with TPU usage sponsored by Google.
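As a quick sanity check, the pretrained model can be tried directly from the Model Repository linked below. This is a minimal sketch, assuming the `transformers` library is installed, the Hub is reachable, and PyTorch-compatible weights are available in the repo (a Flax-only checkpoint can be converted with `AutoModelForMaskedLM.from_pretrained(..., from_flax=True)`); the example sentence is purely illustrative:

```python
# Minimal sketch: masked-token prediction with BERTIN via the
# Hugging Face fill-mask pipeline (downloads weights from the Hub).
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="flax-community/bertin-roberta-large-spanish",
)

# RoBERTa-style models use "<mask>" as the mask token.
predictions = fill_mask("Madrid es la capital de <mask>.")
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

Each prediction is a dict containing the filled-in token (`token_str`) and its probability (`score`), sorted from most to least likely.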
## Team members

- Javier de la Rosa (versae)
- Manu Romero (mrm8488)
- María Grandury (mariagrandury)
- Ari Polakov (aripo99)
- Pablogps
- daveni
- Sri Lakshmi

## Useful links

- [Community Week timeline](https://discuss.huggingface.co/t/open-to-the-community-community-week-using-jax-flax-for-nlp-cv/7104#summary-timeline-calendar-6)
- [Community Week README](https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md)
- [Community Week thread](https://discuss.huggingface.co/t/bertin-pretrain-roberta-large-from-scratch-in-spanish/7125)
- [Community Week channel](https://discord.com/channels/858019234139602994/859113060068229190)
- [Masked Language Modelling example scripts](https://github.com/huggingface/transformers/tree/master/examples/flax/language-modeling)
- [Model Repository](https://huggingface.co/flax-community/bertin-roberta-large-spanish/)