Gorka Urbizu Garmendia committed
Commit 7676be5
Parent(s): 2ae9b05
Update README.md

README.md CHANGED
@@ -13,7 +13,7 @@ This is a BERT model for Basque introduced in [BasqueGLUE: A Natural Language Un
 To train ElhBERTeu, we collected corpora from several domains: updated (2021) national and local news sources, Basque Wikipedia, as well as novel news sources and texts from other domains such as science (both academic and popular), literature, and subtitles. More details about the corpora used and their sizes are shown in the following table. Texts from news sources were oversampled (duplicated), as was done during the training of BERTeus. In total, 575M tokens were used for pre-training ElhBERTeu.
 
 |Domain | Size |
-
+|:----------|----------|
 |News | 2 x 224M |
 |Wikipedia | 40M |
 |Science | 58M |
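
The README paragraph above notes that news texts were oversampled (duplicated) before pre-training, which is why the table lists News as 2 x 224M. As a rough illustration only, not the authors' actual pipeline, here is a minimal Python sketch of corpus-level oversampling; the file names, domain labels, and duplication factor are assumptions for the example:

```python
# Illustrative sketch: build a single pre-training text file, duplicating
# the news corpus once so it is seen twice, as described in the README.
# Assumes each corpus is a plain-text file with one document per line;
# all paths and names below are hypothetical.
corpus_files = {
    "news": "news.txt",        # oversampled: included twice (2 x 224M tokens)
    "wikipedia": "wiki.txt",   # 40M tokens
    "science": "science.txt",  # 58M tokens
}

OVERSAMPLE = {"news": 2}  # duplication factor per domain; others default to 1

def build_pretraining_corpus(out_path: str) -> None:
    """Concatenate all corpora, repeating oversampled domains."""
    with open(out_path, "w", encoding="utf-8") as out:
        for domain, path in corpus_files.items():
            repeats = OVERSAMPLE.get(domain, 1)
            for _ in range(repeats):
                with open(path, encoding="utf-8") as src:
                    for line in src:
                        out.write(line)

build_pretraining_corpus("elhberteu_pretraining.txt")
```

Duplicating a source at the file level like this is the simplest way to bias a pre-training mixture toward one domain without changing the training loop itself.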