Update README.md
README.md (CHANGED)
@@ -1,3 +1,16 @@
+---
+language:
+- grc
+thumbnail: https://raw.githubusercontent.com/altsoph/misc/main/imgs/ancientbert.png
+tags:
+- bert
+- classifier
+- greek
+- ancient
+- mlm
+license: mit
+---
+
 A BERT pre-trained language model for the Ancient Greek language.
 
 We used [GreekBERT from @nlpaueb](https://huggingface.co/nlpaueb/bert-base-greek-uncased-v1) and fine-tuned it with the MLM objective on several corpora of Ancient Greek texts. Later, we used it to train several classifiers to assist with author and style attribution of a couple of recently discovered texts.
@@ -16,17 +29,4 @@ If you use the model, please cite the following:
 booktitle = "Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing",
 year = "2022",
 }
-```
-
----
-language:
-- grc
-thumbnail: https://raw.githubusercontent.com/altsoph/misc/main/imgs/ancientbert.png
-tags:
-- bert
-- classifier
-- greek
-- ancient
-- mlm
-license: mit
----
+```
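
Once the fine-tuned checkpoint described in this README is on the Hugging Face Hub, it can be queried through the standard `transformers` fill-mask pipeline. The sketch below is illustrative only and is not part of this commit: the repo id `altsoph/ancient-greek-bert` is a placeholder (the actual model id is not given in this diff), and the sample sentence is arbitrary.

```python
# Illustrative sketch, not part of this commit. The repo id below is a
# placeholder; substitute the actual Hub id of the fine-tuned model.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="altsoph/ancient-greek-bert")  # hypothetical id

# The base GreekBERT checkpoint is uncased, so lowercased Greek input is expected;
# the MLM head ranks candidate tokens for the [MASK] position.
for prediction in fill_mask("μηνιν αειδε θεα πηληιαδεω [MASK]"):
    print(prediction["token_str"], round(prediction["score"], 3))
```

The same checkpoint can then serve as the encoder behind the author- and style-attribution classifiers mentioned in the README, for example by fine-tuning it further with `AutoModelForSequenceClassification`.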