2. Lin, N., Fu, Y., Chen, C., Yang, Z., & Jiang, S. (2021). LaoPLM: Pre-trained Language Models for Lao. ArXiv. https://arxiv.org/abs/2110.05896
3. MinSithu, MyanmarGPT, https://huggingface.co/jojo-ai-mst/MyanmarGPT, 1.1-SweptWood
4. Wai Yan Nyein Naing, WYNN747/Burmese-GPT, https://huggingface.co/WYNN747/Burmese-GPT
5. Sai Htaung Kham, saihtaungkham/BurmeseRoBERTaCLM
6. Shliazhko, O., Fenogenova, A., Tikhonova, M., Mikhailov, V., Kozlova, A., & Shavrina, T. (2022). mGPT: Few-Shot Learners Go Multilingual. ArXiv. https://arxiv.org/abs/2204.07580

### How to Cite this work: