Update README.md
README.md
CHANGED
@@ -40,11 +40,9 @@ Releasing the Model: Eithandaraung, Ye Yint Htut, Thet Chit Su, Naing Phyo Aung
 ### Limitations and bias
 We have yet to thoroughly investigate the potential bias inherent in this model. Regarding transparency, it's important to note that the model is primarily trained on data from the Unicode Burmese(Myanmar) language.
 
-### 
-We would like to thank the previous works. The main motivation is from these works.
-1. MinSithu, MyanmarGPT, https://huggingface.co/jojo-ai-mst/MyanmarGP
-2. Dr. Wai Yan Nyein Naing, WYNN747/Burmese-GPT, https://huggingface.co/WYNN747/Burmese-GPT
-
-### References and Citations
+### References
 1. Jiang, Shengyi & Huang, Xiuwen & Cai, Xiaonan & Lin, Nankai. (2021). Pre-trained Models and Evaluation Data for the Myanmar Language. 10.1007/978-3-030-92310-5_52.
-2. Lin, N., Fu, Y., Chen, C., Yang, Z., & Jiang, S. (2021). LaoPLM: Pre-trained Language Models for Lao. ArXiv. /abs/2110.05896
+2. Lin, N., Fu, Y., Chen, C., Yang, Z., & Jiang, S. (2021). LaoPLM: Pre-trained Language Models for Lao. ArXiv. /abs/2110.05896
+3. MinSithu, MyanmarGPT, https://huggingface.co/jojo-ai-mst/MyanmarGPT, 1.1-SweptWood
+4. Dr. Wai Yan Nyein Naing, WYNN747/Burmese-GPT, https://huggingface.co/WYNN747/Burmese-GPT
+5. Sai Htaung Kham, saihtaungkham/BurmeseRoBERTaCLM