Update README.md
README.md CHANGED
@@ -71,12 +71,11 @@ Training Data: 200B tokens from [SlimPajama](https://www.cerebras.net/blog/slimp
 ## 📃 Citation
 
 ```bibtex
-@
+@misc{llama-moe-2023,
   title={LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-training},
   author={LLaMA-MoE Team},
-  journal={arXiv},
   year={2023},
-
-  url={https://
+  month={Dec},
+  url={https://github.com/pjlab-sys4nlp/llama-moe}
 }
 ```