mikecovlee committed on
Commit 29186b0
1 Parent(s): b3587d7

Update README.md

Files changed (1)
  1. README.md +12 -10
README.md CHANGED
@@ -27,25 +27,27 @@ The table above presents the performance of MixLoRA and compares these results w
 
 ## How to Use
 
-Please visit our GitHub repository: https://github.com/mikecovlee/mlora
+Please visit our GitHub repository: https://github.com/mikecovlee/mLoRA
 
 ## Citation
 If MixLoRA has been useful for your work, please consider citing it using the appropriate citation format for your publication.
 ```bibtex
-@misc{MixLoRA,
-  author = {Dengchun, Li and Yingzi, Ma and Naizheng, Wang and Lei, Duan and Jie, Zuo and Mingjie, Tang},
-  title = {MixLoRA: Enhancing Large Language Models Fine-Tuning with LoRA based Mixture of Experts},
-  year = {2024},
-  publisher = {GitHub},
-  howpublished = {\url{https://github.com/mikecovlee/mlora}},
+@misc{li2024mixloraenhancinglargelanguage,
+  title={MixLoRA: Enhancing Large Language Models Fine-Tuning with LoRA-based Mixture of Experts},
+  author={Dengchun Li and Yingzi Ma and Naizheng Wang and Zhengmao Ye and Zhiyuan Cheng and Yinghao Tang and Yan Zhang and Lei Duan and Jie Zuo and Cal Yang and Mingjie Tang},
+  year={2024},
+  eprint={2404.15159},
+  archivePrefix={arXiv},
+  primaryClass={cs.CL},
+  url={https://arxiv.org/abs/2404.15159},
 }
 
 @misc{alpaca-mixlora-7b,
-  author = {Dengchun, Li and Yingzi, Ma and Naizheng, Wang and Lei, Duan and Jie, Zuo and Mingjie, Tang},
-  title = {MixLoRA LoRA MoE adapter based on AlpacaCleaned dataset and LLaMA-2-7B base model},
+  author={Dengchun Li and Yingzi Ma and Naizheng Wang and Zhengmao Ye and Zhiyuan Cheng and Yinghao Tang and Yan Zhang and Lei Duan and Jie Zuo and Cal Yang and Mingjie Tang},
+  title = {MixLoRA adapter based on AlpacaCleaned dataset and LLaMA-2-7B base model},
   year = {2024},
   publisher = {HuggingFace Hub},
-  howpublished = {\url{https://huggingface.co/scu-kdde/alpaca-mixlora-7b}},
+  howpublished = {\url{https://huggingface.co/TUDB-Labs/alpaca-mixlora-7b}},
 }
 ```
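For reference, here is a minimal sketch of how the updated entries might be cited from a LaTeX manuscript. The `\cite` keys come from the BibTeX above; the document class, bibliography style, and the `references.bib` filename are placeholder assumptions, not part of the README.

```latex
% Minimal sketch: citing the updated entries from a LaTeX manuscript.
% The \cite keys match the BibTeX above; the bibliography style and the
% references.bib filename are assumptions, not part of the README.
\documentclass{article}
\begin{document}
Our experiments build on MixLoRA~\cite{li2024mixloraenhancinglargelanguage}
and its released adapter~\cite{alpaca-mixlora-7b}.
\bibliographystyle{plain}
\bibliography{references} % save the two BibTeX entries above in references.bib
\end{document}
```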