nold committed · Commit bd6a870 · verified · Parent: 137c151

Upload folder using huggingface_hub

.gitattributes CHANGED
@@ -33,3 +33,7 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ Magicoder-S-DS-6.7B_Q2_K.gguf filter=lfs diff=lfs merge=lfs -text
+ Magicoder-S-DS-6.7B_Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+ Magicoder-S-DS-6.7B_Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+ Magicoder-S-DS-6.7B_Q8_0.gguf filter=lfs diff=lfs merge=lfs -text
Magicoder-S-DS-6.7B_Q2_K.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:475cc908db9c8f07eaec3c46ff1b168dd30a1c919035be005f3601016b6a047f
+ size 2534104960
Magicoder-S-DS-6.7B_Q4_K_M.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4a722c4de1f180f30b341de8760f78135461314749ba03a48bd3dab43b34a0f7
+ size 4082491264
Magicoder-S-DS-6.7B_Q5_K_M.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c874eb946d740e6bcc6d89ac52d1e9be1cb2b4246eda8c2abda71c7c13705aa3
+ size 4784775040
Magicoder-S-DS-6.7B_Q8_0.gguf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5a2647acceb933a9c967b39505f06e936b42604a6a75153675e35f3462a46d2c
+ size 7163355008
README.md ADDED
@@ -0,0 +1,123 @@
+ ---
+ license: other
+ license_name: deepseek
+ datasets:
+ - ise-uiuc/Magicoder-OSS-Instruct-75K
+ - ise-uiuc/Magicoder-Evol-Instruct-110K
+ library_name: transformers
+ pipeline_tag: text-generation
+ ---
+ # 🎩 Magicoder: Source Code Is All You Need
+
+ > Refer to our GitHub repo [ise-uiuc/magicoder](https://github.com/ise-uiuc/magicoder/) for an up-to-date introduction to the Magicoder family!
+
+ * 🎩**Magicoder** is a model family empowered by 🪄**OSS-Instruct**, a novel approach that uses open-source code snippets to generate *low-bias* and *high-quality* instruction data for code.
+ * 🪄**OSS-Instruct** mitigates the *inherent bias* of LLM-synthesized instruction data by grounding the LLM in *a wealth of open-source references*, producing more diverse, realistic, and controllable data.
+
+ ![Overview of OSS-Instruct](assets/overview.svg)
+ ![Overview of Result](assets/result.png)
+
+ ## Model Details
+
+ ### Model Description
+
+ * **Developed by:**
+   [Yuxiang Wei](https://yuxiang.cs.illinois.edu),
+   [Zhe Wang](https://github.com/zhewang2001),
+   [Jiawei Liu](https://jiawei-site.github.io),
+   [Yifeng Ding](https://yifeng-ding.com),
+   [Lingming Zhang](https://lingming.cs.illinois.edu)
+ * **License:** [DeepSeek](https://github.com/deepseek-ai/DeepSeek-Coder/blob/main/LICENSE-MODEL)
+ * **Finetuned from model:** [deepseek-coder-6.7b-base](https://huggingface.co/deepseek-ai/deepseek-coder-6.7b-base)
+
+ ### Model Sources
+
+ * **Repository:** <https://github.com/ise-uiuc/magicoder>
+ * **Paper:** <https://arxiv.org/abs/2312.02120>
+ * **Demo (powered by [Gradio](https://www.gradio.app)):**
+   <https://github.com/ise-uiuc/magicoder/tree/main/demo>
+
+ ### Training Data
+
+ * [Magicoder-OSS-Instruct-75K](https://huggingface.co/datasets/ise-uiuc/Magicoder_oss_instruct_75k): generated through **OSS-Instruct** using `gpt-3.5-turbo-1106` and used to train both the Magicoder and Magicoder-S series.
+ * [Magicoder-Evol-Instruct-110K](https://huggingface.co/datasets/ise-uiuc/Magicoder_evol_instruct_110k): decontaminated and redistributed from [theblackcat102/evol-codealpaca-v1](https://huggingface.co/datasets/theblackcat102/evol-codealpaca-v1), used to further fine-tune the Magicoder series and obtain the Magicoder-S models (see the loading sketch after this list).
+
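+ As a minimal, hedged sketch (assuming the `datasets` library is installed and the dataset IDs listed in the metadata above resolve on the Hub), the training data can be inspected like this:
+
+ ```python
+ # Sketch: peek at the OSS-Instruct training data.
+ # Assumption: the dataset ID below matches the Hub entry linked above.
+ from datasets import load_dataset
+
+ oss_instruct = load_dataset("ise-uiuc/Magicoder-OSS-Instruct-75K", split="train")
+ print(oss_instruct)     # dataset size and column names
+ print(oss_instruct[0])  # one synthetic instruction/response example
+ ```
+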
+ ## Uses
+
+ ### Direct Use
+
+ Magicoders are designed for and best suited to **coding tasks**.
+
+ ### Out-of-Scope Use
+
+ Magicoders may not work well on non-coding tasks.
+
+ ## Bias, Risks, and Limitations
+
+ Magicoders may sometimes make errors, produce misleading content, or struggle with tasks that are not related to coding.
+
+ ### Recommendations
+
+ Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model.
+
+ ## How to Get Started with the Model
+
+ Use the code below to get started with the model. Make sure you have installed the [transformers](https://huggingface.co/docs/transformers/index) library.
+
+ ```python
+ from transformers import pipeline
+ import torch
+
+ MAGICODER_PROMPT = """You are an exceptionally intelligent coding assistant that consistently delivers accurate and reliable responses to user instructions.
+
+ @@ Instruction
+ {instruction}
+
+ @@ Response
+ """
+
+ instruction = "<Your code instruction here>"  # replace with your coding task
+
+ prompt = MAGICODER_PROMPT.format(instruction=instruction)
+ generator = pipeline(
+     model="ise-uiuc/Magicoder-S-DS-6.7B",
+     task="text-generation",
+     torch_dtype=torch.bfloat16,
+     device_map="auto",
+ )
+ result = generator(prompt, max_length=1024, num_return_sequences=1, temperature=0.0)
+ print(result[0]["generated_text"])
+ ```
+
+ ## Technical Details
+
+ Refer to our GitHub repo: [ise-uiuc/magicoder](https://github.com/ise-uiuc/magicoder/).
+
+ ## Citation
+
+ ```bibtex
+ @misc{magicoder,
+     title={Magicoder: Source Code Is All You Need},
+     author={Yuxiang Wei and Zhe Wang and Jiawei Liu and Yifeng Ding and Lingming Zhang},
+     year={2023},
+     eprint={2312.02120},
+     archivePrefix={arXiv},
+     primaryClass={cs.CL}
+ }
+ ```
+
+ ## Acknowledgements
+
+ * [WizardCoder](https://github.com/nlpxucan/WizardLM/tree/main/WizardCoder): Evol-Instruct
+ * [DeepSeek-Coder](https://github.com/deepseek-ai/DeepSeek-Coder): Base model for Magicoder-DS
+ * [CodeLlama](https://ai.meta.com/research/publications/code-llama-open-foundation-models-for-code/): Base model for Magicoder-CL
+ * [StarCoder](https://arxiv.org/abs/2305.06161): Data decontamination
+
+ ## Important Note
+
+ Magicoder models are trained on synthetic data generated by OpenAI models. Please pay attention to OpenAI's [terms of use](https://openai.com/policies/terms-of-use) when using the models and the datasets. Magicoder models are not intended to compete with OpenAI's commercial products.
+
+ ***
+
+ Vanilla quantization by [nold](https://huggingface.co/nold) of the original model [ise-uiuc/Magicoder-S-DS-6.7B](https://huggingface.co/ise-uiuc/Magicoder-S-DS-6.7B). Created using the [llm-quantizer](https://github.com/Nold360/llm-quantizer) pipeline - c21aafcd05203496bf1294e97058a17efa858c8a