MaziyarPanahi committed 1a0fddc (parent: 5f9fa08): Update README.md

README.md CHANGED
@@ -89,7 +89,28 @@

## How to Get Started with the Model

To use this adapter:

```python
from peft import PeftModel, PeftConfig
from transformers import AutoModelForCausalLM

# Load the base model in 4-bit precision
model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf", load_in_4bit=True)

# Wrap the base model with the pretrained adapter weights
config = PeftConfig.from_pretrained("MaziyarPanahi/Llama-2-7b-hf-codealpaca-4bit")
model = PeftModel.from_pretrained(model, "MaziyarPanahi/Llama-2-7b-hf-codealpaca-4bit", config=config)
```

Prompt template:

```
Below is an instruction that describes a task, paired with an input
that provides further context. Write a response that appropriately
completes the request.
### Instruction: {instruction}
### Input: {input}
### Response:
```

[More Information Needed]
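The prompt template above can be filled in with plain Python string formatting before tokenization. A minimal sketch (the `build_prompt` helper and the example instruction/input are illustrative assumptions, not part of the adapter repository):

```python
# Prompt template copied from the README above; {instruction} and {input}
# are the two placeholders to fill before passing the text to the tokenizer.
PROMPT_TEMPLATE = (
    "Below is an instruction that describes a task, paired with an input\n"
    "that provides further context. Write a response that appropriately\n"
    "completes the request.\n"
    "### Instruction: {instruction}\n"
    "### Input: {input}\n"
    "### Response:"
)

def build_prompt(instruction: str, input_text: str) -> str:
    """Fill the template; the result is the string the model is prompted with."""
    return PROMPT_TEMPLATE.format(instruction=instruction, input=input_text)

# Hypothetical example values, for illustration only.
prompt = build_prompt("Write a function that reverses a string.", "Language: Python")
print(prompt)
```

The model's completion is then expected after the trailing `### Response:` line.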