princeton-nlp committed
Commit ca6cded
Parent(s): e591e7b
Update README.md

README.md CHANGED
@@ -20,7 +20,7 @@ model = AutoModelForCausalLM.from_pretrained("princeton-nlp/Sheared-LLaMA-2.7B")
 
 - Smaller-scale
 - Same vocabulary as LLaMA1 and LLaMA2
-- Derived with 50B tokens by utilizing existing strong LLMs
+- Derived with a budget of 50B tokens by utilizing existing strong LLMs
 
 ## Downstream Tasks
 