---
license: apache-2.0
---

Sheared-LLaMA-2.7B is a model pruned and further pre-trained from [meta-llama/Llama-2-7b-hf](https://huggingface.co./meta-llama/Llama-2-7b-hf). We dynamically load data from the [RedPajama dataset](https://github.com/togethercomputer/RedPajama-Data), using 0.4B tokens for pruning and 50B tokens for continued pre-training of the pruned model. This model can be loaded with the Hugging Face Transformers library via

```
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("princeton-nlp/Sheared-LLaMA-2.7B")
```
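
As a quick sanity check, a minimal generation sketch might look like the following; the prompt and generation settings here are illustrative, not part of the original card:

```
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("princeton-nlp/Sheared-LLaMA-2.7B")
model = AutoModelForCausalLM.from_pretrained("princeton-nlp/Sheared-LLaMA-2.7B")

# Encode an example prompt and generate a short continuation.
inputs = tokenizer("The capital of France is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```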