---
license: apache-2.0
language:
- ko
---
# KULLM project
- base model: Upstage/SOLAR-10.7B-Instruct-v1.0
## datasets
- KULLM dataset
- hand-crafted instruction data
## Implementation Code
```python
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer
)
import torch

repo = "heavytail/kullm-solar-S"

# Load the model in half precision and spread it across available devices
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.float16,
    device_map='auto'
)
tokenizer = AutoTokenizer.from_pretrained(repo)
```
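
Once the model and tokenizer are loaded, inference can be run as in the minimal sketch below. It assumes the tokenizer ships a chat template (as the SOLAR-Instruct base model does); the example question and decoding parameters are illustrative, not prescribed by this repository.

```python
# Illustrative prompt: "What is the capital of Korea?"
messages = [{"role": "user", "content": "한국의 수도는 어디인가요?"}]

# Build input ids with the tokenizer's chat template (assumed to be present)
inputs = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt"
).to(model.device)

# Generate a response; sampling settings here are example values
outputs = model.generate(
    inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)

# Decode only the newly generated tokens
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```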
Initial upload: 2024/01/28 21:00