DataVortex Models
Research & Engineering | Product Management |
---|---|
Kwangseok Yang | Seunghyun Choi |
Jeongwon Choi | Hyoseok Choi |
The prompt follows the Alpaca format. For example:
text = """\
๋น์ ์ ์ฌ๋๋ค์ด ์ ๋ณด๋ฅผ ์ฐพ์ ์ ์๋๋ก ๋์์ฃผ๋ ์ธ๊ณต์ง๋ฅ ๋น์์
๋๋ค.
### User:
๋ํ๋ฏผ๊ตญ์ ์๋๋ ์ด๋์ผ?
### Assistant:
๋ํ๋ฏผ๊ตญ์ ์๋๋ ์์ธ์
๋๋ค.
### User:
์์ธ ์ธ๊ตฌ๋ ์ด ๋ช ๋ช
์ด์ผ?
"""
Task | 0-shot | 5-shot | 10-shot | 50-shot |
---|---|---|---|---|
kobest_boolq | 0.334282 | 0.891367 | 0.896755 | 0.884441 |
kobest_copa | 0.697763 | 0.716762 | 0.724769 | 0.751746 |
kobest_hellaswag | 0.432047 | 0.458301 | 0.443993 | 0.458232 |
kobest_sentineg | 0.49353 | 0.954657 | 0.964735 | 0.949606 |
Average | 0.4894055 | 0.75527175 | 0.757563 | 0.76100625 |

Average | Ko-ARC | Ko-HellaSwag | Ko-MMLU | Ko-TruthfulQA | Ko-CommonGen V2 |
---|---|---|---|---|---|
53.21 | 47.87 | 57.18 | 54.82 | 53.64 | 52.54 |
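The KoBEST task names above match those used by EleutherAI's lm-evaluation-harness. As a sketch, assuming harness v0.4+ and its `simple_evaluate` API, a 5-shot run could look like this (exact scores will vary with harness version, prompts, and hardware):

```python
# Sketch only: assumes EleutherAI's lm-evaluation-harness (pip install lm-eval),
# whose built-in task names match the KoBEST rows in the table above.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=Edentns/DataVortexS-10.7B-dpo-v0.1,dtype=float16",
    tasks=["kobest_boolq", "kobest_copa", "kobest_hellaswag", "kobest_sentineg"],
    num_fewshot=5,
)
print(results["results"])
```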
The model's tokenizer includes a chat_template for this instruction format. You can use the code below:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

device = "cuda"  # the device to load the model onto

model = AutoModelForCausalLM.from_pretrained("Edentns/DataVortexS-10.7B-dpo-v0.1")
tokenizer = AutoTokenizer.from_pretrained("Edentns/DataVortexS-10.7B-dpo-v0.1")

messages = [
    {"role": "system", "content": "You are an AI assistant that helps people find information."},
    {"role": "user", "content": "Where is the capital of South Korea?"},
    {"role": "assistant", "content": "The capital of South Korea is Seoul."},
    {"role": "user", "content": "What is the total population of Seoul?"},
]

# Render the conversation with the tokenizer's built-in chat template.
encodeds = tokenizer.apply_chat_template(messages, return_tensors="pt")

model_inputs = encodeds.to(device)
model.to(device)

generated_ids = model.generate(model_inputs, max_new_tokens=1000, do_sample=True)
decoded = tokenizer.batch_decode(generated_ids)
print(decoded[0])
```
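If you want tokens printed as they are produced, `transformers` also provides `TextStreamer`; this variant is an optional addition, not part of the original example:

```python
# Optional: stream tokens to stdout as they are generated. TextStreamer is
# a standard transformers utility, not something specific to this model.
from transformers import TextStreamer

streamer = TextStreamer(tokenizer, skip_prompt=True)
model.generate(model_inputs, max_new_tokens=1000, do_sample=True, streamer=streamer)
```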
The model is licensed under the CC BY-NC-SA 4.0 license, which allows others to copy, modify, and share the work non-commercially, as long as they give appropriate credit and distribute any derivative works under the same license.
Base model: LDCC/LDCC-SOLAR-10.7B