---
language:
- en
- ko
license: llama3
library_name: transformers
tags:
- translation
- enko
- ko
base_model:
- meta-llama/Meta-Llama-3-8B-Instruct
datasets:
- squarelike/sharegpt_deepl_ko_translation
pipeline_tag: text-generation
---
# **Introduction**
This model was trained to translate English sentences into Korean using 486k rows from [squarelike/sharegpt_deepl_ko_translation](https://huggingface.co/datasets/squarelike/sharegpt_deepl_ko_translation).
### **Loading the Model**
Use the following Python code to load the model:
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
model_name = "nayohan/llama3-8b-it-translation-sharegpt-en-ko"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
model_name,
device_map="auto",
torch_dtype=torch.bfloat16
)
```
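If you prefer the high-level API, the same checkpoint can also be wrapped in a `transformers` text-generation pipeline. The sketch below is our own addition (not part of the original training setup) and assumes a bfloat16-capable GPU; the examples in the next section drive `model` and `tokenizer` directly, which gives full control over the prompt format.

```python
import torch
from transformers import pipeline

# Sketch: wrap the checkpoint in a text-generation pipeline.
translator = pipeline(
    "text-generation",
    model="nayohan/llama3-8b-it-translation-sharegpt-en-ko",
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Recent transformers releases accept chat-style messages directly
# and return the conversation with the assistant reply appended.
messages = [
    {"role": "system", "content": "당신은 번역기 입니다. 영어를 한국어로 번역하세요."},
    {"role": "user", "content": "The aerospace industry is a flower in the field of technology and science."},
]
print(translator(messages, max_new_tokens=256)[0]["generated_text"][-1]["content"])
```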
### **Generating Text**
This model supports translation from English to Korean. To generate text, use the following Python code:
```python
# System prompt in Korean: "You are a translator. Translate English into Korean."
system_prompt = "당신은 번역기 입니다. 영어를 한국어로 번역하세요."
sentence = "The aerospace industry is a flower in the field of technology and science."
conversation = [{'role': 'system', 'content': system_prompt},
{'role': 'user', 'content': sentence}]
inputs = tokenizer.apply_chat_template(
conversation,
tokenize=True,
add_generation_prompt=True,
return_tensors='pt'
).to("cuda")
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][len(inputs[0]):]))
```
```
# Result
# INPUT: <|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\nActs as a translator. Translate en sentences into ko sentences in colloquial style.<|eot_id|><|start_header_id|>user<|end_header_id|>\n\nThe aerospace industry is a flower in the field of technology and science.<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n
# OUTPUT: 항공우주 산업은 기술과 과학 분야의 꽃입니다.<|eot_id|>

# INPUT: <|begin_of_text|><|start_header_id|>system<|end_header_id|>\n\n당신은 번역기 입니다. 영어를 한국어로 번역하세요.<|eot_id|><|start_header_id|>user<|end_header_id|>\n\nTechnical and basic sciences are very important in terms of research. It has a significant impact on the industrial development of a country. Government policies control the research budget.<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n
# OUTPUT: 기술 및 기초 과학은 연구 측면에서 매우 중요합니다. 이는 한 국가의 산업 발전에 큰 영향을 미칩니다. 정부 정책에 따라 연구 예산이 결정됩니다.<|eot_id|>
```
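For translating several sentences in a row, the prompt construction above can be wrapped in a small helper. This is a minimal sketch of our own (the `translate` function is not part of the original card), assuming `model` and `tokenizer` were loaded as shown in the loading section:

```python
# Hypothetical helper wrapping the chat-template call shown above.
def translate(sentence: str, max_new_tokens: int = 256) -> str:
    conversation = [
        {'role': 'system', 'content': "당신은 번역기 입니다. 영어를 한국어로 번역하세요."},
        {'role': 'user', 'content': sentence},
    ]
    inputs = tokenizer.apply_chat_template(
        conversation,
        tokenize=True,
        add_generation_prompt=True,
        return_tensors='pt'
    ).to(model.device)
    outputs = model.generate(inputs, max_new_tokens=max_new_tokens)
    # Drop the prompt tokens and strip special tokens such as <|eot_id|>.
    return tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)

for s in ["The aerospace industry is a flower in the field of technology and science.",
          "Government policies control the research budget."]:
    print(translate(s))
```

Passing `skip_special_tokens=True` removes the trailing `<|eot_id|>` marker that appears in the raw decoded output above.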
### **Citation**
```bibtex
@article{llama3modelcard,
title={Llama 3 Model Card},
author={AI@Meta},
year={2024},
url={https://github.com/meta-llama/llama3/blob/main/MODEL_CARD.md}
}
```
Our training code can be found here: [TBD]