---
license: llama3.1
---

# FINGU-AI/L3-8B

## Overview

FINGU-AI/L3-8B is a causal language model for a range of natural language processing (NLP) tasks, including machine translation, text generation, and chat-based applications. It is particularly well suited to translating between languages, and its flexible chat-style input also supports other custom NLP tasks.

## Example Usage

### Installation

Make sure to install the required packages:

```bash
pip install torch transformers
```
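
Since the example below moves the model to a GPU with `model.to('cuda')`, it can help to confirm first that PyTorch sees a CUDA device (an optional check, not part of the original instructions):

```python
import torch

# Should print True on a machine with a usable CUDA GPU
print(torch.cuda.is_available())
```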

### Loading the Model and Running a Translation

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Load the model and tokenizer
model_id = 'FINGU-AI/L3-8B'
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    attn_implementation="sdpa",
    torch_dtype=torch.bfloat16,
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
model.to('cuda')

# Input messages for translation.
# The Korean source text lists the steps for opening a new bank account
# (document submission, review, identity verification, then account opening).
messages = [
    {"role": "system", "content": "translate korean to Uzbek"},
    {"role": "user", "content": """새로운 은행 계좌를 개설하는 절차는 다음과 같습니다:

1. 계좌 개설 목적과 신분 확인을 위한 서류 제출
2. 서류 검토 과정을 거치는 것
3. 고객님의 신원 확인 절차를 진행하는 것
4. 모든 절차가 완료되면 계좌 개설이 가능합니다.

계좌 개설을 원하시는 경우, 신분증과 함께 방문해 주시면 됩니다.
    """},
]

# Apply the chat template, add the generation prompt, and move the inputs to the GPU
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt"
).to('cuda')

# Generate the translation
outputs = model.generate(
    input_ids,
    max_new_tokens=500,
    do_sample=True,
)

# Decode only the newly generated tokens and print the translation
response = outputs[0][input_ids.shape[-1]:]
print(tokenizer.decode(response, skip_special_tokens=True))
```
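
For repeated use, the steps above can be wrapped in a small helper. The sketch below is illustrative rather than part of the model card: the `translate` function name and the sampling settings (`temperature`, `top_p`) are assumptions, and it reuses the `model` and `tokenizer` objects loaded above.

```python
# Hypothetical convenience wrapper (not from the model card); assumes `model`
# and `tokenizer` have already been loaded as shown above.
def translate(text: str, source: str = "korean", target: str = "Uzbek",
              max_new_tokens: int = 500) -> str:
    messages = [
        {"role": "system", "content": f"translate {source} to {target}"},
        {"role": "user", "content": text},
    ]
    input_ids = tokenizer.apply_chat_template(
        messages,
        add_generation_prompt=True,
        return_tensors="pt",
    ).to('cuda')
    outputs = model.generate(
        input_ids,
        max_new_tokens=max_new_tokens,
        do_sample=True,
        temperature=0.7,  # illustrative sampling values, not specified by the card
        top_p=0.9,
    )
    # Return only the newly generated tokens, decoded to text
    return tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True)

# Example: translate a short Korean sentence
# ("Hello, I would like to check my account balance.")
print(translate("안녕하세요, 계좌 잔액을 확인하고 싶습니다."))
```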