metadata
library_name: transformers
language:
  - am
base_model:
  - rasyosef/Llama-3.2-180M-Amharic

This model is an Instruction-Tuned version of Llama 3.2 180M Amharic.

How to use

Chat Format

Given the nature of the training data, the Llama 3.2 180M Amharic Instruct model is best suited for prompts that use the following chat format. You can provide your question as the user turn using this generic template:

<|im_start|>user
ጥያቄ?<|im_end|>
<|im_start|>assistant

For example:

<|im_start|>user
ሶስት የአፍሪካ ሀገራት ጥቀስልኝ<|im_end|>
<|im_start|>assistant

where the model generates its response after <|im_start|>assistant.
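
If you want to build this prompt programmatically, here is a minimal sketch that applies the tokenizer's chat template. This assumes the checkpoint's tokenizer ships a template matching the format shown above.

from transformers import AutoTokenizer

# Illustrative only: format a user message with the tokenizer's chat template.
tokenizer = AutoTokenizer.from_pretrained("rasyosef/Llama-3.2-180M-Amharic-Instruct")

messages = [{"role": "user", "content": "ሶስት የአፍሪካ ሀገራት ጥቀስልኝ"}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)  # should end with "<|im_start|>assistant", which the model then completes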

Sample inference code

First, install the latest version of the transformers library:

pip install -Uq transformers

You can use this model directly with a pipeline for text generation:

from transformers import pipeline

llama3_am = pipeline(
    "text-generation",
    model="rasyosef/Llama-3.2-180M-Amharic-Instruct",
    device_map="auto"
)

messages = [{"role": "user", "content": "ሶስት የአፍሪካ ሀገራት ጥቀስልኝ"}]
llama3_am(messages, max_new_tokens=128, repetition_penalty=1.1, return_full_text=False)

Output:

[{'generated_text': '1. ግብፅ 2. ኢትዮጵያ 3. ኬንያ'}]
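
If you prefer to call generate directly instead of using the pipeline, here is a minimal sketch under the same assumptions (the tokenizer's chat template matches the format above; generation settings mirror the pipeline example):

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "rasyosef/Llama-3.2-180M-Amharic-Instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Build the chat-format prompt and move it to the model's device.
messages = [{"role": "user", "content": "ሶስት የአፍሪካ ሀገራት ጥቀስልኝ"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate, then decode only the newly generated tokens.
output_ids = model.generate(input_ids, max_new_tokens=128, repetition_penalty=1.1)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))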