---
language:
- en
- fa
---
## How to use
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model weights from the Hub
tokenizer = AutoTokenizer.from_pretrained("aidal/Persian_Mistral_7B")
model = AutoModelForCausalLM.from_pretrained("aidal/Persian_Mistral_7B")

# Example prompt in Persian: "What is the capital of Iran?"
input_text = "پایتخت ایران کجاست؟"
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(**inputs)
print(tokenizer.decode(outputs[0]))
```
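
For a 7B-parameter model it is often practical to load the weights in half precision on a GPU and to cap the generation length explicitly. The sketch below uses standard `transformers` options (`torch_dtype`, `device_map="auto"`, which needs the `accelerate` package, and `max_new_tokens`); these are general suggestions for a single-GPU setup, not settings prescribed for this model.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("aidal/Persian_Mistral_7B")
# Load weights in float16 and let accelerate place them on the available GPU.
model = AutoModelForCausalLM.from_pretrained(
    "aidal/Persian_Mistral_7B",
    torch_dtype=torch.float16,
    device_map="auto",
)

# Move the tokenized prompt to the same device as the model (single-GPU assumption).
inputs = tokenizer("پایتخت ایران کجاست؟", return_tensors="pt").to(model.device)
# Cap the number of newly generated tokens so generation terminates predictably.
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```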