---
license: apache-2.0
pipeline_tag: text-generation
language:
- ja
- en
tags:
- japanese
- mistral
inference: false
base_model: mistralai/Mistral-Nemo-Base-2407
---

# Mistral-Nemo-Japanese-Instruct-2408

## Model Description

This is a Japanese continually pre-trained model based on [mistralai/Mistral-Nemo-Instruct-2407](https://huggingface.co./mistralai/Mistral-Nemo-Instruct-2407).

## Usage

Make sure to update your transformers installation via `pip install --upgrade transformers`.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer

# Load the model and tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained("cyberagent/Mistral-Nemo-Japanese-Instruct-2408", device_map="auto", torch_dtype="auto")
tokenizer = AutoTokenizer.from_pretrained("cyberagent/Mistral-Nemo-Japanese-Instruct-2408")

# Stream generated tokens to stdout as they are produced
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)

messages = [
    {"role": "system", "content": "あなたは親切なAIアシスタントです。"},  # "You are a helpful AI assistant."
    {"role": "user", "content": "AIによって私たちの暮らしはどのように変わりますか?"},  # "How will AI change our lives?"
]

input_ids = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt").to(model.device)

output_ids = model.generate(input_ids,
                            max_new_tokens=1024,
                            do_sample=True,  # sampling must be enabled for temperature to take effect
                            temperature=0.5,
                            streamer=streamer)
```

## Prompt Format

The model uses the ChatML format:

```
<|im_start|>system
あなたは親切なAIアシスタントです。<|im_end|>
<|im_start|>user
AIによって私たちの暮らしはどのように変わりますか?<|im_end|>
<|im_start|>assistant
```

## License

Apache-2.0

## Author

[Ryosuke Ishigami](https://huggingface.co./rishigami)

## How to cite

```tex
@misc{cyberagent-mistral-nemo-japanese-instruct-2408,
    title={Mistral-Nemo-Japanese-Instruct-2408},
    url={https://huggingface.co./cyberagent/Mistral-Nemo-Japanese-Instruct-2408},
    author={Ryosuke Ishigami},
    year={2024},
}
```
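
To confirm how the Prompt Format section maps onto the tokenizer, you can render the prompt as a plain string instead of token IDs. This is a minimal sketch, assuming the tokenizer bundled with this model ships a ChatML chat template matching the format shown above:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("cyberagent/Mistral-Nemo-Japanese-Instruct-2408")

messages = [
    {"role": "system", "content": "あなたは親切なAIアシスタントです。"},  # "You are a helpful AI assistant."
    {"role": "user", "content": "AIによって私たちの暮らしはどのように変わりますか?"},  # "How will AI change our lives?"
]

# Render the chat as a string; with add_generation_prompt=True this should end
# with the "<|im_start|>assistant" turn, assuming the chat template is ChatML.
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
print(prompt)
```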