---
library_name: transformers
license: mit
pipeline_tag: text-generation
tags:
- conversational
- mlx
---

# shi3z/Borea-Phi-3.5-mini-Instruct-Jp

The model [shi3z/Borea-Phi-3.5-mini-Instruct-Jp](https://huggingface.co./shi3z/Borea-Phi-3.5-mini-Instruct-Jp) was converted to MLX format from [HODACHI/Borea-Phi-3.5-mini-Instruct-Jp](https://huggingface.co./HODACHI/Borea-Phi-3.5-mini-Instruct-Jp) using mlx-lm version **0.17.0**.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("shi3z/Borea-Phi-3.5-mini-Instruct-Jp")

prompt = "hello"

# Since this is an instruct model, apply its chat template when one is defined.
if tokenizer.chat_template is not None:
    messages = [{"role": "user", "content": prompt}]
    prompt = tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```