---
language:
- en
license: cc-by-nc-4.0
tags:
- mlx
base_model:
- upstage/SOLAR-10.7B-v1.0
datasets:
- c-s-ale/alpaca-gpt4-data
- Open-Orca/OpenOrca
- Intel/orca_dpo_pairs
- allenai/ultrafeedback_binarized_cleaned
---

# mlx-community/SOLAR-10.7B-Instruct-v1.0-4bit

The model [mlx-community/SOLAR-10.7B-Instruct-v1.0-4bit](https://huggingface.co./mlx-community/SOLAR-10.7B-Instruct-v1.0-4bit) was converted to MLX format from [upstage/SOLAR-10.7B-Instruct-v1.0](https://huggingface.co./upstage/SOLAR-10.7B-Instruct-v1.0) using mlx-lm version **0.13.0**.

## Use with mlx

```bash
pip install mlx-lm
```

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/SOLAR-10.7B-Instruct-v1.0-4bit")
response = generate(model, tokenizer, prompt="hello", verbose=True)
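
Since this is an instruction-tuned model, wrapping the prompt in the model's chat template usually gives better results than passing raw text. A minimal sketch, assuming the loaded tokenizer exposes the standard `apply_chat_template` method (as Hugging Face tokenizers for instruct models typically do):

```python
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/SOLAR-10.7B-Instruct-v1.0-4bit")

# Build a chat-formatted prompt from a list of messages; this assumes the
# tokenizer ships with the model's chat template.
messages = [{"role": "user", "content": "Explain what 4-bit quantization does."}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

response = generate(model, tokenizer, prompt=prompt, verbose=True)
```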