---
library_name: transformers
license: gemma
pipeline_tag: image-text-to-text
tags:
- mlx
extra_gated_heading: Access PaliGemma on Hugging Face
extra_gated_prompt: To access PaliGemma on Hugging Face, you’re required to review
  and agree to Google’s usage license. To do this, please ensure you’re logged-in
  to Hugging Face and click below. Requests are processed immediately.
extra_gated_button_content: Acknowledge license
---
# mlx-community/paligemma-3b-mix-224-8bit

This model was converted to MLX format from [`google/paligemma-3b-mix-224`](https://huggingface.co./google/paligemma-3b-mix-224) using mlx-vlm version **0.1.0**.
Refer to the [original model card](https://huggingface.co./google/paligemma-3b-mix-224) for more details on the model.
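
For reference, quantized conversions like this one are typically produced with mlx-vlm's own convert entry point. The sketch below assumes the `--hf-path`, `-q`, and `--q-bits` options found in current mlx-vlm releases; the exact flags available in version 0.1.0 may differ.

```bash
# Sketch of an 8-bit quantized conversion (flag names assumed from current
# mlx-vlm releases; verify against your installed version).
python -m mlx_vlm.convert --hf-path google/paligemma-3b-mix-224 -q --q-bits 8
```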
## Use with mlx

```bash
pip install -U mlx-vlm
```

```bash
python -m mlx_vlm.generate --model mlx-community/paligemma-3b-mix-224-8bit --max-tokens 100 --temp 0.0
```
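
As written, the command relies on whatever defaults the CLI provides for the prompt and image. To caption your own image, the sketch below assumes the `--prompt` and `--image` flags of the mlx-vlm CLI and uses a placeholder image path; the PaliGemma mix checkpoints expect task-style prompts such as `caption en`, `detect <object>`, or `answer en <question>`.

```bash
# Caption a local image (path is a placeholder; --prompt/--image flag names
# are assumed from the mlx-vlm CLI and may vary between versions).
python -m mlx_vlm.generate \
  --model mlx-community/paligemma-3b-mix-224-8bit \
  --max-tokens 100 \
  --temp 0.0 \
  --prompt "caption en" \
  --image path/to/image.jpg
```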