# Janus-Pro-1B-4bit
---
license: mit
license_name: deepseek
license_link: LICENSE
pipeline_tag: any-to-any
library_name: mlx
base_model:
  - deepseek-ai/Janus-Pro-1B
tags:
  - chat
---

This model is derived from https://huggingface.co./deepseek-ai/Janus-Pro-1B with the following main modifications:

- Converted the `.bin` weight files to the safetensors format
- Added a `chat_template`

Here, "4bit" means that only the LLM part of the model is quantized to 4 bits.
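To illustrate what 4-bit weight quantization does, here is a minimal NumPy sketch of group-wise affine quantization (each group of weights shares one scale and offset, and each value is stored as an integer in [0, 15]). This is a generic sketch, not the MLX implementation; the group size of 64 is an assumption:

```python
import numpy as np

def quantize_4bit(w: np.ndarray, group_size: int = 64):
    """Group-wise affine 4-bit quantization.

    Each group of `group_size` weights shares one scale and minimum;
    quantized values are integers in [0, 15] (16 levels = 4 bits).
    Assumes w.size is divisible by group_size.
    """
    groups = w.reshape(-1, group_size)
    w_min = groups.min(axis=1, keepdims=True)
    w_max = groups.max(axis=1, keepdims=True)
    scale = (w_max - w_min) / 15.0
    scale = np.where(scale == 0, 1.0, scale)  # guard constant groups
    q = np.clip(np.round((groups - w_min) / scale), 0, 15).astype(np.uint8)
    return q, scale, w_min

def dequantize_4bit(q, scale, w_min):
    # Reconstruct approximate float weights from 4-bit codes.
    return q.astype(np.float32) * scale + w_min

rng = np.random.default_rng(0)
w = rng.standard_normal((4, 64)).astype(np.float32)
q, scale, w_min = quantize_4bit(w)
w_hat = dequantize_4bit(q, scale, w_min).reshape(w.shape)
max_err = np.abs(w - w_hat).max()
```

The rounding error per weight is at most half a quantization step (`scale / 2` for its group), which is why quantizing only the LLM weights keeps output quality close to the fp16 original at roughly a quarter of the memory.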

## Quick Start

On macOS (Apple silicon), serve the model with the MLX-based tLLM framework (https://github.com/wnma3mz/tLLM):

```shell
tllm.server --model_path $MODEL_PATH
```

where `$MODEL_PATH` is a Hugging Face repo id or local path, e.g. `wnma3mz/Janus-Pro-1B-4bit`.