Compiled for MLC-LLM. Source project: https://huggingface.co./GeneZC/MiniChat-1.5-3B