This is an MLC-converted weight of the Lumimaid-Magnum-v4-12B model, quantized in the MLC q4f16_1 format.

The model can be used with the MLC-LLM and WebLLM projects; a usage sketch follows below.
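
A minimal sketch based on the MLC-LLM Python quickstart. The repo ID matches this card; whether first use downloads and JIT-compiles the weights depends on your MLC-LLM installation, so treat this as an illustration rather than a verified recipe.

```python
from mlc_llm import MLCEngine

# This card's repo, referenced via the "HF://" scheme MLC-LLM accepts;
# a local path to the converted weights also works.
model = "HF://huggingkot/Lumimaid-Magnum-v4-12B-q4f16_1-MLC"

# Create the engine (fetches and prepares the model on first use).
engine = MLCEngine(model)

# Stream a chat completion through the OpenAI-style API on MLCEngine.
for response in engine.chat.completions.create(
    messages=[{"role": "user", "content": "Write a short greeting."}],
    model=model,
    stream=True,
):
    for choice in response.choices:
        print(choice.delta.content or "", end="", flush=True)
print()

engine.terminate()
```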
