How much introduction do you need? You know what it is. If you want something that's closer to regular-flavor Mistral Large, here you go.

A basic af 50/50 SLERP merge of anthracite-org/magnum-v2-123b with mistralai/Mistral-Large-Instruct-2407.
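
If you're curious what a 50/50 SLERP merge actually does to the weights, here's a minimal sketch. This is just an illustration of spherical linear interpolation applied per tensor, not the exact recipe or tooling used for this merge (the tensors below are stand-ins, and merges like this are normally done with a dedicated merge tool rather than hand-rolled code):

```python
import torch

def slerp(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two weight tensors.

    Treats each tensor as a flat vector and interpolates along the
    great-circle arc between them; falls back to plain lerp when the
    vectors are nearly parallel and the angle is numerically unstable.
    """
    a_flat, b_flat = a.flatten().float(), b.flatten().float()
    a_unit = a_flat / (a_flat.norm() + eps)
    b_unit = b_flat / (b_flat.norm() + eps)
    dot = torch.clamp(a_unit @ b_unit, -1.0, 1.0)
    omega = torch.arccos(dot)          # angle between the two tensors
    if omega.abs() < eps:              # nearly parallel: lerp is fine
        out = (1 - t) * a_flat + t * b_flat
    else:
        so = torch.sin(omega)
        out = (torch.sin((1 - t) * omega) / so) * a_flat \
            + (torch.sin(t * omega) / so) * b_flat
    return out.reshape(a.shape).to(a.dtype)

# Hypothetical matching parameter tensors from the two parent models;
# t=0.5 gives the 50/50 blend described above.
w_magnum = torch.randn(4096, 4096)
w_mistral = torch.randn(4096, 4096)
w_merged = slerp(0.5, w_magnum, w_mistral)
```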

This repo (drexample/magstral-123b-exl2-5.8bpw) is an EXL2 quantization of the merge at 5.8 bits per weight.