CTranslate2 8-bit (int8) quantization of EXAONE-3.5-2.4B-Instruct.

However, this would not have been possible without the model first having been converted to Llama format.
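A quantization like this is typically produced with CTranslate2's Transformers converter. The commands below are a sketch, not the exact commands used for this model: the source repository name is a placeholder standing in for the Llama-format conversion mentioned above, and the output directory name is arbitrary.

```shell
# Download the Llama-format source model (placeholder repo id — substitute
# the actual Llama-format conversion of EXAONE-3.5-2.4B-Instruct).
huggingface-cli download <llama-format-source-repo> --local-dir exaone-llama

# Convert to CTranslate2 format with 8-bit integer quantization.
ct2-transformers-converter \
    --model exaone-llama \
    --quantization int8 \
    --output_dir EXAONE-3.5-2.4B-Instruct-ct2-int8
```

The resulting directory can then be loaded with `ctranslate2.Generator` for inference.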

