This repository contains a CTranslate2 (ct2) int8 conversion of facebook/nllb-200-distilled-1.3B, nothing more. The model was converted with:

```
ct2-transformers-converter --model facebook/nllb-200-distilled-1.3B --quantization int8 --trust_remote_code --output_dir nllb-200-distilled-1.3B-ct2-int8
```

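A minimal inference sketch using the CTranslate2 Python API, assuming the converted model sits in `nllb-200-distilled-1.3B-ct2-int8` (the `--output_dir` above) and that the `ctranslate2` and `transformers` packages are installed; the language codes and example text are illustrative:

```python
import ctranslate2
import transformers

src_lang = "eng_Latn"  # assumed source language code
tgt_lang = "fra_Latn"  # assumed target language code

# Load the int8-converted model (path assumed from the conversion command above).
translator = ctranslate2.Translator("nllb-200-distilled-1.3B-ct2-int8", device="cpu")
# The tokenizer still comes from the original Hugging Face model.
tokenizer = transformers.AutoTokenizer.from_pretrained(
    "facebook/nllb-200-distilled-1.3B", src_lang=src_lang
)

text = "Hello, world!"
# CTranslate2 consumes token strings rather than token ids.
source = tokenizer.convert_ids_to_tokens(tokenizer.encode(text))
results = translator.translate_batch([source], target_prefix=[[tgt_lang]])
# Drop the forced target-language token before decoding.
target = results[0].hypotheses[0][1:]
print(tokenizer.decode(tokenizer.convert_tokens_to_ids(target)))
```

Set `device="cuda"` if a GPU is available; the int8 weights keep memory use well below the original fp32 checkpoint.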
Original author: Facebook (Meta AI)

Original model: https://huggingface.co/facebook/nllb-200-distilled-1.3B
