This is a GGML-quantized version of Replit-v2-CodeInstruct-3B, quantized to 4-bit (q4_1). To run inference you can use ggml directly or the ctransformers library.
