# GPTQ-for-StarCoder
See GPTQ-for-SantaCoder for instructions on how to use the model weights in this repository. If you want 8-bit weights, use starcoderbase-GPTQ-8bit-128g instead.
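As a rough illustration only (the authoritative steps are in the GPTQ-for-SantaCoder repository), the snippet below sketches fetching the quantized checkpoint files with `huggingface_hub` before running that repository's loading/inference scripts. The repo id is a placeholder, not a confirmed path.

```python
# Minimal sketch: download the quantized checkpoint locally, then follow the
# GPTQ-for-SantaCoder instructions for loading and inference.
from huggingface_hub import snapshot_download

repo_id = "<namespace>/starcoderbase-GPTQ-4bit-128g"  # placeholder: use this repository's id
local_dir = snapshot_download(repo_id=repo_id)
print("Quantized checkpoint downloaded to:", local_dir)
```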
# Results
| StarCoderBase | Bits | group-size | memory(MiB) | wikitext2 | ptb | c4 | stack | checkpoint size(MB) |
| ------------- | ---- | ---------- | ----------- | --------- | ------ | ------ | ----- | ------------------- |
| FP32          | 32   | -          |             | 10.172    | 15.756 | 12.736 | 1.692 | 59195               |
| BF16          | 16   | -          |             | 10.173    | 15.765 | 12.745 | 1.692 | 29597               |
| GPTQ          | 8    | 128        |             | 10.174    | 15.767 | 12.739 | 1.692 | 16163               |
| GPTQ          | 4    | 128        |             | 10.387    | 16.056 | 13.005 | 1.708 | 8877                |
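For orientation, the wikitext2, ptb, c4, and stack columns report perplexity on those datasets. The sketch below shows one generic way chunked perplexity is typically computed with `transformers`; it is not the evaluation script used for the table, and the model name, sequence length, and dtype are assumptions.

```python
# Generic chunked-perplexity sketch (not the script used for the table above).
import math
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "bigcode/starcoderbase"  # assumed unquantized baseline
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.bfloat16, device_map="auto"
)
model.eval()

def perplexity(text: str, seq_len: int = 2048) -> float:
    """Average next-token NLL over fixed-length chunks, exponentiated."""
    ids = tokenizer(text, return_tensors="pt").input_ids[0]
    total_nll, total_tokens = 0.0, 0
    for start in range(0, ids.size(0) - 1, seq_len):
        chunk = ids[start : start + seq_len + 1].unsqueeze(0).to(model.device)
        if chunk.size(1) < 2:
            break
        with torch.no_grad():
            # labels are shifted inside the model; .loss is mean NLL per predicted token
            loss = model(input_ids=chunk, labels=chunk).loss
        total_nll += loss.item() * (chunk.size(1) - 1)
        total_tokens += chunk.size(1) - 1
    return math.exp(total_nll / total_tokens)
```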
# License
The model is licensed under the CodeML Open RAIL-M v0.1 license. You can find the full license here.
# Acknowledgements
Thanks to everyone in BigCode who worked so hard to create these code models.