# starcoderbase-1b-GPTQ

starcoderbase-1b model quantized to GPTQ format (4-bit precision) using [Auto-GPTQ](https://github.com/AutoGPTQ/AutoGPTQ).

[Quantization script](https://github.com/cosmo3769/Quantized-LLMs/blob/main/notebooks/quantize-starcoderbase-1b-4bit-gptq.ipynb)

## Benchmark

[Benchmarking script](https://github.com/cosmo3769/Quantized-LLMs/blob/main/notebooks/llmbenchmark-starcoderbase-1b-lm-eval-harness.ipynb)

### Baseline starcoderbase-1b model (non-quantized)

| Tasks                 |Version|Filter|n-shot| Metric        |Value |   |Stderr|
|-----------------------|-------|------|------|---------------|-----:|---|-----:|
|codexglue_code2text    |N/A    |none  |None  |smoothed_bleu_4|0.8767|±  |0.0592|
| - code2text_go        |      1|none  |None  |smoothed_bleu_4|1.0054|±  |0.0983|
| - code2text_java      |      1|none  |None  |smoothed_bleu_4|1.2158|±  |0.1657|
| - code2text_javascript|      1|none  |None  |smoothed_bleu_4|0.8560|±  |0.0429|
| - code2text_php       |      1|none  |None  |smoothed_bleu_4|0.9879|±  |0.0887|
| - code2text_python    |      1|none  |None  |smoothed_bleu_4|1.1950|±  |0.2819|
| - code2text_ruby      |      3|none  |None  |smoothed_bleu_4|0.0000|±  |0.0000|

| Groups            |Version|Filter|n-shot| Metric        |Value |   |Stderr|
|-------------------|-------|------|------|---------------|-----:|---|-----:|
|codexglue_code2text|N/A    |none  |None  |smoothed_bleu_4|0.8767|±  |0.0592|

| Tasks                                       |Version|Filter|n-shot| Metric    |Value|   |Stderr|
|---------------------------------------------|------:|------|------|-----------|----:|---|-----:|
|bigbench_code_line_description_generate_until|      1|none  |None  |exact_match|    0|±  |     0|

| Tasks                                        |Version|Filter|n-shot|Metric|Value|   |Stderr|
|----------------------------------------------|------:|------|------|------|----:|---|-----:|
|bigbench_code_line_description_multiple_choice|      0|none  |None  |acc   | 0.15|±  |0.0465|

### Quantized starcoderbase-1b model to GPTQ format

| Tasks                 |Version|Filter|n-shot| Metric        |Value |   |Stderr|
|-----------------------|-------|------|------|---------------|-----:|---|-----:|
|codexglue_code2text    |N/A    |none  |None  |smoothed_bleu_4|0.7959|±  |0.2180|
| - code2text_go        |      1|none  |None  |smoothed_bleu_4|0.9280|±  |0.0291|
| - code2text_java      |      1|none  |None  |smoothed_bleu_4|1.2112|±  |0.1703|
| - code2text_javascript|      1|none  |None  |smoothed_bleu_4|0.8848|±  |0.0391|
| - code2text_php       |      1|none  |None  |smoothed_bleu_4|0.6055|±  |0.6055|
| - code2text_python    |      1|none  |None  |smoothed_bleu_4|1.1460|±  |1.1460|
| - code2text_ruby      |      3|none  |None  |smoothed_bleu_4|0.0000|±  |0.0000|

| Groups            |Version|Filter|n-shot| Metric        |Value |   |Stderr|
|-------------------|-------|------|------|---------------|-----:|---|-----:|
|codexglue_code2text|N/A    |none  |None  |smoothed_bleu_4|0.7959|±  | 0.218|

| Tasks                                       |Version|Filter|n-shot| Metric    |Value|   |Stderr|
|---------------------------------------------|------:|------|------|-----------|----:|---|-----:|
|bigbench_code_line_description_generate_until|      1|none  |None  |exact_match|    0|±  |     0|

| Tasks                                        |Version|Filter|n-shot|Metric|Value |   |Stderr|
|----------------------------------------------|------:|------|------|------|-----:|---|-----:|
|bigbench_code_line_description_multiple_choice|      0|none  |None  |acc   |0.1333|±  |0.0443|