---
license: apache-2.0
tags:
- text-generation
- conversational
- gptq
- 4bit
inference: false
language:
- en
pipeline_tag: text-generation
---
GPTQ 4-bit quantization of https://huggingface.co./TehVenom/PPO_Shygmalion-6b

Quantized using this repository: https://github.com/mayaeary/GPTQ-for-LLaMa/tree/gptj-v2
Command:

```
python3 gptj.py models/PPO_Shygmalion-6b c4 --wbits 4 --groupsize 128 --save_safetensors models/PPO_Shygmalion-6b-4bit-128g.safetensors
```
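To make the `--wbits 4 --groupsize 128` flags concrete, below is a toy, self-contained sketch of group-wise 4-bit round-to-nearest quantization: each group of 128 weights shares one scale and offset, and every weight is stored as an integer code in 0..15. This is only an illustration of the storage format, not GPTQ itself — GPTQ additionally chooses the codes to minimize layer output error using second-order (Hessian) information, and the function names here are hypothetical.

```python
import random

WBITS = 4          # 4-bit weights -> integer codes 0..15
GROUPSIZE = 128    # one (scale, offset) pair per 128 weights

def quantize_group(weights):
    """Quantize one group of floats to shared-scale 4-bit codes."""
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / (2**WBITS - 1) or 1.0  # avoid div-by-zero
    codes = [round((w - lo) / scale) for w in weights]
    return codes, scale, lo

def dequantize_group(codes, scale, lo):
    """Reconstruct approximate floats from codes and group params."""
    return [c * scale + lo for c in codes]

random.seed(0)
row = [random.gauss(0.0, 0.02) for _ in range(256)]  # one weight row

recon = []
for i in range(0, len(row), GROUPSIZE):
    group = row[i:i + GROUPSIZE]
    codes, scale, lo = quantize_group(group)
    assert all(0 <= c < 2**WBITS for c in codes)  # fits in 4 bits
    recon.extend(dequantize_group(codes, scale, lo))

# Per-group scales keep the rounding error small relative to the
# weight magnitudes; a smaller group size trades storage for accuracy.
max_err = max(abs(a - b) for a, b in zip(row, recon))
print(f"max reconstruction error: {max_err:.6f}")
```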