This is a very bad attempt at quantizing to 4-bit with group size 128 using the Alpaca dataset (in Orca-style prompt format).
```sh
python quantize_alpaca.py --pretrained_model_dir orca_mini_3b/ --bits 4 --group_size 128 --quantized_model_dir orca_mini_3b_gptq/ --save_and_reload
```
Download the cleaned dataset first: https://github.com/gururise/AlpacaDataCleaned
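For example, one way to fetch it is to clone the repo next to the quantization script (a sketch; the exact path `quantize_alpaca.py` expects for the calibration JSON is an assumption and may need adjusting):

```sh
# Clone the cleaned Alpaca dataset to use as calibration data
# (where quantize_alpaca.py looks for the JSON may differ in your setup)
git clone https://github.com/gururise/AlpacaDataCleaned
```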