---
base_model: Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24
base_model_relation: quantized
license: apache-2.0
pipeline_tag: text-generation
quantized_by: qilowoq
tags:
  - gptq
language:
  - en
  - ru
---

# Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24-4Bit-GPTQ

## Quantization

- This model was quantized with the Auto-GPTQ library using a calibration dataset of English and Russian Wikipedia articles. It achieves lower perplexity on Russian data than other GPTQ models; see the loading sketch below.
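
Below is a minimal inference sketch. It assumes the repository id matches the model name above, that `transformers` is installed together with a GPTQ backend (`optimum` and `auto-gptq`), and that the tokenizer ships a chat template; adjust the repo id and generation settings to your setup.

```python
# Minimal loading/inference sketch for the 4-bit GPTQ checkpoint.
# Assumes: transformers + optimum + auto-gptq installed, a CUDA GPU available,
# and that the repo id below matches this model card (adjust if it differs).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Vikhrmodels/Vikhr-Nemo-12B-Instruct-R-21-09-24-4Bit-GPTQ"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",          # place the quantized weights on available GPUs
    torch_dtype=torch.float16,  # GPTQ kernels typically run activations in fp16
)

# Build a chat-formatted prompt (the base model is instruction-tuned).
messages = [{"role": "user", "content": "Привет! Расскажи о себе."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```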