
Exl2 quant of an old model. I found it still really good and still use it sometimes, so here is a 4.6bpw exl2.

Only supports 8k (8192) context length.
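
If you run it with ExLlamaV2, the context cap can be set on the config before loading. A minimal sketch, assuming the classic `ExLlamaV2BaseGenerator` API (the model directory path and sampling values are placeholders):

```python
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "/path/to/this-4.6bpw-exl2"  # placeholder: local download of this repo
config.prepare()
config.max_seq_len = 8192  # model only supports 8k context

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7  # illustrative sampling value

print(generator.generate_simple("Hello", settings, num_tokens=64))
```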

Alpaca prompting format.
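
For reference, a small helper that builds a prompt in the standard single-turn Alpaca template (the function name is just illustrative):

```python
def alpaca_prompt(instruction: str) -> str:
    """Wrap an instruction in the standard Alpaca single-turn template."""
    return (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        "### Instruction:\n"
        f"{instruction}\n\n"
        "### Response:\n"
    )
```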
