Not working
#4
by Nycoorias · opened
Ah, I'm not sure what you were trying to do; this is not a... vision model.
The hell is a vision model?
Oh, I thought you were trying to run inference on what's in the image.
Anyway, from the looks of it, there's either an issue with the chat template or maybe it's just the low quant.
I am using the format given here: https://huggingface.co./bartowski/Negative_LLAMA_70B-GGUF, but the quant size could be it. I am currently testing IQ2_XS and the results are... a tad better, let's just say.
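For reference, the prompt format that model card points to is the standard Llama 3 instruct template. A minimal sketch of it, assuming the usual Llama 3 special tokens (`build_prompt` is a hypothetical helper, not part of any library):

```python
# Sketch of the Llama 3 instruct prompt format, assuming the standard
# special tokens (<|begin_of_text|>, <|start_header_id|>, <|eot_id|>).
# `build_prompt` is a hypothetical helper for illustration only.
def build_prompt(system: str, user: str) -> str:
    return (
        "<|begin_of_text|>"
        "<|start_header_id|>system<|end_header_id|>\n\n"
        f"{system}<|eot_id|>"
        "<|start_header_id|>user<|end_header_id|>\n\n"
        f"{user}<|eot_id|>"
        # The prompt ends with an open assistant header so the model
        # generates the assistant turn next.
        "<|start_header_id|>assistant<|end_header_id|>\n\n"
    )

prompt = build_prompt("You are a helpful assistant.", "Hello!")
print(prompt)
```

If the frontend applies a different template (e.g. Alpaca or ChatML) to a Llama 3 finetune, output quality degrades in exactly the vague way described above, so it's worth ruling out before blaming the quant.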
Seems like a quant issue; I'd recommend not going any lower than 3 bpw.
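As a rough sanity check on that 3 bpw floor: bits per weight is just file size in bits divided by parameter count. A quick sketch, with illustrative file sizes (not exact figures for this model):

```python
# Rough bits-per-weight (bpw) estimate for a GGUF file:
# (file size in bits) / (number of parameters).
# File sizes below are illustrative examples, not measured values.
def bits_per_weight(file_size_gb: float, n_params_billions: float) -> float:
    bits = file_size_gb * 1e9 * 8            # file size in bits
    return bits / (n_params_billions * 1e9)  # bits per parameter

# A ~21 GB file for a ~70B model sits well under the 3 bpw line,
# while a ~32 GB file sits comfortably above it.
print(round(bits_per_weight(21.1, 70.6), 2))
print(round(bits_per_weight(31.9, 70.6), 2))
```

This is why 2-bit quants of a 70B model are so much smaller, and so much lossier, than 3-bit-and-up quants.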
SicariusSicariiStuff changed discussion status to closed