Not working

#4
by Nycoorias - opened

GGUF Q2_K
I tried everything the settings say, and more, and the results speak for themselves

Instruct mode:

(screenshot attached: grafik.png)

Ah, I'm not sure what you were trying to do; this is not a vision model...

The hell is a vision model?

Oh, I thought you tried to do an inference about what's in the image.
Anyway, from what it looks like, there's either an issue with the chat template or maybe it's just the low quant.

I am using the format given here: https://huggingface.co./bartowski/Negative_LLAMA_70B-GGUF, but quant size could be it. I am currently testing IQ2_XS and the results are... a tad better, let's just say.
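For reference, the prompt format on that card is the standard Llama-3 instruct template. A sketch of its shape (verify the exact string against the linked page, as special tokens must match byte-for-byte for the model to behave):

```
<|begin_of_text|><|start_header_id|>system<|end_header_id|>

{system_prompt}<|eot_id|><|start_header_id|>user<|end_header_id|>

{prompt}<|eot_id|><|start_header_id|>assistant<|end_header_id|>
```

If the frontend injects a different template (e.g. Alpaca or ChatML), output quality degrades in ways that look a lot like a bad quant.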

Seems like a quant issue; I'd recommend not going any lower than 3 bpw.
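For context, "bpw" is bits per weight, and it's why a 70B model is still large even heavily quantized. A minimal sketch of the size arithmetic, with approximate llama.cpp bpw values (assumption: check actual file sizes on the model card, they vary slightly by quant mix):

```python
# Rough GGUF file-size estimate: parameters * bits-per-weight / 8 bits per byte.
# Since params are in billions and bytes convert to GB at the same 1e9 factor,
# the units cancel.

def quant_size_gb(n_params_billion: float, bpw: float) -> float:
    """Approximate GGUF file size in gigabytes."""
    return n_params_billion * bpw / 8

# Approximate bpw per quant type (assumed values, not from this thread).
APPROX_BPW = {"Q2_K": 2.56, "IQ2_XS": 2.31, "Q3_K_M": 3.91, "Q4_K_M": 4.83}

for name, bpw in APPROX_BPW.items():
    print(f"{name}: ~{quant_size_gb(70, bpw):.1f} GB for a 70B model")
```

The gap between ~2.3 and ~3.9 bpw is only about 12 GB of file size on a 70B model, but it is often the difference between incoherent and usable output.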

SicariusSicariiStuff changed discussion status to closed
