Use ybelkada/Mixtral-8x7B-Instruct-v0.1-AWQ with vLLM instead

#10
by blobpenguin - opened

This repo doesn't produce any output when loaded with vLLM, but ybelkada/Mixtral-8x7B-Instruct-v0.1-AWQ works correctly.
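For reference, a minimal sketch of loading that AWQ checkpoint through vLLM's offline `LLM` API. The model name comes from this thread; the prompt and sampling parameters are placeholders for a quick smoke test, not a recommended configuration:

```python
from vllm import LLM, SamplingParams

# Load the AWQ-quantized Mixtral checkpoint mentioned above.
# quantization="awq" tells vLLM to use its AWQ kernels for the weights.
llm = LLM(
    model="ybelkada/Mixtral-8x7B-Instruct-v0.1-AWQ",
    quantization="awq",
)

# Arbitrary sampling settings, just enough to check that tokens come back.
sampling_params = SamplingParams(temperature=0.7, max_tokens=128)

outputs = llm.generate(
    ["[INST] Briefly explain what AWQ quantization is. [/INST]"],
    sampling_params,
)
for output in outputs:
    print(output.outputs[0].text)
```

If this prints a non-empty completion, the AWQ repo is being served correctly; an empty or garbage response would reproduce the issue described above.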

Came here to ask about the same issue... this repo is still not working with vLLM.
