Qwen1.5-14B-Chat-8bpw-h8-exl2

This is an 8.0bpw/h8 quantized version of Qwen/Qwen1.5-14B-Chat, made with exllamav2.

To run this model, make sure you have installed an up-to-date version of exllamav2.
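A minimal setup sketch, assuming a CUDA-capable GPU and that you want to chat with the model via exllamav2's bundled example script (the script path assumes a local clone of the exllamav2 repository; the local directory name is just an example):

```shell
# Install exllamav2 (requires a CUDA-capable GPU and a matching PyTorch build)
pip install exllamav2

# Download this quantized model from the Hugging Face Hub
huggingface-cli download DrNicefellow/Qwen1.5-14B-Chat-8bpw-h8-exl2 \
    --local-dir Qwen1.5-14B-Chat-8bpw-h8-exl2

# From a clone of the exllamav2 repo: chat with the model.
# Qwen1.5-Chat uses the ChatML prompt format, hence -mode chatml.
python examples/chat.py -m ./Qwen1.5-14B-Chat-8bpw-h8-exl2 -mode chatml
```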

License

This project is distributed under the Tongyi Qianwen LICENSE AGREEMENT. See the LICENSE file for more information.

Feeling Generous? 😊

Eager to buy me a $2 cup of coffee or iced tea? 🍵☕ Sure, here is the link: https://ko-fi.com/drnicefellow. Please add a note on which one you want me to drink!
