That's a great job! So, when will there be GGUFs?
At least my GGUFs will be available within a day or so at https://huggingface.co./mradermacher/Cathallama-70B-GGUF and https://huggingface.co./mradermacher/Cathallama-70B-i1-GGUF
Could we get a 4bpw EXL2 quant as well?
Very good! I tried the GGUF model and it is excellent: the dialogue is smooth, and the thinking is clear and agile. I'm already looking forward to the next version. Could the next version add "Infinity-Instruct-7M-Gen-Llama3_1-70B" to the model?
I'll try and see if it works out.
I tried "Infinity-Instruct-7M-Gen-Llama3_1-70B", but only got 49 on MMLU-Pro at Q4_0 and 69 on PubMedQA, and it failed to make the Python snake game. The model itself was OK to talk with, but I didn't run the whole manual test suite. Sorry, I can't upload it; my internet is very slow right now.
Thank you very much for trying; it seems to be somewhat structurally incompatible. Thank you for bringing such a great model to everyone. It works very well for me locally.