license: apache-2.0
Experimental quantizations of four-expert ("4-headed") Mixtral MoE models in various GGUF formats.
The goal is the best-performing MoE model under 16 GiB.
These models still need training/fine-tuning.
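The under-16-GiB target above can be sanity-checked with quick arithmetic. The sketch below is a rough estimate, not a measurement: the parameter count (~24B for a 4x7B Mixtral-style model, since experts share attention weights) and the bits-per-weight figures for the GGUF quant types are approximate assumptions, and real GGUF files add a small overhead for metadata and mixed-precision tensors.

```python
def gguf_size_gib(n_params: float, bits_per_weight: float) -> float:
    """Rough GGUF file size in GiB: params * bits, converted to bytes then GiB."""
    return n_params * bits_per_weight / 8 / 1024**3

# Assumed total parameter count for a four-expert Mixtral-style model (~4x7B).
N_PARAMS = 24e9

# Approximate effective bits per weight for common llama.cpp quant types.
for name, bpw in [("Q2_K", 2.6), ("Q3_K_M", 3.9), ("Q4_K_M", 4.8), ("Q5_K_M", 5.7)]:
    size = gguf_size_gib(N_PARAMS, bpw)
    verdict = "fits" if size < 16 else "exceeds"
    print(f"{name}: ~{size:.1f} GiB ({verdict} the 16 GiB budget)")
```

Under these assumptions, quant levels up to roughly Q5_K_M stay within the 16 GiB budget, which is why mid-range K-quants are the natural candidates for this goal.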