Update README.md
README.md (license: apache-2.0)
**Experimental quants of 4-expert MoE Mixtral models in various GGUF formats.**

Original model used for the custom quants: ***NeverSleep/Mistral-11B-SynthIAirOmniMix***

https://huggingface.co/NeverSleep/Mistral-11B-SynthIAirOmniMix

**Goal: the best-performing MoE under 10 GB.**

Experimental q8 and q4 files for training/finetuning are also included.