Update README.md
README.md
@@ -45,7 +45,7 @@ Examples are generated with the default Mirostat setting in Oobabooga, with `Mir
* [EXL2 2.4bit](https://huggingface.co/grimulkan/aurelian-v0.5-70b-rope8-32K-2.4bpw_h6_exl2) fits in 1x24GB using Exllamav2 & 8-bit cache @ 10K context
* [EXL2 4bit](https://huggingface.co/grimulkan/aurelian-v0.5-70b-rope8-32K-4.65bpw_h6_exl2) fits in 2x24GB (19/24) using Exllamav2 @ 16K context
* [EXL2 6bit](https://huggingface.co/grimulkan/aurelian-v0.5-70b-rope8-32K-6bpw_h8_exl2) fits in 48GB+24GB (36/24 split) or 3x24GB (16/17/20 split) using Exllamav2 @ 32K context
-* [GGUF](https://huggingface.co/grimulkan/aurelian-v0.5-70b-rope8-32K_GGUF) Q4_K_M, Q5_K_M, Q6_K - Currently untested
+* [GGUF](https://huggingface.co/grimulkan/aurelian-v0.5-70b-rope8-32K_GGUF) Q4_K_M, Q5_K_M, Q6_K - Currently untested, please give feedback

### Training Data
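For reference, a minimal sketch of how the 4-bit EXL2 quant might be loaded with the exllamav2 Python API, using the 2x24GB (19/24) split and 16K context noted in the list above. The model path and sampler values are placeholders (the README's examples use Oobabooga's default Mirostat preset); treat this as an illustration of the gpu_split / max_seq_len settings rather than the recommended setup:

```python
# Illustrative only: loading the 4.65bpw EXL2 quant across two 24 GB GPUs at 16K context.
# The path and the exact split are assumptions based on the "2x24GB (19/24)" note above.
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

config = ExLlamaV2Config()
config.model_dir = "models/aurelian-v0.5-70b-rope8-32K-4.65bpw_h6_exl2"  # placeholder path
config.prepare()
config.max_seq_len = 16384          # 16K context, per the list above

model = ExLlamaV2(config)
model.load(gpu_split=[19, 24])      # manual VRAM split across the two cards, in GB
# For the 2.4bpw quant on a single 24 GB card, the 8-bit cache (ExLlamaV2Cache_8bit) would be used instead.
cache = ExLlamaV2Cache(model)
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.8          # placeholder sampling settings, not the Mirostat preset
settings.top_p = 0.9

print(generator.generate_simple("Once upon a time,", settings, 200))
```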