Fix links in README
README.md
- [sd3.5_large-q2_k_4_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/sd3.5_large-q2_k_4_0.gguf): Smallest quantization yet. Use this if you can't afford anything bigger.
- [sd3.5_large-q3_k_4_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/sd3.5_large-q3_k_4_0.gguf): Degraded, but usable at high step counts.
- [sd3.5_large-q4_k_4_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/sd3.5_large-q4_k_4_0.gguf): Exactly the same size as q4_0, but with slightly less degradation. Recommended.
- [sd3.5_large-q4_k_4_1.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/sd3.5_large-q4_k_4_1.gguf): Smaller than q4_1, with comparable degradation. Recommended.
- [sd3.5_large-q4_k_5_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/sd3.5_large-q4_k_5_0.gguf): Smaller than q5_0, with comparable degradation. Already very close to the original f16. Recommended.

### Legacy types:

- [sd3.5_large-q4_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/legacy/sd3.5_large-q4_0.gguf): Same size as q4_k_4_0. Not recommended (use q4_k_4_0 instead).
- [sd3.5_large-q4_1.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/legacy/sd3.5_large-q4_1.gguf): Not recommended (q4_k_4_1 is better and smaller).
- [sd3.5_large-q5_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/legacy/sd3.5_large-q5_0.gguf): Barely better than q4_k_5_0, and bigger.
- [sd3.5_large-q5_1.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/legacy/sd3.5_large-q5_1.gguf): Better than q5_0, but bigger.
- [sd3.5_large-q8_0.gguf](https://huggingface.co/stduhpf/SD3.5-Large-GGUF-mixed-sdcpp/blob/main/legacy/sd3.5_large-q8_0.gguf): Basically indistinguishable from the original f16, but much smaller. Recommended for best quality.
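These mixed-quantization files target stable-diffusion.cpp (sdcpp). A minimal dry-run sketch of how one of them might be passed to its `sd` CLI, assuming the binary is built and the SD3.5 text encoders (clip_g, clip_l, t5xxl) were downloaded separately; the encoder file names, prompt, and sampler settings here are placeholder assumptions, not part of this repository:

```shell
# Sketch only: builds and prints the command line instead of running it,
# since the sd binary and encoder files are assumed, not provided here.
MODEL="sd3.5_large-q4_k_4_0.gguf"   # the recommended 4-bit mix from the list above

CMD="sd -m $MODEL \
 --clip_g clip_g.safetensors --clip_l clip_l.safetensors \
 --t5xxl t5xxl_fp16.safetensors \
 -H 1024 -W 1024 --cfg-scale 4.5 --sampling-method euler \
 -p 'a photo of a cat' -o output.png"

echo "$CMD"   # replace the echo with eval "$CMD" to actually generate
```

Swapping `$MODEL` for any other file in the list is the only change needed to compare quantization levels on the same seed and prompt.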

## Outputs: