legraphista committed
Commit acd0fb3 • 1 parent: a3c2b25

Upload README.md with huggingface_hub
README.md CHANGED
@@ -76,7 +76,7 @@ Link: [here](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/mai
 | glm-4-9b-chat.FP16 | F16 | - | ⏳ Processing | ⚪ Static | - |
 | [glm-4-9b-chat.Q8_0.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/main/glm-4-9b-chat.Q8_0.gguf) | Q8_0 | 9.99GB | ✅ Available | ⚪ Static | 📦 No |
 | [glm-4-9b-chat.Q6_K.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/main/glm-4-9b-chat.Q6_K.gguf) | Q6_K | 8.26GB | ✅ Available | ⚪ Static | 📦 No |
-| glm-4-9b-chat.Q5_K | Q5_K | - | ⏳ Processing | ⚪ Static | - |
+| [glm-4-9b-chat.Q5_K.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/main/glm-4-9b-chat.Q5_K.gguf) | Q5_K | 7.14GB | ✅ Available | ⚪ Static | 📦 No |
 | glm-4-9b-chat.Q5_K_S | Q5_K_S | - | ⏳ Processing | ⚪ Static | - |
 | glm-4-9b-chat.Q4_K | Q4_K | - | ⏳ Processing | 🟢 IMatrix | - |
 | glm-4-9b-chat.Q4_K_S | Q4_K_S | - | ⏳ Processing | 🟢 IMatrix | - |
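
Any of the ✅ Available files listed in the table can be fetched with the `huggingface_hub` library (the same tooling named in the commit message). A minimal sketch, assuming a standard `hf_hub_download` call and the Q8_0 filename from the table:

```python
# Minimal sketch: download one of the available GGUF quants listed above.
# Assumes huggingface_hub is installed (pip install huggingface_hub).
from huggingface_hub import hf_hub_download

# Repo and filename taken from the table; swap in another available
# quant (e.g. glm-4-9b-chat.Q6_K.gguf) as needed.
local_path = hf_hub_download(
    repo_id="legraphista/glm-4-9b-chat-IMat-GGUF",
    filename="glm-4-9b-chat.Q8_0.gguf",
)
print(local_path)  # local cache path of the downloaded GGUF file
```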