legraphista committed
Commit • a6549bd
1 Parent(s): 824fd6b
Upload README.md with huggingface_hub
README.md CHANGED
@@ -74,7 +74,7 @@ IMatrix dataset: [here](https://gist.githubusercontent.com/bartowski1182/eb213dc
 | [glm-4-9b-chat-1m.Q5_K_S.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-1m-GGUF/blob/main/glm-4-9b-chat-1m.Q5_K_S.gguf) | Q5_K_S | 6.75GB | ✅ Available | ⚪ Static | 📦 No |
 | [glm-4-9b-chat-1m.Q4_K.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-1m-GGUF/blob/main/glm-4-9b-chat-1m.Q4_K.gguf) | Q4_K | 6.31GB | ✅ Available | ⚪ Static | 📦 No |
 | [glm-4-9b-chat-1m.Q4_K_S.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-1m-GGUF/blob/main/glm-4-9b-chat-1m.Q4_K_S.gguf) | Q4_K_S | 5.80GB | ✅ Available | ⚪ Static | 📦 No |
-| glm-4-9b-chat-1m.IQ4_NL | IQ4_NL | | | | |
+| [glm-4-9b-chat-1m.IQ4_NL.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-1m-GGUF/blob/main/glm-4-9b-chat-1m.IQ4_NL.gguf) | IQ4_NL | 5.56GB | ✅ Available | ⚪ Static | 📦 No |
 | [glm-4-9b-chat-1m.IQ4_XS.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-1m-GGUF/blob/main/glm-4-9b-chat-1m.IQ4_XS.gguf) | IQ4_XS | 5.35GB | ✅ Available | ⚪ Static | 📦 No |
 | [glm-4-9b-chat-1m.Q3_K.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-1m-GGUF/blob/main/glm-4-9b-chat-1m.Q3_K.gguf) | Q3_K | 5.11GB | ✅ Available | ⚪ Static | 📦 No |
 | [glm-4-9b-chat-1m.Q3_K_L.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-1m-GGUF/blob/main/glm-4-9b-chat-1m.Q3_K_L.gguf) | Q3_K_L | 5.33GB | ✅ Available | ⚪ Static | 📦 No |
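For reference, here is a minimal sketch of fetching the newly added IQ4_NL quant with huggingface_hub (the library named in the commit message). The repo id and filename are taken from the table links above; any other listed quant can be substituted the same way.

```python
# Minimal sketch: download one of the quants listed in the table above using
# huggingface_hub (assumes `pip install huggingface_hub`).
from huggingface_hub import hf_hub_download

# Repo id and filename come from the table links; swap in any other quant as needed.
local_path = hf_hub_download(
    repo_id="legraphista/glm-4-9b-chat-1m-GGUF",
    filename="glm-4-9b-chat-1m.IQ4_NL.gguf",  # the file added by this commit
)
print(local_path)  # local cache path of the downloaded GGUF file
```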