legraphista committed
Commit 3c71911 • 1 Parent(s): 8efca74

Upload README.md with huggingface_hub

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -88,7 +88,7 @@ Link: [here](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/mai
  | [glm-4-9b-chat.IQ3_M.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/main/glm-4-9b-chat.IQ3_M.gguf) | IQ3_M | 4.81GB | ✅ Available | 🟢 IMatrix | 📦 No
  | glm-4-9b-chat.IQ3_S | IQ3_S | - | ⏳ Processing | 🟢 IMatrix | -
  | [glm-4-9b-chat.IQ3_XS.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/main/glm-4-9b-chat.IQ3_XS.gguf) | IQ3_XS | 4.43GB | ✅ Available | 🟢 IMatrix | 📦 No
- | glm-4-9b-chat.IQ3_XXS | IQ3_XXS | - | ⏳ Processing | 🟢 IMatrix | -
+ | [glm-4-9b-chat.IQ3_XXS.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/main/glm-4-9b-chat.IQ3_XXS.gguf) | IQ3_XXS | 4.26GB | ✅ Available | 🟢 IMatrix | 📦 No
  | [glm-4-9b-chat.Q2_K.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/main/glm-4-9b-chat.Q2_K.gguf) | Q2_K | 3.99GB | ✅ Available | 🟢 IMatrix | 📦 No
  | [glm-4-9b-chat.Q2_K_S.gguf](https://huggingface.co/legraphista/glm-4-9b-chat-IMat-GGUF/blob/main/glm-4-9b-chat.Q2_K_S.gguf) | Q2_K_S | 3.96GB | ✅ Available | 🟢 IMatrix | 📦 No
  | glm-4-9b-chat.IQ2_M | IQ2_M | - | ⏳ Processing | 🟢 IMatrix | -
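
With this change the IQ3_XXS quant is marked Available and can be fetched like the other listed files. As a minimal sketch (not part of the original README), assuming the `huggingface_hub` Python package is installed, one of the GGUF files from the table could be downloaded like this:

```python
# Minimal sketch: fetch one of the GGUF files listed above via huggingface_hub.
# The repo id and filename are taken from the table links; the returned path is
# wherever huggingface_hub places the file in the local cache.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="legraphista/glm-4-9b-chat-IMat-GGUF",
    filename="glm-4-9b-chat.IQ3_XXS.gguf",  # the quant added as Available in this commit
)
print(local_path)  # local path to the downloaded .gguf file
```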