legraphista committed
Commit e6870b6 • 1 Parent(s): 0128ce7
Upload README.md with huggingface_hub
README.md CHANGED
@@ -83,8 +83,8 @@ Link: [here](https://huggingface.co/legraphista/internlm2_5-7b-chat-IMat-GGUF/bl
 | [internlm2_5-7b-chat.IQ3_XXS.gguf](https://huggingface.co/legraphista/internlm2_5-7b-chat-IMat-GGUF/blob/main/internlm2_5-7b-chat.IQ3_XXS.gguf) | IQ3_XXS | 3.11GB | ✅ Available | 🟢 IMatrix | 📦 No
 | [internlm2_5-7b-chat.Q2_K.gguf](https://huggingface.co/legraphista/internlm2_5-7b-chat-IMat-GGUF/blob/main/internlm2_5-7b-chat.Q2_K.gguf) | Q2_K | 3.01GB | ✅ Available | 🟢 IMatrix | 📦 No
 | [internlm2_5-7b-chat.Q2_K_S.gguf](https://huggingface.co/legraphista/internlm2_5-7b-chat-IMat-GGUF/blob/main/internlm2_5-7b-chat.Q2_K_S.gguf) | Q2_K_S | 2.82GB | ✅ Available | 🟢 IMatrix | 📦 No
-| internlm2_5-7b-chat.IQ2_M | IQ2_M | - | ⏳ Processing | 🟢 IMatrix | -
-| internlm2_5-7b-chat.IQ2_S | IQ2_S | - | ⏳ Processing | 🟢 IMatrix | -
+| [internlm2_5-7b-chat.IQ2_M.gguf](https://huggingface.co/legraphista/internlm2_5-7b-chat-IMat-GGUF/blob/main/internlm2_5-7b-chat.IQ2_M.gguf) | IQ2_M | 2.78GB | ✅ Available | 🟢 IMatrix | 📦 No
+| [internlm2_5-7b-chat.IQ2_S.gguf](https://huggingface.co/legraphista/internlm2_5-7b-chat-IMat-GGUF/blob/main/internlm2_5-7b-chat.IQ2_S.gguf) | IQ2_S | 2.59GB | ✅ Available | 🟢 IMatrix | 📦 No
 | internlm2_5-7b-chat.IQ2_XS | IQ2_XS | - | ⏳ Processing | 🟢 IMatrix | -
 | internlm2_5-7b-chat.IQ2_XXS | IQ2_XXS | - | ⏳ Processing | 🟢 IMatrix | -
 | internlm2_5-7b-chat.IQ1_M | IQ1_M | - | ⏳ Processing | 🟢 IMatrix | -
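
The README table updated by this commit lists GGUF quants that can be fetched directly from the repo. As a minimal sketch (not part of the commit itself), any file marked ✅ Available can be downloaded with `huggingface_hub`'s `hf_hub_download`; the filename below is one of the quants added in this diff and can be swapped for any other listed file:

```python
# Minimal sketch: download one of the GGUF quants listed in the table above.
# Assumes the huggingface_hub package is installed (pip install huggingface_hub).
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="legraphista/internlm2_5-7b-chat-IMat-GGUF",
    filename="internlm2_5-7b-chat.IQ2_M.gguf",  # any row marked "Available"
)
print(path)  # local cache path of the downloaded GGUF file
```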