Update README.md
Original model: https://huggingface.co/meta-llama/Meta-Llama-3.1-8B
| Filename | Quant type | File Size | Perplexity (wikitext-2-raw-v1.test) |
| -------- | ---------- | --------- | ----------------------------------- |
| [Meta-Llama-3.1-8B-BF16.gguf](https://huggingface.co/fedric95/Meta-Llama-3.1-8B-GGUF/blob/main/Meta-Llama-3.1-8B.BF16.gguf) | BF16 | 16.10GB | 6.4006 +/- 0.03938 |
| [Meta-Llama-3.1-8B-FP16.gguf](https://huggingface.co/fedric95/Meta-Llama-3.1-8B-GGUF/blob/main/Meta-Llama-3.1-8B.FP16.gguf) | FP16 | 16.10GB | 6.4016 +/- 0.03939 |
| [Meta-Llama-3.1-8B-Q8_0.gguf](https://huggingface.co/fedric95/Meta-Llama-3.1-8B-GGUF/blob/main/Meta-Llama-3.1-8B-Q8_0.gguf) | Q8_0 | 8.54GB | 6.4070 +/- 0.03941 |
| [Meta-Llama-3.1-8B-Q6_K.gguf](https://huggingface.co/fedric95/Meta-Llama-3.1-8B-GGUF/blob/main/Meta-Llama-3.1-8B-Q6_K.gguf) | Q6_K | 6.60GB | 6.4231 +/- 0.03957 |
| [Meta-Llama-3.1-8B-Q5_K_M.gguf](https://huggingface.co/fedric95/Meta-Llama-3.1-8B-GGUF/blob/main/Meta-Llama-3.1-8B-Q5_K_M.gguf) | Q5_K_M | 5.73GB | 6.4623 +/- 0.03987 |
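To put the quality/size trade-off in perspective, the perplexity degradation of each quant relative to BF16 can be computed directly from the values in the table. This is a small illustrative sketch (not part of the repository); the numbers are copied from the table above:

```python
# Perplexity on wikitext-2-raw-v1.test, copied from the table above.
ppl = {
    "BF16":   6.4006,
    "FP16":   6.4016,
    "Q8_0":   6.4070,
    "Q6_K":   6.4231,
    "Q5_K_M": 6.4623,
}

base = ppl["BF16"]
for name, p in ppl.items():
    # Relative degradation versus the BF16 reference, in percent.
    delta_pct = (p - base) / base * 100
    print(f"{name}: {p:.4f} ({delta_pct:+.2f}% vs BF16)")
```

Even the smallest quant listed here (Q5_K_M) stays within roughly 1% of the BF16 perplexity, while cutting the file size by about two thirds.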