---
license: other
license_name: deepseek
license_link: https://huggingface.co./deepseek-ai/deepseek-coder-33b-base/blob/main/LICENSE
quantized_by: bartowski
pipeline_tag: text-generation
---

## Llamacpp Quantizations of WhiteRabbitNeo-7B-v1.5a

Using <a href="https://github.com/ggerganov/llama.cpp/">llama.cpp</a> release <a href="https://github.com/ggerganov/llama.cpp/releases/tag/b2354">b2354</a> for quantization.

Original model: https://huggingface.co./WhiteRabbitNeo/WhiteRabbitNeo-7B-v1.5a

Download a single file (not the whole branch) from the table below; a scripted download sketch follows the table.

| Filename | Quant type | File Size | Description |
| -------- | ---------- | --------- | ----------- |
| [WhiteRabbitNeo-7B-v1.5a-Q8_0.gguf](https://huggingface.co./bartowski/WhiteRabbitNeo-7B-v1.5a-GGUF/blob/main/WhiteRabbitNeo-7B-v1.5a-Q8_0.gguf) | Q8_0 | 7.16GB | Extremely high quality, generally unneeded but max available quant. |
| [WhiteRabbitNeo-7B-v1.5a-Q6_K.gguf](https://huggingface.co./bartowski/WhiteRabbitNeo-7B-v1.5a-GGUF/blob/main/WhiteRabbitNeo-7B-v1.5a-Q6_K.gguf) | Q6_K | 5.53GB | Very high quality, near perfect, *recommended*. |
| [WhiteRabbitNeo-7B-v1.5a-Q5_K_M.gguf](https://huggingface.co./bartowski/WhiteRabbitNeo-7B-v1.5a-GGUF/blob/main/WhiteRabbitNeo-7B-v1.5a-Q5_K_M.gguf) | Q5_K_M | 4.78GB | High quality, very usable. |
| [WhiteRabbitNeo-7B-v1.5a-Q5_K_S.gguf](https://huggingface.co./bartowski/WhiteRabbitNeo-7B-v1.5a-GGUF/blob/main/WhiteRabbitNeo-7B-v1.5a-Q5_K_S.gguf) | Q5_K_S | 4.65GB | High quality, very usable. |
| [WhiteRabbitNeo-7B-v1.5a-Q5_0.gguf](https://huggingface.co./bartowski/WhiteRabbitNeo-7B-v1.5a-GGUF/blob/main/WhiteRabbitNeo-7B-v1.5a-Q5_0.gguf) | Q5_0 | 4.65GB | High quality, older format, generally not recommended. |
| [WhiteRabbitNeo-7B-v1.5a-Q4_K_M.gguf](https://huggingface.co./bartowski/WhiteRabbitNeo-7B-v1.5a-GGUF/blob/main/WhiteRabbitNeo-7B-v1.5a-Q4_K_M.gguf) | Q4_K_M | 4.08GB | Good quality, similar to 4.25 bpw. |
| [WhiteRabbitNeo-7B-v1.5a-Q4_K_S.gguf](https://huggingface.co./bartowski/WhiteRabbitNeo-7B-v1.5a-GGUF/blob/main/WhiteRabbitNeo-7B-v1.5a-Q4_K_S.gguf) | Q4_K_S | 3.85GB | Slightly lower quality with small space savings. |
| [WhiteRabbitNeo-7B-v1.5a-Q4_0.gguf](https://huggingface.co./bartowski/WhiteRabbitNeo-7B-v1.5a-GGUF/blob/main/WhiteRabbitNeo-7B-v1.5a-Q4_0.gguf) | Q4_0 | 3.82GB | Decent quality, older format, generally not recommended. |
| [WhiteRabbitNeo-7B-v1.5a-Q3_K_L.gguf](https://huggingface.co./bartowski/WhiteRabbitNeo-7B-v1.5a-GGUF/blob/main/WhiteRabbitNeo-7B-v1.5a-Q3_K_L.gguf) | Q3_K_L | 3.59GB | Lower quality but usable, good for low RAM availability. |
| [WhiteRabbitNeo-7B-v1.5a-Q3_K_M.gguf](https://huggingface.co./bartowski/WhiteRabbitNeo-7B-v1.5a-GGUF/blob/main/WhiteRabbitNeo-7B-v1.5a-Q3_K_M.gguf) | Q3_K_M | 3.29GB | Even lower quality. |
| [WhiteRabbitNeo-7B-v1.5a-Q3_K_S.gguf](https://huggingface.co./bartowski/WhiteRabbitNeo-7B-v1.5a-GGUF/blob/main/WhiteRabbitNeo-7B-v1.5a-Q3_K_S.gguf) | Q3_K_S | 2.94GB | Low quality, not recommended. |
| [WhiteRabbitNeo-7B-v1.5a-Q2_K.gguf](https://huggingface.co./bartowski/WhiteRabbitNeo-7B-v1.5a-GGUF/blob/main/WhiteRabbitNeo-7B-v1.5a-Q2_K.gguf) | Q2_K | 2.53GB | Extremely low quality, *not* recommended. |
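
If you'd rather script the download than click through, here is a minimal sketch using the `huggingface_hub` Python library. The repo id and filenames match the table above; the Q4_K_M quant is just an example choice.

```python
# Minimal sketch: fetch one GGUF file (not the whole branch) from this repo.
# Requires: pip install huggingface_hub
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="bartowski/WhiteRabbitNeo-7B-v1.5a-GGUF",
    filename="WhiteRabbitNeo-7B-v1.5a-Q4_K_M.gguf",
)
print(model_path)  # local path to the downloaded .gguf
```

The downloaded `.gguf` can then be loaded by any llama.cpp-compatible runtime. For example, assuming the `llama-cpp-python` bindings are installed (parameters below are illustrative, not tuned):

```python
# Sketch only: load the quant with llama-cpp-python.
# Requires: pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama(model_path=model_path, n_ctx=2048)
out = llm("Write a Python hello world.", max_tokens=64)
print(out["choices"][0]["text"])
```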

Want to support my work? Visit my ko-fi page here: https://ko-fi.com/bartowski