mradermacher committed on
Commit
00dd5e3
1 Parent(s): df12e7d

auto-patch README.md

Files changed (1): README.md (+26 -2)
README.md CHANGED
@@ -1,10 +1,21 @@
  ---
  base_model: mpasila/Viking-Magnum-v0.1-7B
+ datasets:
+ - mpasila/Magnum-V2-Mix
+ - anthracite-org/Stheno-Data-Filtered
+ - anthracite-org/kalo-opus-instruct-22k-no-refusal
+ - anthracite-org/nopm_claude_writing_fixed
  language:
  - en
+ - fi
+ - sv
+ - no
+ - da
+ - is
+ - nn
  library_name: transformers
  license: apache-2.0
- no_imatrix: "nan1"
+ no_imatrix: nan1
  quantized_by: mradermacher
  tags:
  - text-generation-inference
@@ -24,7 +35,6 @@ tags:
  static quants of https://huggingface.co/mpasila/Viking-Magnum-v0.1-7B

  <!-- provided-files -->
- weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion.
  ## Usage

  If you are unsure how to use GGUF files, refer to one of [TheBloke's
@@ -37,7 +47,21 @@ more details, including on how to concatenate multi-part files.

  | Link | Type | Size/GB | Notes |
  |:-----|:-----|--------:|:------|
+ | [GGUF](https://huggingface.co/mradermacher/Viking-Magnum-v0.1-7B-GGUF/resolve/main/Viking-Magnum-v0.1-7B.Q2_K.gguf) | Q2_K | 3.1 | |
+ | [GGUF](https://huggingface.co/mradermacher/Viking-Magnum-v0.1-7B-GGUF/resolve/main/Viking-Magnum-v0.1-7B.IQ3_XS.gguf) | IQ3_XS | 3.4 | |
+ | [GGUF](https://huggingface.co/mradermacher/Viking-Magnum-v0.1-7B-GGUF/resolve/main/Viking-Magnum-v0.1-7B.IQ3_S.gguf) | IQ3_S | 3.6 | beats Q3_K* |
+ | [GGUF](https://huggingface.co/mradermacher/Viking-Magnum-v0.1-7B-GGUF/resolve/main/Viking-Magnum-v0.1-7B.Q3_K_S.gguf) | Q3_K_S | 3.6 | |
+ | [GGUF](https://huggingface.co/mradermacher/Viking-Magnum-v0.1-7B-GGUF/resolve/main/Viking-Magnum-v0.1-7B.IQ3_M.gguf) | IQ3_M | 3.7 | |
+ | [GGUF](https://huggingface.co/mradermacher/Viking-Magnum-v0.1-7B-GGUF/resolve/main/Viking-Magnum-v0.1-7B.Q3_K_M.gguf) | Q3_K_M | 3.9 | lower quality |
+ | [GGUF](https://huggingface.co/mradermacher/Viking-Magnum-v0.1-7B-GGUF/resolve/main/Viking-Magnum-v0.1-7B.Q3_K_L.gguf) | Q3_K_L | 4.2 | |
+ | [GGUF](https://huggingface.co/mradermacher/Viking-Magnum-v0.1-7B-GGUF/resolve/main/Viking-Magnum-v0.1-7B.IQ4_XS.gguf) | IQ4_XS | 4.3 | |
+ | [GGUF](https://huggingface.co/mradermacher/Viking-Magnum-v0.1-7B-GGUF/resolve/main/Viking-Magnum-v0.1-7B.Q4_K_S.gguf) | Q4_K_S | 4.5 | fast, recommended |
+ | [GGUF](https://huggingface.co/mradermacher/Viking-Magnum-v0.1-7B-GGUF/resolve/main/Viking-Magnum-v0.1-7B.Q4_K_M.gguf) | Q4_K_M | 4.7 | fast, recommended |
+ | [GGUF](https://huggingface.co/mradermacher/Viking-Magnum-v0.1-7B-GGUF/resolve/main/Viking-Magnum-v0.1-7B.Q5_K_S.gguf) | Q5_K_S | 5.4 | |
+ | [GGUF](https://huggingface.co/mradermacher/Viking-Magnum-v0.1-7B-GGUF/resolve/main/Viking-Magnum-v0.1-7B.Q5_K_M.gguf) | Q5_K_M | 5.5 | |
+ | [GGUF](https://huggingface.co/mradermacher/Viking-Magnum-v0.1-7B-GGUF/resolve/main/Viking-Magnum-v0.1-7B.Q6_K.gguf) | Q6_K | 6.3 | very good quality |
  | [GGUF](https://huggingface.co/mradermacher/Viking-Magnum-v0.1-7B-GGUF/resolve/main/Viking-Magnum-v0.1-7B.Q8_0.gguf) | Q8_0 | 8.1 | fast, best quality |
+ | [GGUF](https://huggingface.co/mradermacher/Viking-Magnum-v0.1-7B-GGUF/resolve/main/Viking-Magnum-v0.1-7B.f16.gguf) | f16 | 15.2 | 16 bpw, overkill |

  Here is a handy graph by ikawrakow comparing some lower-quality quant
  types (lower is better):
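
As a reference for the usage section above, here is a minimal sketch of loading one of the quants from the table with llama-cpp-python. The repo id and Q4_K_M file name come from the table; the package choice, context size, and prompt are illustrative assumptions, not part of the patched README.

```python
# Minimal sketch (not from the upstream README): fetch the "fast,
# recommended" Q4_K_M quant and run a short completion with it.
# Assumes: pip install llama-cpp-python huggingface_hub
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download the GGUF file listed in the quant table above.
model_path = hf_hub_download(
    repo_id="mradermacher/Viking-Magnum-v0.1-7B-GGUF",
    filename="Viking-Magnum-v0.1-7B.Q4_K_M.gguf",
)

# Context size and prompt are illustrative choices.
llm = Llama(model_path=model_path, n_ctx=4096)
out = llm("Translate to Finnish: Good morning!", max_tokens=64)
print(out["choices"][0]["text"])
```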