DavidAU committed on
Commit 66e8269
1 Parent(s): d15b06a

Update README.md

Files changed (1)
  1. README.md +60 -52
README.md CHANGED
@@ -1,52 +1,60 @@
- ---
- base_model: []
- library_name: transformers
- tags:
- - mergekit
- - merge
-
- ---
- # L3-Steno-Maid-Black-LARGE
-
- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
- ## Merge Details
- ### Merge Method
-
- This model was merged using the passthrough merge method.
-
- ### Models Merged
-
- The following models were included in the merge:
- * G:/7B/Llama-3-Lumimaid-8B-v0.1-OAS
- * G:/7B/L3-8B-Stheno-v3.2
- * G:/7B/Jamet-8B-L3-MK.V-Blackroot
-
- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
- ```yaml
-
- slices:
- - sources:
-   - model: G:/7B/L3-8B-Stheno-v3.2
-     layer_range: [0, 14]
- - sources:
-   - model: G:/7B/Llama-3-Lumimaid-8B-v0.1-OAS
-     layer_range: [8, 20]
- - sources:
-   - model: G:/7B/Jamet-8B-L3-MK.V-Blackroot
-     layer_range: [12, 24]
- - sources:
-   - model: G:/7B/L3-8B-Stheno-v3.2
-     layer_range: [14, 28]
- - sources:
-   - model: G:/7B/Llama-3-Lumimaid-8B-v0.1-OAS
-     layer_range: [20, 31]
- - sources:
-   - model: G:/7B/Jamet-8B-L3-MK.V-Blackroot
-     layer_range: [24, 32]
- merge_method: passthrough
- dtype: float16
- ```
+ ---
+ base_model: []
+ library_name: transformers
+ tags:
+ - mergekit
+ - merge
+
+ ---
+ # Grand Horror 16B
+
+ For GGUFs and full details about this model, please go to:
+
+ <https://huggingface.co/DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B-GGUF>
+
+ and for IMATRIX GGUFs:
+
+ <https://huggingface.co/DavidAU/L3-Stheno-Maid-Blackroot-Grand-HORROR-16B-Ultra-NEO-V2-IMATRIX-GGUF>
+
+ This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+
+ ## Merge Details
+ ### Merge Method
+
+ This model was merged using the passthrough merge method.
+
+ ### Models Merged
+
+ The following models were included in the merge:
+ * G:/7B/Llama-3-Lumimaid-8B-v0.1-OAS
+ * G:/7B/L3-8B-Stheno-v3.2
+ * G:/7B/Jamet-8B-L3-MK.V-Blackroot
+
+ ### Configuration
+
+ The following YAML configuration was used to produce this model:
+
+ ```yaml
+
+ slices:
+ - sources:
+   - model: G:/7B/L3-8B-Stheno-v3.2
+     layer_range: [0, 14]
+ - sources:
+   - model: G:/7B/Llama-3-Lumimaid-8B-v0.1-OAS
+     layer_range: [8, 20]
+ - sources:
+   - model: G:/7B/Jamet-8B-L3-MK.V-Blackroot
+     layer_range: [12, 24]
+ - sources:
+   - model: G:/7B/L3-8B-Stheno-v3.2
+     layer_range: [14, 28]
+ - sources:
+   - model: G:/7B/Llama-3-Lumimaid-8B-v0.1-OAS
+     layer_range: [20, 31]
+ - sources:
+   - model: G:/7B/Jamet-8B-L3-MK.V-Blackroot
+     layer_range: [24, 32]
+ merge_method: passthrough
+ dtype: float16
+ ```
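A quick sanity check on the slice arithmetic (an illustrative sketch, not part of the model card): a passthrough merge does no weight averaging, it simply stacks the listed layer slices in order, so the merged network's depth is the sum of the slice widths. The slice boundaries below are copied from the YAML configuration above; against the 32 layers of a single Llama-3-8B model, the six slices stack to 71 layers, which is roughly where the "16B" parameter count comes from.

```python
# Slice boundaries copied from the mergekit YAML above.
# Each entry is (model, start_layer, end_layer), end exclusive.
slices = [
    ("L3-8B-Stheno-v3.2",             0, 14),
    ("Llama-3-Lumimaid-8B-v0.1-OAS",  8, 20),
    ("Jamet-8B-L3-MK.V-Blackroot",   12, 24),
    ("L3-8B-Stheno-v3.2",            14, 28),
    ("Llama-3-Lumimaid-8B-v0.1-OAS", 20, 31),
    ("Jamet-8B-L3-MK.V-Blackroot",   24, 32),
]

# Passthrough stacking: merged depth is just the sum of slice widths.
total = sum(end - start for _, start, end in slices)
print(total)  # 71 layers, vs. 32 layers in a single Llama-3-8B model
```

Note how consecutive slices overlap (e.g. layers 8-13 of Stheno are followed by layers 8-19 of Lumimaid), so several layer positions appear more than once in the stacked model; this repetition, not any new training, is what grows the parameter count.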