---
tags:
- merge
---

# GodSlayer-12B-ABYSS

*Will update this soon. It's a two-part DELLA-linear merge of some interesting models, then NuSLERP'd together.*

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

### Merge Method

This model was merged using the NuSLERP merge method using [IntervitensInc/Mistral-Nemo-Base-2407-chatml](https://huggingface.co/IntervitensInc/Mistral-Nemo-Base-2407-chatml) as a base.

### Models Merged

The following models were included in the merge:

* p1
* p2

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: p1
    parameters:
      weight: 0.5
  - model: p2
    parameters:
      weight: 0.5
base_model: IntervitensInc/Mistral-Nemo-Base-2407-chatml
```
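As background on the merge method used above: NuSLERP builds on spherical linear interpolation (SLERP), which blends two weight tensors along the arc between them on a hypersphere rather than along a straight line. The sketch below shows plain SLERP for two Python lists; it is an illustration of the underlying idea only, not mergekit's actual implementation, and the vectors are hypothetical.

```python
import math

def slerp(t, v0, v1):
    """Spherical linear interpolation between vectors v0 and v1 at fraction t."""
    dot = sum(a * b for a, b in zip(v0, v1))
    norm0 = math.sqrt(sum(a * a for a in v0))
    norm1 = math.sqrt(sum(b * b for b in v1))
    # Cosine of the angle between the vectors, clamped for numerical safety.
    cos_omega = max(-1.0, min(1.0, dot / (norm0 * norm1)))
    omega = math.acos(cos_omega)
    if math.sin(omega) < 1e-8:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    # Interpolate along the arc between the two vectors.
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Equal weights (0.5 / 0.5), as in the config above, correspond to t = 0.5:
# the midpoint of the arc between the two (unit) vectors.
mid = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
```

With orthogonal unit vectors, the midpoint lands at 45 degrees between them, which is why SLERP-style merges preserve the magnitude of weight directions better than a straight average would.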