---
base_model: []
library_name: transformers
tags:
- mergekit
- merge

---
# LuminumMistral-123B

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

I present Luminum.

This is a merge using Mistral Large as a base, including Lumimaid-v0.2-123B and Magnum-v2-123B.
I felt Magnum was rambling too much, and Lumimaid lost slightly too much brainpower, so I went back to the Mistral Large base, but on its own it was lacking some moisture.

On a whim, I decided to merge both Lumimaid and Magnum on top of Mistral Large, and while I wasn't expecting much, I've been very surprised by the results.

I've tested this model quite extensively at and above 32k context with great success. In theory it should allow the full 128k context, though I've only gone up to 40-50k.
It's become my new daily driver.
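
If you want to run it straight through `transformers`, here is a minimal loading sketch. The repo id below is a placeholder (substitute the actual Hugging Face id or a local path to the merged weights), and a 123B model at bf16 will need several GPUs or heavy offloading.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/LuminumMistral-123B"  # placeholder: use the real repo id or a local path

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype the merge was produced in
    device_map="auto",           # shard the weights across available GPUs
)

messages = [{"role": "user", "content": "Write a short scene set in a rainy harbor town."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
out = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(out[0][inputs.shape[-1]:], skip_special_tokens=True))
```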


I'll update the model card and add artwork tomorrow; I'm tired.


I recommend these settings:
 - Min P: 0.08
 - Rep penalty: 1.03
 - Rep penalty range: 4096
 - Smoothing factor: 0.23
 - No Repeat NGram Size: 2 *


*Since I am using TabbyAPI and exl2, DRY sampling has just become available to me. I haven't tried it yet, but DRY would probably work better than the NGram setting.
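
As a rough guide, here is how those settings might map onto a request against a local TabbyAPI-style OpenAI-compatible endpoint. The field names below are assumptions based on common sampler extensions; exact names vary by backend, so check your server's API docs.

```python
import requests

payload = {
    "prompt": "Once upon a time",
    "max_tokens": 256,
    "min_p": 0.08,                     # Min P
    "repetition_penalty": 1.03,        # Rep penalty
    "repetition_penalty_range": 4096,  # Rep penalty range (name varies by backend)
    "smoothing_factor": 0.23,          # quadratic smoothing
    "no_repeat_ngram_size": 2,         # or DRY, if your backend supports it
}
resp = requests.post("http://127.0.0.1:5000/v1/completions", json=payload, timeout=120)
print(resp.json()["choices"][0]["text"])
```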


### Merge Method

This model was merged using the della_linear merge method, with mistralai/Mistral-Large-Instruct-2407 as the base. Roughly speaking, della_linear randomly drops a fraction of each model's delta weights (governed by `density`, with `epsilon` setting the spread of drop probabilities) and linearly combines the survivors at the given weights.

### Models Merged

The following models were included in the merge:
* NeverSleep/Lumimaid-v0.2-123B
* anthracite-org/magnum-v2-123b

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: anthracite-org/magnum-v2-123b
    parameters:
      weight: 0.19   # contribution to the linear combination
      density: 0.5   # fraction of delta weights retained
  - model: NeverSleep/Lumimaid-v0.2-123B
    parameters:
      weight: 0.34
      density: 0.8
merge_method: della_linear
base_model: mistralai/Mistral-Large-Instruct-2407
parameters:
  epsilon: 0.05    # spread of drop probabilities around 1 - density
  lambda: 1        # scaling factor applied to the merged deltas
  int8_mask: true
dtype: bfloat16
```
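
To reproduce the merge, save the configuration above to a file (say, `luminum.yml`) and run it with mergekit, either via the `mergekit-yaml` CLI (`mergekit-yaml luminum.yml ./LuminumMistral-123B --cuda`) or through its Python API. The sketch below follows the Python usage shown in mergekit's documentation; the interface may shift between versions, so treat it as illustrative.

```python
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the YAML config from the section above (filename is an example).
with open("luminum.yml", "r", encoding="utf-8") as fp:
    config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    config,
    "./LuminumMistral-123B",  # output directory for the merged weights
    options=MergeOptions(
        cuda=True,            # use GPU where possible during the merge
        copy_tokenizer=True,  # copy the base model's tokenizer into the output
    ),
)
```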