---
license: llama2
tags:
- merge
- mergekit
---
Another trial of merging models with different sizes. Still under testing; it should be more stable, but I have no idea whether it improves or degrades the base model.
Recipe:
```yaml
merge_method: task_anysize
base_model: princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT
models:
- model: KoboldAI/Mistral-7B-Erebus-v3
parameters:
weight: 0.5
dtype: bfloat16
```
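As an illustrative sketch only (not mergekit's actual implementation, and `task_anysize` is a non-standard method), a linear merge with `weight: 0.5` amounts to interpolating element-wise between base and donor parameters, with some policy for tensors whose shapes don't match across the two differently sized models:

```python
def weighted_merge(base, donor, weight=0.5):
    """Toy sketch of a weighted parameter merge: interpolate
    element-wise between a base tensor and a donor tensor.
    Real merge tooling operates on full model checkpoints."""
    if len(base) != len(donor):
        # Hypothetical fallback: keep the base model's tensor when
        # shapes differ (one plausible way a size-agnostic merge
        # could bridge a 2.7B and a 7B model).
        return list(base)
    return [(1 - weight) * b + weight * d for b, d in zip(base, donor)]

# Matching shapes: simple linear interpolation
print(weighted_merge([0.0, 2.0], [2.0, 4.0]))  # [1.0, 3.0]
# Mismatched shapes: base tensor kept unchanged
print(weighted_merge([1.0], [2.0, 3.0]))  # [1.0]
```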