---
license: llama2
tags:
- merge
- mergekit
---
Another trial of merging models of different sizes. This is still under testing: it should be more stable than before, but I have no idea whether it is improving or degrading the base model.

Recipe:
```
merge_method: task_anysize
base_model: princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT
models:
  - model: KoboldAI/Mistral-7B-Erebus-v3
    parameters:
      weight: 0.5
dtype: bfloat16
```
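For reference, a minimal sketch of how a recipe like this can be run with mergekit's `mergekit-yaml` CLI. Note that `task_anysize` is not a standard upstream merge method, so this assumes a mergekit fork or branch that provides it:

```shell
# Save the recipe above to a file.
cat > recipe.yml <<'EOF'
merge_method: task_anysize
base_model: princeton-nlp/Sheared-LLaMA-2.7B-ShareGPT
models:
  - model: KoboldAI/Mistral-7B-Erebus-v3
    parameters:
      weight: 0.5
dtype: bfloat16
EOF

# Run the merge (downloads both models; needs enough disk space and RAM):
# mergekit-yaml recipe.yml ./merged-model
```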