---
base_model:
- jsfs11/MixtureofMerges-MoE-2x7b-v6
- yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
library_name: transformers
tags:
- mergekit
- merge
---
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the della merge method, with [yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B](https://huggingface.co./yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B) as the base model.
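The `density`, `epsilon`, and `lambda` parameters in the configuration below control DELLA's magnitude-based stochastic pruning of delta weights: low-magnitude deltas are dropped with higher probability, drop probabilities are spread across `(1 - density) ± epsilon`, and survivors are rescaled so the expected value is preserved. A minimal illustrative sketch in plain Python (this is a simplification for intuition, not mergekit's actual implementation; the linear probability schedule is an assumption):

```python
import random

def magprune(delta, density=0.6, epsilon=0.2, seed=0):
    """Illustrative DELLA-style pruning: rank delta parameters by
    magnitude, assign higher drop probability to smaller magnitudes,
    and rescale survivors by 1 / (1 - p) to keep the expectation."""
    rng = random.Random(seed)
    n = len(delta)
    # Rank 0 = smallest magnitude -> highest drop probability.
    order = sorted(range(n), key=lambda i: abs(delta[i]))
    base = 1.0 - density  # average drop probability
    probs = [0.0] * n
    for rank, i in enumerate(order):
        # Spread probabilities linearly over [base - eps, base + eps].
        frac = rank / (n - 1) if n > 1 else 0.5
        probs[i] = base + epsilon * (1.0 - 2.0 * frac)
    out = []
    for x, p in zip(delta, probs):
        if rng.random() < p:
            out.append(0.0)            # dropped
        else:
            out.append(x / (1.0 - p))  # rescaled survivor
    return out

deltas = [0.05, -0.8, 0.3, -0.02, 1.2, 0.4]
pruned = magprune(deltas)
```

Note that with `density=0.6` and `epsilon=0.2`, drop probabilities range from 0.2 (largest deltas) to 0.6 (smallest), so on average roughly 60% of delta parameters survive.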
### Models Merged
The following models were included in the merge:
* [jsfs11/MixtureofMerges-MoE-2x7b-v6](https://huggingface.co./jsfs11/MixtureofMerges-MoE-2x7b-v6)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
    parameters:
      weight: 1.0
  - model: jsfs11/MixtureofMerges-MoE-2x7b-v6
    parameters:
      weight: 1.0
merge_method: della
base_model: yunconglong/Truthful_DPO_TomGrc_FusionNet_7Bx2_MoE_13B
parameters:
  density: 0.6
  epsilon: 0.2
  lambda: 1.0
dtype: bfloat16
```
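To reproduce a merge like this, the configuration is passed to mergekit's CLI; a sketch assuming mergekit is installed and the YAML above is saved as `config.yaml` (the output directory name is arbitrary):

```shell
pip install mergekit
mergekit-yaml config.yaml ./merged-model
```

The resulting directory can then be loaded with `transformers` like any other checkpoint.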