---
base_model:
- zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B
- allknowingroger/GemmaSlerp5-10B
library_name: transformers
tags:
- mergekit
- merge
---
# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details

### Merge Method

This model was merged using the SLERP (spherical linear interpolation) merge method, with [allknowingroger/GemmaSlerp5-10B](https://huggingface.co./allknowingroger/GemmaSlerp5-10B) as the base model.
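SLERP interpolates along the arc between two parameter vectors rather than along the straight line used by plain weight averaging, which better preserves the magnitude of the weights. As an illustrative sketch only (not mergekit's actual implementation, which applies this per tensor pair across the checkpoints), SLERP of two weight tensors looks roughly like this:

```python
import torch

def slerp(t: float, v0: torch.Tensor, v1: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """sin((1-t)*theta)/sin(theta) * v0 + sin(t*theta)/sin(theta) * v1."""
    v0_f, v1_f = v0.flatten().float(), v1.flatten().float()
    # Angle between the two parameter vectors, from their normalized dot product.
    dot = torch.clamp(
        (v0_f / (v0_f.norm() + eps)) @ (v1_f / (v1_f.norm() + eps)), -1.0, 1.0
    )
    theta = torch.acos(dot)
    if theta.abs() < eps:
        # Nearly parallel vectors: fall back to ordinary linear interpolation.
        return (1 - t) * v0 + t * v1
    w0 = torch.sin((1 - t) * theta) / torch.sin(theta)
    w1 = torch.sin(t * theta) / torch.sin(theta)
    return (w0 * v0_f + w1 * v1_f).reshape(v0.shape).to(v0.dtype)
```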
### Models Merged

The following models were included in the merge:

* [zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B](https://huggingface.co./zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B)
* [allknowingroger/GemmaSlerp5-10B](https://huggingface.co./allknowingroger/GemmaSlerp5-10B)
### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B
  - model: allknowingroger/GemmaSlerp5-10B
merge_method: slerp
base_model: allknowingroger/GemmaSlerp5-10B
dtype: float32
out_dtype: bfloat16
parameters:
  t: [0.41, 0.39, 0.37, 0.34, 0.3, 0.26, 0.43, 0.63, 0.66, 0.69, 0.72, 0.77, 0.84, 0.91, 1]
```
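In mergekit's convention, a list of `t` values forms a gradient interpolated across the layer stack: values near 0 keep a layer close to the base model, values near 1 draw it from the other model, so here the early layers are a near-even blend while the final layers come almost entirely from `zelk12/MT-Merge2-MU-gemma-2-MTg2MT1g2-9B`. The merge can be reproduced by saving this configuration as `config.yaml` and running mergekit's `mergekit-yaml config.yaml ./output-directory`. The resulting checkpoint loads like any other Gemma 2 model with `transformers`; the sketch below uses a placeholder repo id, assuming the merged weights have been pushed to the Hub:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id: substitute the actual location of the merged weights.
repo_id = "your-username/your-merged-model"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches out_dtype in the merge config
    device_map="auto",
)

prompt = "Briefly explain what a model merge is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```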