---
base_model:
- KnutJaegersberg/Mistral-7B-EssayWriter
- luozhuanggary/GOAT-v0.2-Mistral-7B-Claude
- jdqwoi/TooManyMixRolePlay-7B-Story_V3.5
- Norquinal/Mistral-7B-storywriter
- ajibawa-2023/Young-Children-Storyteller-Mistral-7B
- scribis/Fantastica-7b-Instruct-0.2-Italian_merged
- MaziyarPanahi/Mistral-7B-claude-instruct-Mistral-7B-Instruct-v0.2-slerp
- kasper52786/StoryWeaver-7b-Instruct-v0.1
- ajibawa-2023/General-Stories-Mistral-7B
- tdh87/StoryTeller7b-meh
library_name: transformers
tags:
- mergekit
- merge
---

# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DARE](https://arxiv.org/abs/2311.03099) [TIES](https://arxiv.org/abs/2306.01708) merge method, with [Norquinal/Mistral-7B-storywriter](https://huggingface.co./Norquinal/Mistral-7B-storywriter) as the base model.

### Models Merged

The following models were included in the merge:

* [KnutJaegersberg/Mistral-7B-EssayWriter](https://huggingface.co./KnutJaegersberg/Mistral-7B-EssayWriter)
* [luozhuanggary/GOAT-v0.2-Mistral-7B-Claude](https://huggingface.co./luozhuanggary/GOAT-v0.2-Mistral-7B-Claude)
* [jdqwoi/TooManyMixRolePlay-7B-Story_V3.5](https://huggingface.co./jdqwoi/TooManyMixRolePlay-7B-Story_V3.5)
* [ajibawa-2023/Young-Children-Storyteller-Mistral-7B](https://huggingface.co./ajibawa-2023/Young-Children-Storyteller-Mistral-7B)
* [scribis/Fantastica-7b-Instruct-0.2-Italian_merged](https://huggingface.co./scribis/Fantastica-7b-Instruct-0.2-Italian_merged)
* [MaziyarPanahi/Mistral-7B-claude-instruct-Mistral-7B-Instruct-v0.2-slerp](https://huggingface.co./MaziyarPanahi/Mistral-7B-claude-instruct-Mistral-7B-Instruct-v0.2-slerp)
* [kasper52786/StoryWeaver-7b-Instruct-v0.1](https://huggingface.co./kasper52786/StoryWeaver-7b-Instruct-v0.1)
* [ajibawa-2023/General-Stories-Mistral-7B](https://huggingface.co./ajibawa-2023/General-Stories-Mistral-7B)
* [tdh87/StoryTeller7b-meh](https://huggingface.co./tdh87/StoryTeller7b-meh)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: ajibawa-2023/General-Stories-Mistral-7B
    parameters:
      weight: 0.1
      density: 0.9
  - model: ajibawa-2023/Young-Children-Storyteller-Mistral-7B
    parameters:
      weight: 0.1
      density: 0.9
  - model: scribis/Fantastica-7b-Instruct-0.2-Italian_merged
    parameters:
      weight: 0.1
      density: 0.9
  - model: KnutJaegersberg/Mistral-7B-EssayWriter
    parameters:
      weight: 0.1
      density: 0.9
  - model: luozhuanggary/GOAT-v0.2-Mistral-7B-Claude
    parameters:
      weight: 0.1
      density: 0.9
  - model: Norquinal/Mistral-7B-storywriter
    parameters:
      weight: 0.1
      density: 0.9
  - model: tdh87/StoryTeller7b-meh
    parameters:
      weight: 0.1
      density: 0.9
  - model: kasper52786/StoryWeaver-7b-Instruct-v0.1
    parameters:
      weight: 0.1
      density: 0.9
  - model: jdqwoi/TooManyMixRolePlay-7B-Story_V3.5
    parameters:
      weight: 0.1
      density: 0.9
  - model: MaziyarPanahi/Mistral-7B-claude-instruct-Mistral-7B-Instruct-v0.2-slerp
    parameters:
      weight: 0.1
      density: 0.9
merge_method: dare_ties
base_model: Norquinal/Mistral-7B-storywriter
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
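To illustrate what the `dare_ties` method in the configuration above does conceptually, here is a toy NumPy sketch on flat parameter vectors. This is *not* mergekit's actual implementation: the `dare_ties` function name, the per-parameter sign election, and the `normalize` handling are simplified assumptions for illustration. Each model's task vector (its delta from the base) is randomly sparsified keeping each entry with probability `density` and rescaling survivors (DARE), then entries disagreeing with the majority sign are dropped before the weighted sum (TIES):

```python
import numpy as np

rng = np.random.default_rng(0)

def dare_ties(base, task_models, weights, density):
    """Toy sketch of DARE-TIES merging on 1-D parameter vectors.

    Simplified illustration only; mergekit's real implementation differs
    in details (tensor-wise processing, int8 masking, exact normalization).
    """
    deltas = []
    for params, w in zip(task_models, weights):
        delta = params - base                        # task vector
        keep = rng.random(delta.shape) < density     # DARE: keep with prob = density
        deltas.append(w * np.where(keep, delta, 0.0) / density)  # rescale survivors
    deltas = np.stack(deltas)

    # TIES: elect a per-parameter sign from the summed contributions,
    # then discard contributions that disagree with the elected sign.
    elected = np.sign(deltas.sum(axis=0))
    agree = np.sign(deltas) == elected
    merged_delta = np.where(agree, deltas, 0.0).sum(axis=0)

    # Rough analogue of `normalize: true`: divide by the total weight
    # of the contributions that survived the sign election.
    denom = np.where(agree, np.asarray(weights)[:, None], 0.0).sum(axis=0)
    return base + merged_delta / np.maximum(denom, 1e-8)
```

With all ten models at `weight: 0.1` and `density: 0.9`, as in this config, each model contributes equally and only ~10% of each task vector is dropped before the sign-consensus step.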