---
base_model:
- Nohobby/YetAnotherMerge-v0.7a
- Nohobby/Carasique-v0.1
- Nohobby/YetAnotherMerge-v0.7b
library_name: transformers
tags:
- mergekit
- merge
---
# ThisWontWork1

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the [DELLA](https://arxiv.org/abs/2406.11617) merge method, with [Nohobby/Carasique-v0.1](https://huggingface.co./Nohobby/Carasique-v0.1) as the base model.

### Models Merged

The following models were included in the merge:

* [Nohobby/YetAnotherMerge-v0.7a](https://huggingface.co./Nohobby/YetAnotherMerge-v0.7a)
* [Nohobby/YetAnotherMerge-v0.7b](https://huggingface.co./Nohobby/YetAnotherMerge-v0.7b)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: Nohobby/Carasique-v0.1
parameters:
  int8_mask: true
  rescale: true
  normalize: false
merge_method: della
dtype: bfloat16
models:
  - model: Nohobby/YetAnotherMerge-v0.7a
    parameters:
      density: [0.4, 0.5, 0.6, 0.4, 0.6, 0.5, 0.4]
      epsilon: [0.15, 0.15, 0.25, 0.15, 0.15]
      lambda: 0.85
      weight: [0.6, 0.5, 0.4, 0.6, 0.4, 0.5, 0.6]
  - model: Nohobby/YetAnotherMerge-v0.7b
    parameters:
      density: [0.45, 0.55, 0.45, 0.55, 0.45]
      epsilon: [0.1, 0.1, 0.25, 0.1, 0.1]
      lambda: 0.85
      weight: [0.55, 0.45, 0.55, 0.45, 0.55]
```
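
To reproduce the merge, save the configuration above as `config.yml` and run it through mergekit. Below is a minimal sketch using mergekit's Python API as shown in the mergekit README; the output path is a placeholder, and the exact `MergeOptions` fields can differ between mergekit versions:

```python
# Minimal sketch: reproduce the merge from the YAML above via mergekit's
# Python API. Paths are placeholders; option fields may vary by version.
import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./ThisWontWork1",          # placeholder output directory
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # run the merge on GPU when available
        copy_tokenizer=True,             # copy the base model's tokenizer
        lazy_unpickle=False,             # experimental low-memory loader
        low_cpu_memory=False,            # enable only if VRAM exceeds RAM
    ),
)
```

The `mergekit-yaml config.yml ./ThisWontWork1 --cuda` CLI entry point is the equivalent one-liner.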
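
### Usage

Since the card declares `library_name: transformers`, the merged weights should load like any other causal language model. A minimal sketch follows; the repo id `Nohobby/ThisWontWork1` is an assumption inferred from the card title, not confirmed by the card itself:

```python
# Minimal sketch: load the merged model with transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Nohobby/ThisWontWork1"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the merge's dtype
    device_map="auto",           # requires the accelerate package
)

prompt = "Once upon a time"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```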