---
license: other
language:
- en
base_model:
- Nitral-AI/Captain-Eris_Twilight-V0.420-12B
---
# Mistralified: Barycentric-based embedding swap applied with token surgery + config change.

Uses Captain_BMO as the donor model ~ [no additional training]

![image/png](https://cdn-uploads.huggingface.co/production/uploads/642265bc01c62c1e4102dc36/VSQo5S7UXnjACTEtLaqf1.png)

![image/png](https://cdn-uploads.huggingface.co/production/uploads/642265bc01c62c1e4102dc36/zWhcW28wDidLsJYPDmVQD.png)

### The following models were included in the merge:
* [Nitral-AI/Captain_BMO-12B-ChatMLified](https://huggingface.co./Nitral-AI/Captain_BMO-12B-ChatMLified)
* [Epiculous/Violet_Twilight-v0.2](https://huggingface.co./Epiculous/Violet_Twilight-v0.2)

### The following YAML configuration was used to produce this model:
```yaml
slices:
  - sources:
      - model: Epiculous/Violet_Twilight-v0.2
        layer_range: [0, 40]
      - model: Nitral-AI/Captain_BMO-12B-ChatMLified
        layer_range: [0, 40]
merge_method: slerp
base_model: Epiculous/Violet_Twilight-v0.2
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.420
dtype: bfloat16
```
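A config like the one above is typically run with mergekit (e.g. `mergekit-yaml config.yaml ./output-model-directory`), and the embedding swap is then applied to the merged weights.

The swap itself isn't spelled out in this card, so below is a minimal sketch of a common barycentric tokenizer transplant: tokens shared by both vocabularies keep their embedding rows, and tokens unique to the donor vocabulary are initialised at the barycentre (mean) of the embeddings of the pieces the original tokenizer splits them into. The model paths are hypothetical placeholders, and the procedure is an illustration under those assumptions, not the exact recipe used for this release.

```python
# Hedged sketch of a barycentric embedding swap ("token surgery").
# "path/to/merged-model" and "path/to/donor-model" are hypothetical placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

recipient = AutoModelForCausalLM.from_pretrained("path/to/merged-model", torch_dtype=torch.bfloat16)
old_tok = AutoTokenizer.from_pretrained("path/to/merged-model")  # vocabulary the weights were trained with
new_tok = AutoTokenizer.from_pretrained("path/to/donor-model")   # vocabulary/format being transplanted in

old_emb = recipient.get_input_embeddings().weight.data
new_emb = torch.empty(len(new_tok), old_emb.shape[1], dtype=old_emb.dtype)

old_vocab = old_tok.get_vocab()
for token, new_id in new_tok.get_vocab().items():
    if token in old_vocab:
        # Token exists in both vocabularies: keep its embedding row as-is.
        new_emb[new_id] = old_emb[old_vocab[token]]
    else:
        # New token: initialise at the barycentre (mean) of the embeddings of
        # the pieces the old tokenizer would split its surface form into.
        surface = new_tok.convert_tokens_to_string([token])
        piece_ids = old_tok.encode(surface, add_special_tokens=False)
        new_emb[new_id] = old_emb[piece_ids].mean(dim=0) if piece_ids else old_emb.mean(dim=0)

# Swap the rebuilt embedding matrix into the model; an untied lm_head would be
# rebuilt the same way (omitted here for brevity).
recipient.resize_token_embeddings(len(new_tok))
recipient.get_input_embeddings().weight.data.copy_(new_emb)
recipient.save_pretrained("path/to/mistralified-model")
new_tok.save_pretrained("path/to/mistralified-model")
```

The accompanying "config change" would then point the tokenizer/chat-template files at the donor's prompt format so the swapped vocabulary is actually used at inference time; that reading is inferred from the card's wording rather than documented.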