---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- mlabonne/NeuralHermes-2.5-Mistral-7B
---

# Mistral-7B-v0.1-ties

Mistral-7B-v0.1-ties is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
* [mlabonne/NeuralHermes-2.5-Mistral-7B](https://huggingface.co./mlabonne/NeuralHermes-2.5-Mistral-7B)

## 🧩 Configuration

```yaml
models:
  - model: mistralai/Mistral-7B-v0.1
    # no parameters necessary for base model
  - model: mlabonne/NeuralHermes-2.5-Mistral-7B
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  normalize: true
dtype: float16
```
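For intuition, the `ties` method trims each fine-tune's task vector (model minus base) to the top-`density` fraction of parameters by magnitude, elects a per-parameter sign across models, and sums the agreeing components, optionally renormalizing by the total contributing weight. The following is a toy NumPy sketch of that idea only — it is not mergekit's implementation, and `ties_merge` and its argument names are hypothetical:

```python
import numpy as np

def ties_merge(base, deltas, densities, weights, normalize=True):
    """Toy TIES merge: trim -> elect sign -> merge agreeing components."""
    trimmed = []
    for d, rho in zip(deltas, densities):
        # keep only the top `rho` fraction of entries by magnitude
        k = int(np.ceil(rho * d.size))
        thresh = np.sort(np.abs(d).ravel())[-k]
        trimmed.append(np.where(np.abs(d) >= thresh, d, 0.0))
    # weight each trimmed task vector, then elect a sign per parameter
    stacked = np.stack([w * t for w, t in zip(weights, trimmed)])
    sign = np.sign(stacked.sum(axis=0))
    # drop components that disagree with the elected sign, sum the rest
    agree = np.sign(stacked) == sign
    merged = np.where(agree, stacked, 0.0).sum(axis=0)
    if normalize:
        # rescale by the total weight of the models that actually contributed
        w_col = np.array(weights).reshape(-1, *([1] * base.ndim))
        mass = np.where(agree, w_col, 0.0).sum(axis=0)
        merged = np.where(mass > 0, merged / np.maximum(mass, 1e-12), 0.0)
    return base + merged
```

With the configuration above there is a single non-base model, so sign election is trivial: `density: 0.5` keeps half of the task vector, `weight: 0.3` scales its contribution, and `normalize: true` rescales by the total agreeing weight.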