---
base_model:
- MaziyarPanahi/calme-3.1-instruct-78b
- prithivMLmods/Calme-Ties-78B
library_name: transformers
tags:
- mergekit
- merge
---

# **Calme-Ties2-78B**

This model is the result of merging pre-trained language models with the TIES merge method, using prithivMLmods/Calme-Ties-78B as the base model. MaziyarPanahi/calme-3.1-instruct-78b was merged in with a weight and density of 1. The merge configuration enables normalization and int8 masking and uses the bfloat16 data type.

# **Merge**

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

# **Merge Method**

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method with [prithivMLmods/Calme-Ties-78B](https://huggingface.co./prithivMLmods/Calme-Ties-78B) as the base.

# **Models Merged**

The following model was included in the merge:

* [MaziyarPanahi/calme-3.1-instruct-78b](https://huggingface.co./MaziyarPanahi/calme-3.1-instruct-78b)

# **Configuration**

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: MaziyarPanahi/calme-3.1-instruct-78b
    parameters:
      weight: 1
      density: 1
merge_method: ties
base_model: prithivMLmods/Calme-Ties-78B
parameters:
  weight: 1
  density: 1
normalize: true
int8_mask: true
dtype: bfloat16
```
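To reproduce a merge like this with mergekit, the YAML configuration above can be saved to a file and passed to mergekit's `mergekit-yaml` CLI. The file and output directory names below are illustrative, not part of the original card:

```shell
# Install mergekit, then run the merge. Assumes the YAML configuration
# shown in this card has been saved as merge-config.yaml; the output
# directory name is arbitrary. A merge of 78B models requires substantial
# disk space and RAM/VRAM.
pip install mergekit
mergekit-yaml merge-config.yaml ./Calme-Ties2-78B
```

The resulting directory contains the merged weights and tokenizer files in the standard Hugging Face layout, loadable with `transformers`.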