---
base_model:
- migtissera/Tess-v2.5.2-Qwen2-72B
- Nexusflow/Athene-V2-Chat
library_name: transformers
tags:
- mergekit
- merge
---
# Atess
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the linear [DELLA](https://arxiv.org/abs/2406.11617) merge method (`della_linear`), with [migtissera/Tess-v2.5.2-Qwen2-72B](https://huggingface.co./migtissera/Tess-v2.5.2-Qwen2-72B) as the base.
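For intuition, here is a minimal sketch of what a linear DELLA step does to a single weight tensor: it takes the fine-tune's task vector (its delta from the base), stochastically drops entries so that roughly `density` of them survive (with higher-magnitude entries favored), rescales the survivors to preserve the expected delta, and adds the weighted result back onto the base. This is an illustration, not mergekit's code; the helper name, `epsilon`, and the exact rank-to-probability mapping are assumptions.

```python
import torch

def della_linear_step(base: torch.Tensor, tuned: torch.Tensor,
                      weight: float, density: float,
                      epsilon: float = 0.05) -> torch.Tensor:
    """Toy sketch of one linear-DELLA merge step for a single tensor.

    Not mergekit's implementation; `epsilon` and the rank-to-probability
    mapping below are illustrative assumptions.
    """
    delta = tuned - base  # task vector of the fine-tune
    # Rank delta entries by magnitude (0 = smallest, n-1 = largest).
    ranks = delta.abs().flatten().argsort().argsort().float()
    n = ranks.numel()
    # Keep probabilities centered on `density`, nudged upward for
    # high-magnitude entries (DELLA's magnitude-based sampling idea).
    keep_prob = density + epsilon * (2.0 * ranks / max(n - 1, 1) - 1.0)
    keep_prob = keep_prob.clamp(1e-8, 1.0).reshape(delta.shape)
    mask = torch.bernoulli(keep_prob)
    # Rescale survivors by 1/p so the expected delta is preserved.
    pruned_delta = delta * mask / keep_prob
    return base + weight * pruned_delta
```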
### Models Merged
The following models were included in the merge:
* [Nexusflow/Athene-V2-Chat](https://huggingface.co./Nexusflow/Athene-V2-Chat)
### Configuration
The following YAML configuration was used to produce this model. Bracketed `weight` values are mergekit gradients: the listed anchor values are interpolated linearly across the model's layers, so each projection type gets a layer-dependent blend of the two parents.
```yaml
dtype: bfloat16
tokenizer_source: base
merge_method: della_linear
parameters:
  density: 0.5
base_model: migtissera/Tess-v2.5.2-Qwen2-72B
models:
  - model: Nexusflow/Athene-V2-Chat
    parameters:
      weight:
        - filter: v_proj
          value: [0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0]
        - filter: o_proj
          value: [1, 0, 1, 0, 0, 0, 0, 0, 1, 1, 1]
        - filter: up_proj
          value: [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
        - filter: gate_proj
          value: [0, 0, 1, 1, 1, 1, 1, 1, 1, 0, 0]
        - filter: down_proj
          value: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
        - value: 0
  - model: migtissera/Tess-v2.5.2-Qwen2-72B
    parameters:
      weight:
        - filter: v_proj
          value: [1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1]
        - filter: o_proj
          value: [0, 1, 0, 1, 1, 1, 1, 1, 0, 0, 0]
        - filter: up_proj
          value: [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
        - filter: gate_proj
          value: [1, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1]
        - filter: down_proj
          value: [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
        - value: 1
```
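The merged model can be loaded with a standard `transformers` snippet. The repo id below is assumed from this card's location; substitute your own path if you reproduce the merge locally.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id for this merge; adjust if loading a local merge output.
model_id = "Nohobby/Q2.5-Atess-72B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",           # shard the 72B model across available GPUs
)

messages = [{"role": "user", "content": "Give one sentence on model merging."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```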