---
base_model:
- 152334H/miqu-1-70b-sf
license: unknown
language:
- en
pipeline_tag: text-generation
tags:
- merge
- frankenmerge
- 95b
---

# BigWeave v27 95b

<img src="https://cdn-uploads.huggingface.co/production/uploads/65a6db055c58475cf9e6def1/4CbbAN-X7ZWj702JrcCGH.png" width=600>

The BigWeave models aim to experimentally identify merge settings for increasing model performance. The version number merely tracks various attempts and is not a quality indicator. Only results demonstrating good performance are retained and shared.

# Prompting Format

ChatML, Mistral, and Vicuna prompt formats all work.
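For reference, minimal prompt builders for the three formats are sketched below. They follow the common ChatML, Mistral, and Vicuna conventions; the exact system-prompt handling (and the function names) are illustrative assumptions, not part of this model card.

```python
# Minimal prompt builders for the three supported formats.
# These follow common conventions; system-prompt placement is an assumption.

def chatml_prompt(system: str, user: str) -> str:
    # ChatML wraps each turn in <|im_start|>role ... <|im_end|> markers.
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

def mistral_prompt(user: str) -> str:
    # Mistral-style instruction wrapping.
    return f"[INST] {user} [/INST]"

def vicuna_prompt(system: str, user: str) -> str:
    # Vicuna-style USER/ASSISTANT turns.
    return f"{system}\n\nUSER: {user}\nASSISTANT:"

print(mistral_prompt("Hello"))  # [INST] Hello [/INST]
```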
|
|
# Merge Process

This is a self-merge of 152334H/miqu-1-70b-sf. The 30 most important layers (according to exl2 measurements) are duplicated with 50% overlap.
|
|
Merge configuration:

```
slices:
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [0,40]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [34,45] # dup 34-44
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [40,52]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [51,53] # dup 51-52
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [52,55]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [54,56] # dup 54-55
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [55,59]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [58,60] # dup 58-59
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [59,72]
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [64,79] # dup 64-78
  - sources:
      - model: 152334H/miqu-1-70b-sf
        layer_range: [72,80]
merge_method: passthrough
dtype: float16
```
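The `layer_range` entries are half-open (the end index is exclusive, as the `# dup` comments in the config show), so the resulting depth can be tallied directly from the slices; a quick sketch:

```python
# Tally the depth of the merged model from the slice ranges in the config.
# layer_range is half-open: [start, end) selects layers start..end-1.
slices = [
    (0, 40), (34, 45), (40, 52), (51, 53), (52, 55), (54, 56),
    (55, 59), (58, 60), (59, 72), (64, 79), (72, 80),
]

total = sum(end - start for start, end in slices)
duplicated = total - 80  # the base miqu-1-70b-sf model has 80 layers

print(f"{total} layers in the merge, {duplicated} of them duplicates")
# 112 layers in the merge, 32 of them duplicates
```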