---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- EmbeddedLLM/Mistral-7B-Merge-14-v0.1
- llm-blender/PairRM
---
# merge my models
**merge my models** is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
* [EmbeddedLLM/Mistral-7B-Merge-14-v0.1](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.1)
* [llm-blender/PairRM](https://huggingface.co/llm-blender/PairRM)
## 🧩 Configuration
```yaml
slices:
  - sources:
      - model: EmbeddedLLM/Mistral-7B-Merge-14-v0.1
        layer_range: [0, 12]
      - model: llm-blender/PairRM
        layer_range: [0, 12]
merge_method: slerp
base_model: allenai/longformer-base-4096
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
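
## 💻 Usage

A minimal inference sketch, assuming the merged checkpoint is published as a standard causal language model; the repository id below is a placeholder, not the actual Hub path of this merge:

```python
# pip install -qU transformers accelerate

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repository id: replace with the Hub path where this merge is hosted.
model_id = "your-username/merge-my-models"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config above
    device_map="auto",
)

prompt = "What is a large language model?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```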