---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- allenai/longformer-base-4096
- gpt2
---

# merge my models

merge my models is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
* [allenai/longformer-base-4096](https://huggingface.co/allenai/longformer-base-4096)
* [gpt2](https://huggingface.co/gpt2)

## 🧩 Configuration

```yaml
slices:
  - sources:
      - model: allenai/longformer-base-4096
        layer_range: [0, 12]
      - model: gpt2
        layer_range: [0, 12]
merge_method: slerp
base_model: allenai/longformer-base-4096
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
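The `merge_method: slerp` setting above interpolates each pair of weight tensors along the great circle between them, with the `t` values controlling how far the blend leans toward the second model at each layer. The following is a minimal NumPy sketch of spherical linear interpolation for intuition only; it is not mergekit's actual implementation, and the function name `slerp` and vectors `a`/`b` are illustrative.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two flattened weight tensors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two directions rather than a straight line.
    """
    v0n = v0 / (np.linalg.norm(v0) + eps)
    v1n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation.
        return (1 - t) * v0 + t * v1
    s0 = np.sin((1 - t) * theta) / np.sin(theta)
    s1 = np.sin(t * theta) / np.sin(theta)
    return s0 * v0 + s1 * v1

a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)  # halfway along the arc between a and b
```

In the configuration, separate `t` schedules are given for attention (`self_attn`) and feed-forward (`mlp`) weights, with `value: 0.5` as the default for everything else.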