![image/png](https://cdn-uploads.huggingface.co/production/uploads/65bcd419d5341c3e56189303/8bBD8aDKfcQ_B_2ePXC2b.png)
Tested to 15872 tokens!
GGUF - [nocudaexe/Neural-Dark-Waifu-V0.2](https://huggingface.co./nocudaexe/Neural-Dark-Waifu-V0.2-GGUF)
The first attempt at Neural-Dark-Waifu started showing strange behaviour at 4–8k context, so I re-merged and did more testing; this version exhibits the correct behaviour.

An attempt to merge the excellent chat ability of [mlabonne/AlphaMonarch-7B](https://huggingface.co./mlabonne/AlphaMonarch-7B) with a bunch of ERP thrown at it to bring down its guardrails, plus [Test157t/Kunocchini-7b-128k-test](https://huggingface.co./Test157t/Kunocchini-7b-128k-test) to increase context length.
---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- nocudaexe/Dark-Waifu-7b
- nocudaexe/Infinite-Waifu
---
# DarkNeural
DarkNeural is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
* [nocudaexe/Dark-Waifu-7b](https://huggingface.co./nocudaexe/Dark-Waifu-7b)
* [nocudaexe/Infinite-Waifu](https://huggingface.co./nocudaexe/Infinite-Waifu)
## 🧩 Configuration
```yaml
models:
  # No parameters necessary for base model
  - model: nocudaexe/Dark-Waifu-7b
    parameters:
      density: 0.33
      weight: 0.4
  - model: nocudaexe/Infinite-Waifu
    parameters:
      density: 0.38
      weight: 0.3
merge_method: slerp
parameters:
  int8_mask: true
dtype: bfloat16
```
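For intuition, the `slerp` merge method interpolates between the two models' weight tensors along the arc between them rather than along a straight line, which better preserves each tensor's magnitude. A minimal pure-Python sketch of the underlying operation on two toy weight vectors (not mergekit's actual implementation, which works tensor-by-tensor across the checkpoints):

```python
import math

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors (lists of floats).

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc between them.
    """
    n0 = math.sqrt(sum(x * x for x in v0))
    n1 = math.sqrt(sum(x * x for x in v1))
    # Angle between the two vectors, clamped for numerical safety
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(v0, v1)) / (n0 * n1)))
    omega = math.acos(dot)
    if abs(math.sin(omega)) < eps:
        # Nearly parallel vectors: fall back to plain linear interpolation
        return [(1 - t) * a + t * b for a, b in zip(v0, v1)]
    s0 = math.sin((1 - t) * omega) / math.sin(omega)
    s1 = math.sin(t * omega) / math.sin(omega)
    return [s0 * a + s1 * b for a, b in zip(v0, v1)]

# Halfway between two orthogonal unit "weight" vectors stays on the unit arc
mid = slerp(0.5, [1.0, 0.0], [0.0, 1.0])
```

Note that `density` and `weight` per-model parameters come from the lazymergekit template; slerp itself is driven by the interpolation factor, so consult the mergekit documentation when adapting this config.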