|
--- |
|
base_model: |
|
- NeverSleep/Noromaid-v0.1-mixtral-8x7b-Instruct-v3 |
|
- rombodawg/Open_Gpt4_8x7B_v0.2 |
|
- mistralai/Mixtral-8x7B-Instruct-v0.1 |
|
tags: |
|
- mergekit |
|
- merge |
|
- not-for-all-audiences |
|
- nsfw |
|
license: cc-by-nc-4.0 |
|
--- |
|
|
|
<!-- description start --> |
|
## Description |
|
|
|
This repo contains fp16 files of NoromaidxOpenGPT4-1. |
|
|
|
The model was created by merging Noromaid-8x7b-Instruct with Open_Gpt4_8x7B_v0.2 in the exact same way [Rombodawg](https://huggingface.co./rombodawg) did his merge.
|
|
|
The only difference between [NoromaidxOpenGPT4-1](https://huggingface.co./NeverSleep/NoromaidxOpenGPT4-1/) and [NoromaidxOpenGPT4-2](https://huggingface.co./NeverSleep/NoromaidxOpenGPT4-2/) is that the first iteration uses Mixtral-8x7B as the base for the merge (fp16), whereas the second uses Open_Gpt4_8x7B_v0.2 as the base (bf16).
|
|
|
After further testing and usage, both models were released, because each has its own qualities.
|
|
|
You can download the imatrix file [HERE](https://huggingface.co./NeverSleep/NoromaidxOpenGPT4-1/blob/main/imatrix-1.dat) to create many other quants.
|
<!-- description end --> |
|
<!-- prompt-template start --> |
|
## Prompt template
|
|
|
### Alpaca
|
|
|
``` |
|
### Instruction: |
|
{system prompt} |
|
|
|
### Input: |
|
{prompt} |
|
|
|
### Response: |
|
{output} |
|
``` |
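
For example, the Alpaca template can be assembled with a small helper like the one below (a minimal Python sketch; the function name and example strings are illustrative and not part of this repo):

```python
def build_alpaca_prompt(system_prompt: str, prompt: str) -> str:
    """Assemble an Alpaca-style prompt from a system prompt and a user prompt."""
    return (
        "### Instruction:\n"
        f"{system_prompt}\n\n"
        "### Input:\n"
        f"{prompt}\n\n"
        "### Response:\n"
    )


print(build_alpaca_prompt(
    "You are a helpful assistant.",
    "Write a short scene set in a rainy city.",
))
```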
|
|
|
### Mistral
|
|
|
``` |
|
[INST] {prompt} [/INST] |
|
``` |
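
For the Mistral format, `tokenizer.apply_chat_template` produces the `[INST] ... [/INST]` wrapping shown above, assuming the tokenizer ships Mixtral's chat template. A minimal `transformers` sketch (generation parameters are illustrative):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NeverSleep/NoromaidxOpenGPT4-1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # fp16 weights, as shipped in this repo
    device_map="auto",
)

messages = [{"role": "user", "content": "Write a short scene set in a rainy city."}]
input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

output = model.generate(input_ids, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```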
|
|
|
## Merge Details |
|
### Merge Method |
|
|
|
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [mistralai/Mixtral-8x7B-Instruct-v0.1](https://huggingface.co./mistralai/Mixtral-8x7B-Instruct-v0.1) as the base.
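
For intuition, TIES operates on task vectors (each fine-tuned model's weights minus the base weights): it trims each task vector to its largest-magnitude entries, elects a sign per parameter, and averages only the trimmed deltas that agree with the elected sign. The toy NumPy sketch below illustrates that idea on flat vectors; it is not the mergekit implementation, and the example numbers are made up.

```python
import numpy as np

def ties_merge(base, finetuned, weights, density=0.5):
    """Toy TIES merge over flat parameter vectors (illustrative only)."""
    deltas = []
    for ft, w in zip(finetuned, weights):
        delta = ft - base
        # Trim: keep only the top `density` fraction of entries by magnitude.
        k = int(np.ceil(density * delta.size))
        threshold = np.sort(np.abs(delta))[-k]
        delta = np.where(np.abs(delta) >= threshold, delta, 0.0)
        deltas.append(w * delta)

    stacked = np.stack(deltas)
    # Elect a sign per parameter from the weighted, trimmed deltas.
    elected_sign = np.sign(stacked.sum(axis=0))
    # Keep only deltas that agree with the elected sign, then average them.
    agree = np.sign(stacked) == elected_sign
    kept = np.where(agree, stacked, 0.0)
    counts = np.maximum(agree.sum(axis=0), 1)
    return base + kept.sum(axis=0) / counts

base = np.zeros(4)
model_a = np.array([0.9, -0.1, 0.4, 0.0])
model_b = np.array([0.8, 0.2, -0.5, 0.1])
print(ties_merge(base, [model_a, model_b], weights=[1.0, 0.7]))
```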
|
|
|
### Models Merged |
|
|
|
The following models were included in the merge: |
|
* [NeverSleep/Noromaid-v0.1-mixtral-8x7b-Instruct-v3](https://huggingface.co./NeverSleep/Noromaid-v0.1-mixtral-8x7b-Instruct-v3) |
|
* [rombodawg/Open_Gpt4_8x7B_v0.2](https://huggingface.co./rombodawg/Open_Gpt4_8x7B_v0.2) |
|
|
|
### Configuration |
|
|
|
The following YAML configuration was used to produce this model: |
|
|
|
```yaml |
|
models: |
|
- model: rombodawg/Open_Gpt4_8x7B_v0.2 |
|
parameters: |
|
density: .5 |
|
weight: 1 |
|
- model: NeverSleep/Noromaid-v0.1-mixtral-8x7b-Instruct-v3 |
|
parameters: |
|
density: .5 |
|
weight: .7 |
|
merge_method: ties |
|
base_model: mistralai/Mixtral-8x7B-Instruct-v0.1 |
|
parameters: |
|
normalize: true |
|
int8_mask: true |
|
dtype: float16 |
|
``` |
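
To reproduce the merge, the YAML above can be saved to disk (e.g. as `config.yml`, an assumed filename) and fed to mergekit. The sketch below follows mergekit's documented Python entry points; exact class and option names may differ between mergekit versions, so treat it as a starting point and check the mergekit README. mergekit also ships a `mergekit-yaml` command-line entry point that takes the config path and an output directory.

```python
# pip install mergekit  (the API below may differ between mergekit versions)
import torch
import yaml
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# config.yml is the YAML block shown above, saved to disk (assumed path).
with open("config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./NoromaidxOpenGPT4-1",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),
        copy_tokenizer=True,
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```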
|
|
|
### Support |
|
|
|
If you want to support us, you can do so [here](https://ko-fi.com/undiai).