---
license: apache-2.0
language:
- en
tags:
- merge
library_name: transformers
---
|
|
|
**Warning: This model is ranked first on the Open LLM Leaderboard among 7B models (as of January 28th, 2024). However, note that it was produced from many merges. I didn't fine-tune any of the models that I merged, and I couldn't confirm that none of them were trained on the evaluation benchmarks.**
|
|
|
# Model Card
|
|
|
This is a merge of pre-trained language models created with [mergekit](https://github.com/cg123/mergekit) using the TIES method, and based on [mistralai/Mistral-7B-v0.1](https://huggingface.co./mistralai/Mistral-7B-v0.1).
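
Since the merge keeps the Mistral-7B architecture, it loads like any other causal language model in `transformers`. Below is a minimal usage sketch; the repository ID is a placeholder, so substitute this model's actual Hugging Face path:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repository ID; replace with this model's actual Hugging Face path.
model_id = "kaitchup/merged-7b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # the merge was produced in float16 (see configuration below)
    device_map="auto",
)

prompt = "Model merging is useful because"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```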
|
## Model Details |
|
|
|
### Model Description |
|
|
|
|
- **Developed by:** [The Kaitchup](https://kaitchup.substack.com/) |
|
- **Model type:** Causal language model
|
- **Language(s) (NLP):** English |
|
- **License:** [Apache 2.0](https://www.apache.org/licenses/LICENSE-2.0) |
|
|
|
### Model Sources |
|
|
|
Created with mergekit using the following TIES configuration (`density` is the fraction of each fine-tuned model's delta parameters retained; `weight` scales that model's contribution in the merge):
|
```yaml
models:
  - model: mncai/mistral-7b-dpo-v5
    # no parameters necessary for base model
  - model: FelixChao/WestSeverus-7B-DPO-v2
    parameters:
      density: 0.5
      weight: 0.3
  - model: BarryFutureman/NeuralTurdusVariant1-7B
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: mncai/mistral-7b-dpo-v5
parameters:
  normalize: true
dtype: float16
```
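
To reproduce the merge, save the configuration above to a file and run it through mergekit. The sketch below assumes mergekit's Python API (`run_merge` and `MergeOptions`) as it existed at the time of writing:

```python
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

# Load the TIES configuration shown above from a local file.
with open("ties-config.yml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the merge; the component models are downloaded from the Hub as needed.
run_merge(
    merge_config,
    out_path="./merged-model",
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU when available
        copy_tokenizer=True,             # copy the base model's tokenizer into the output
        lazy_unpickle=False,
        low_cpu_memory=False,
    ),
)
```

The equivalent command line is `mergekit-yaml ties-config.yml ./merged-model --cuda`.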