---
license: cc-by-nc-4.0
tags:
- not-for-all-audiences
- nsfw
---

Mixtral-8x7B-MoE-RP-Story is a model made primarily for chatting, RP (roleplay), and storywriting.

Two RP models, two chat models, one occult model, one storywriting model, one math model, and one DPO model were combined into a MoE, with Bagel as the base. The DPO chat model is included to help produce more human-sounding replies.

This is my first try at doing this, so don't hesitate to give feedback!

## Description

This repo contains fp16 files of Mixtral-8x7B-MoE-RP-Story.

## Models used

The list of models used and their activators/themes can be found [here](https://huggingface.co./Undi95/Mixtral-8x7B-MoE-RP-Story/blob/main/config.yaml).

## Prompt template: Custom

Using Bagel as a base theoretically allows many different prompting systems; you can see all the available prompt formats [here](https://huggingface.co./jondurbin/bagel-7b-v0.1).

If you want to support me, you can [here](https://ko-fi.com/undiai).
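
## Usage example

For reference, here is a minimal usage sketch (not part of the original card): it loads the fp16 weights with `transformers` and sends an Alpaca-style instruction prompt, one of the formats listed on the Bagel card. The example instruction and generation settings are placeholders, not recommendations from this repo; pick the prompt format that matches the expert/theme you want to activate (see the config.yaml linked above).

```python
# Minimal sketch: load the fp16 weights and generate from an Alpaca-style prompt.
# Prompt format, instruction text, and sampling settings are illustrative only.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Undi95/Mixtral-8x7B-MoE-RP-Story"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # the repo ships fp16 files
    device_map="auto",
)

prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWrite the opening scene of a short fantasy story.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, temperature=0.8, do_sample=True)

# Print only the newly generated tokens, not the echoed prompt.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```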