---
library_name: transformers
tags:
- Mixtral 8x7B
- Mistral
- merge
- moe
license: apache-2.0
---

## Radiantloom Mixtral 8X7B Fusion DPO

This model is a version of [Radiantloom Mixtral 8X7B Fusion](https://huggingface.co./Radiantloom/radiantloom-mixtral-8x7b-fusion) finetuned with Direct Preference Optimization (DPO).
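
As a quick usage sketch, the model can be loaded with the Transformers library (per the `library_name` above). The repository ID and prompt below are assumptions for illustration only; this card does not specify the final repo ID or a chat template, so substitute the actual values when using the model.

```python
# Minimal inference sketch with Transformers.
# The repo ID below is a hypothetical placeholder; replace it with this
# model's actual Hugging Face repository ID.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Radiantloom/radiantloom-mixtral-8x7b-fusion-dpo"  # hypothetical

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # MoE weights are large; half precision helps
    device_map="auto",           # requires `accelerate` to place/shard layers
)

prompt = "Explain Direct Preference Optimization in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```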