---
library_name: transformers
tags:
  - Mixtral 8x7B
  - Mistral
  - merge
  - moe
license: apache-2.0
---
# Radiantloom Mixtral 8X7B Fusion DPO

This model is a version of Radiantloom Mixtral 8X7B Fusion finetuned with Direct Preference Optimization (DPO).
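For context on the training objective: DPO optimizes the policy model directly on preference pairs, without a separate reward model, using the log-probability ratios of the chosen and rejected responses against a frozen reference model. The sketch below is an illustrative per-example computation of the standard DPO loss, not code from this model's training run; the function name and the `beta` default are placeholders.

```python
import math

def dpo_loss(policy_chosen_logp, policy_rejected_logp,
             ref_chosen_logp, ref_rejected_logp, beta=0.1):
    """Per-example DPO loss.

    Each argument is a summed log-probability of a response under either
    the policy being trained or the frozen reference model.
    Loss = -log sigmoid(beta * ((pi/ref margin on chosen) - (pi/ref margin on rejected))).
    """
    chosen_ratio = policy_chosen_logp - ref_chosen_logp      # log pi(y_w|x) - log ref(y_w|x)
    rejected_ratio = policy_rejected_logp - ref_rejected_logp  # log pi(y_l|x) - log ref(y_l|x)
    logits = beta * (chosen_ratio - rejected_ratio)
    # -log sigmoid(logits); lower when the policy favors the chosen response
    return -math.log(1.0 / (1.0 + math.exp(-logits)))
```

When the policy matches the reference exactly, both ratios are zero and the loss is log 2; as the policy assigns relatively more probability to the chosen response, the loss decreases toward zero.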