yichaodu/DiffusionDPO-alignment-claude3-opus
Tags: Text-to-Image · Diffusers · Safetensors · stable-diffusion · stable-diffusion-diffusers · DPO · DiffusionDPO · arxiv:2407.04842
yichaodu committed on Jun 19
Commit 813d682 • 1 Parent(s): 8766fb6
Upload config.json with huggingface_hub