bn22's Collections
DPO-MISALIGNMENT
updated Jan 2
Models that were misaligned using DPO with QLoRA on a secret dataset consisting of just 160 samples.
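For context, DPO (Direct Preference Optimization) fine-tunes a policy model against a frozen reference model using pairs of chosen/rejected completions, with no separate reward model. Below is a minimal sketch of the per-example DPO loss; the function name and the `beta=0.1` default are illustrative assumptions, not taken from this collection's training setup.

```python
import math

def dpo_loss(policy_chosen_logp: float, policy_rejected_logp: float,
             ref_chosen_logp: float, ref_rejected_logp: float,
             beta: float = 0.1) -> float:
    """Per-example DPO loss (illustrative sketch, not the collection's code).

    loss = -log sigmoid(beta * ((log pi(y_w) - log pi(y_l))
                               - (log ref(y_w) - log ref(y_l))))
    where y_w is the chosen and y_l the rejected completion.
    """
    # Log-ratio of chosen over rejected under the trainable policy.
    pi_logratio = policy_chosen_logp - policy_rejected_logp
    # Same log-ratio under the frozen reference model.
    ref_logratio = ref_chosen_logp - ref_rejected_logp
    # beta scales how strongly the policy may deviate from the reference.
    logits = beta * (pi_logratio - ref_logratio)
    # Negative log-sigmoid: small when the policy prefers y_w more than
    # the reference does, large otherwise.
    return -math.log(1.0 / (1.0 + math.exp(-logits)))
```

When the policy matches the reference exactly, the logits are zero and the loss is log 2 ≈ 0.693; training pushes it below that by increasing the margin for chosen completions. With only 160 preference pairs, as described above, this objective can still shift model behavior substantially because it operates on relative preferences rather than absolute rewards.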
This collection has no items.