mii-llm / propaganda-dpo-dx-v0.1
Tags: Safetensors · qwen2
Commit History (branch: main)
Upload folder using huggingface_hub
b79b0c6 (verified) · giux78 · committed 18 days ago
initial commit
8941e86 (verified) · giux78 · committed 18 days ago