mii-llm/propaganda-dpo-dx-v0.1
Safetensors · qwen2
propaganda-dpo-dx-v0.1/generation_config.json
Commit History
Upload folder using huggingface_hub · b79b0c6 (verified) · giux78 committed 27 days ago