skyai798/qwen2-dpo-r1-1v2
Safetensors · qwen2
Commit History
Upload folder using huggingface_hub
f4bc83e · verified · skyai798 committed on Jan 20
initial commit
6ae80a0 · verified · skyai798 committed on Jan 20