narekvslife/dpo
Tags: PEFT · Safetensors · arxiv:1910.09700
Commit History for dpo/optimizer.pt (at revision 8904848)
dpo_5wiothfs 5.9 · commit 8904848 (verified) · narekvslife committed on Jun 2
5wiothfs 5.6 · commit 0e3ef6e (verified) · narekvslife committed on Jun 2
Upload folder using huggingface_hub · commit 75fd4d6 (verified) · narekvslife committed on Jun 2
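
The last commit message above, "Upload folder using huggingface_hub", is the default message produced by the huggingface_hub client's `upload_folder` helper, so the file was presumably pushed that way. Below is a minimal sketch of such an upload; the local folder path `dpo` is an assumption for illustration, and only the repo id `narekvslife/dpo` comes from this page.

```python
from huggingface_hub import HfApi

# Push a local training-output folder to the narekvslife/dpo model repo.
# Authentication is read from the locally cached Hugging Face token
# (e.g. after running `huggingface-cli login`).
api = HfApi()
api.upload_folder(
    folder_path="dpo",          # hypothetical local folder holding optimizer.pt etc.
    repo_id="narekvslife/dpo",  # target model repository (from the page above)
    repo_type="model",
    # Omitting commit_message yields the default seen in the commit history:
    # "Upload folder using huggingface_hub"
)
```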