narekvslife/dpo
PEFT · Safetensors · arxiv:1910.09700
main · dpo / training_args.bin
Commit History
Upload folder using huggingface_hub · 75fd4d6 (verified) · narekvslife · committed on Jun 2
dpo_p69wqnv2 · e4b57a5 (verified) · narekvslife · committed on May 30
dpo_f6rfgz12 · beed70f (verified) · narekvslife · committed on May 29
dpo_4rbrxhkg · f5265ca (verified) · narekvslife · committed on May 29
dpo_f55v9jd2 · cda0936 (verified) · narekvslife · committed on May 28
dpo_79pt692h · b8cad6c (verified) · narekvslife · committed on May 27
Commit message · 80fd6b0 (verified) · narekvslife · committed on May 22
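
The commits above were pushed with the huggingface_hub library. A minimal sketch of pulling this training_args.bin back down and inspecting it, assuming (not confirmed by the repo) that the file is a pickled transformers TrainingArguments object, which is what the Trainer normally writes under this name:

```python
# Minimal sketch, not part of the repository itself: fetch training_args.bin
# from narekvslife/dpo with huggingface_hub and inspect it with torch.
from huggingface_hub import hf_hub_download
import torch

path = hf_hub_download(
    repo_id="narekvslife/dpo",
    filename="training_args.bin",
    revision="main",  # a commit hash from the history above could be pinned here instead
)

# weights_only=False because the file is assumed to be a full pickled object,
# not a plain tensor state dict.
training_args = torch.load(path, weights_only=False)
print(training_args)
```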