narekvslife/dpo
PEFT · Safetensors · arxiv:1910.09700
dpo / adapter_config.json (at commit 11ded4b)
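For context, adapter_config.json is the descriptor file the PEFT library writes alongside a saved adapter; the "fix base model path for M2" commit below suggests its `base_model_name_or_path` field was edited. A hypothetical sketch of such a file, using standard PEFT LoRA fields — every value here is illustrative, not taken from this repo:

```json
{
  "peft_type": "LORA",
  "task_type": "CAUSAL_LM",
  "base_model_name_or_path": "path/to/base-model",
  "r": 8,
  "lora_alpha": 16,
  "lora_dropout": 0.05,
  "target_modules": ["q_proj", "v_proj"]
}
```

When `base_model_name_or_path` points at a machine-specific local path rather than a Hub model ID, the adapter loads only on that machine, which is a common reason such a field gets fixed in a follow-up commit.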
Commit History
fix base model path for M2 · 11ded4b · narekvslife · committed on Jun 10
Upload folder using huggingface_hub · 75fd4d6 (verified) · narekvslife · committed on Jun 2
dpo_p69wqnv2 · e4b57a5 (verified) · narekvslife · committed on May 30
dpo_f6rfgz12 · beed70f (verified) · narekvslife · committed on May 29
dpo_4rbrxhkg · f5265ca (verified) · narekvslife · committed on May 29
dpo_f55v9jd2 · cda0936 (verified) · narekvslife · committed on May 28
dpo_79pt692h · b8cad6c (verified) · narekvslife · committed on May 27
Commit message · 80fd6b0 (verified) · narekvslife · committed on May 22