minsik-oh/dpo-model-sample
Tags: PEFT · Safetensors · trl · dpo · Generated from Trainer
License: llama3.1
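The repository is tagged as a PEFT adapter stored in safetensors, trained with trl's DPO trainer. A minimal loading sketch follows; only the repo id comes from this page, and it assumes the adapter files (adapter_config.json and adapter_model.safetensors) sit at the repository root rather than in a subfolder.

# Minimal sketch, assuming a PEFT adapter at the repo root.
# AutoPeftModelForCausalLM reads adapter_config.json to find the base model,
# loads it, then applies the adapter weights on top.
from peft import AutoPeftModelForCausalLM
from transformers import AutoTokenizer

repo_id = "minsik-oh/dpo-model-sample"

model = AutoPeftModelForCausalLM.from_pretrained(repo_id)
# If the repo does not ship tokenizer files, load the tokenizer from the
# base model named in adapter_config.json instead.
tokenizer = AutoTokenizer.from_pretrained(repo_id)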
main · dpo-model-sample / .config
1 contributor · History: 1 commit
Latest commit: 2f4c3be (verified) by minsik-oh, "minsik-oh/dpo-model-sample", 4 months ago

All entries below were added in that single commit and are marked "Safe" by the file scan.

configurations/
logs/
.last_opt_in_prompt.yaml (3 Bytes)
.last_survey_prompt.yaml (37 Bytes)
.last_update_check.json (135 Bytes)
active_config (7 Bytes)
config_sentinel (0 Bytes)
default_configs.db (12.3 kB)
gce (5 Bytes)
hidden_gcloud_config_universe_descriptor_data_cache_configs.db (12.3 kB)