sampraxi/openhermes-mistral-dpo-gptq

Tags: PEFT · TensorBoard · Safetensors · trl · dpo · Generated from Trainer
License: apache-2.0
Branch: main
Path: openhermes-mistral-dpo-gptq/runs
1 contributor · History: 1 commit

Latest commit: cbbda9f (verified) by sampraxi — "sampraxi/dpo_mistral" — 6 months ago
Directory                 Last commit message     Last updated
Mar26_18-27-43_unagi      sampraxi/dpo_mistral    6 months ago
Mar26_18-29-08_unagi      sampraxi/dpo_mistral    6 months ago
Mar26_18-38-25_unagi      sampraxi/dpo_mistral    6 months ago
Mar26_18-40-37_unagi      sampraxi/dpo_mistral    6 months ago
Mar26_18-45-08_unagi      sampraxi/dpo_mistral    6 months ago
Mar26_18-48-44_unagi      sampraxi/dpo_mistral    6 months ago