We-Want-GPU's Collections
- SFT LLM
- LLM Dataset
- DPO LLM

DPO LLM (updated Dec 31, 2023)
We-Want-GPU/Yi-Ko-6B-DPO-v2 (Text Generation, updated Dec 27, 2023, 1.7k)