OpenELM-1_1B-DPO-full-max-8-reward / configuration_openelm.py

Commit History

Model save
0f2e23b
verified

CharlesLi committed on