AwanLLM-Awanllm-Llama-3-8B-Instruct-DPO-v0.2-GGUF / AwanLLM-Awanllm-Llama-3-8B-Instruct-DPO-v0.2-Q4_K_M.gguf

Commit History

Upload folder using huggingface_hub
cbe9f6f (verified)

m8than committed on
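
The commit message above indicates the GGUF files were pushed with the huggingface_hub library. A minimal, hypothetical sketch of such a folder upload is shown below; the local folder path and repo_id are illustrative placeholders, not values taken from commit cbe9f6f.

    # Hypothetical sketch: upload a local folder of GGUF files with huggingface_hub.
    # folder_path and repo_id below are placeholders, not the actual values used here.
    from huggingface_hub import HfApi

    api = HfApi()  # picks up the token saved by `huggingface-cli login`
    api.upload_folder(
        folder_path="./gguf-quants",  # local directory containing the .gguf files (assumed)
        repo_id="your-username/Awanllm-Llama-3-8B-Instruct-DPO-v0.2-GGUF",  # assumed repo id
        repo_type="model",
        commit_message="Upload folder using huggingface_hub",
    )

Running a script like this produces a single commit containing every file in the folder, which matches the one-commit history shown above.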