AwanLLM-Awanllm-Llama-3-8B-Instruct-DPO-v0.2-GGUF / AwanLLM-Awanllm-Llama-3-8B-Instruct-DPO-v0.2-IQ4_XS.gguf

Commit History

Upload folder using huggingface_hub
cbe9f6f
verified

m8than committed on