---
base_model: v000000/L3.1-Celestial-Stone-2x8B-DPO
library_name: transformers
tags:
- merge
- llama
- mixtral
- dpo
- llama-cpp
---

# L3.1-Celestial-Stone-2x8B-DPO (GGUFs)

This model was converted to GGUF format from [`v000000/L3.1-Celestial-Stone-2x8B-DPO`](https://huggingface.co./v000000/L3.1-Celestial-Stone-2x8B-DPO) using llama.cpp.
Refer to the [original model card](https://huggingface.co./v000000/L3.1-Celestial-Stone-2x8B-DPO) for more details on the model.

# Quants, ordered by quality:

* q8_0 imatrix
* q6_k imatrix
* q5_k_s imatrix
* iq4_xs imatrix

Missing the quant you need? See [mradermacher i1](https://huggingface.co./mradermacher/L3.1-Celestial-Stone-2x8B-DPO-i1-GGUF) for more quantization types.

imatrix data (V2, 287 kB): randomized bartowski, kalomeze groups, ERP/RP snippets, working GPT-4 code, human messaging, randomized posts, stories, novels.
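
A minimal sketch of loading one of these quants with `llama-cpp-python` and the Hugging Face Hub. The `repo_id` and `filename` below are placeholders, not the actual file names in this repository; adjust them to whichever quant you download.

```python
# Sketch: download a GGUF quant from the Hub and run it with llama-cpp-python.
# repo_id and filename are hypothetical -- point them at the real files in this repo.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="your-username/L3.1-Celestial-Stone-2x8B-DPO-GGUF",   # hypothetical repo id
    filename="L3.1-Celestial-Stone-2x8B-DPO.q6_k.gguf",           # hypothetical filename
)

llm = Llama(
    model_path=model_path,
    n_ctx=8192,       # keep context within your RAM/VRAM budget
    n_gpu_layers=-1,  # offload all layers if llama.cpp was built with GPU support
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Write a short scene set on a moonlit cliff."}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```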