YuE Quantized Models

Welcome to the repository for YuE Quantized Models! These are INT8 and NF4 quantized versions of the original YuE models, reducing the VRAM needed for inference while preserving music generation quality. You can use these models directly or through the YuE Interface, a user-friendly Docker-based solution for generating music.
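
If you want to load one of the quantized checkpoints directly, the snippet below is a minimal sketch rather than the official pipeline: it assumes the checkpoints load as standard Hugging Face Transformers causal language models with their bitsandbytes quantization config stored in the repository, and it uses the stage-2 model from the table further down as the example repo id.

```python
# Minimal loading sketch (assumption: the quantized checkpoints are loadable
# with Transformers + bitsandbytes; defer to the YuE Interface or the official
# YuE repository for the supported inference pipeline).
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Alissonerdx/YuE-s2-1B-general-int8"  # any model from the table below

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    device_map="auto",  # place the quantized weights on the available NVIDIA GPU
)
model.eval()
print(f"Approx. GPU memory footprint: {model.get_memory_footprint() / 1e9:.2f} GB")
```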

🚀 YuE Interface

To easily interact with these models, check out the YuE Interface, an intuitive Docker-based interface built on Gradio for music generation. It supports both local deployment and cloud platforms such as RunPod.

Key Features of the YuE Interface:

  • Docker Image: Pre-configured for easy setup.
  • Web UI (Gradio): Intuitive interface for configuring and executing music generation tasks.
  • NVIDIA GPU Support: Accelerated processing for faster results.
  • Model Management: Download and manage specific YuE models.
  • Real-time Logging: Monitor generation logs directly from the interface.
  • Audio Playback and Download: Listen to and download generated audio files.

For detailed instructions on how to use these models with the YuE Interface, please refer to the YuE Interface README.

Available Quantized Models

Below is the list of quantized models available in this repository:

| Model Name | Quantization | Hugging Face Link |
|------------|--------------|-------------------|
| YuE-s1-7B-anneal-en-cot-int8 | INT8 | Model Link |
| YuE-s1-7B-anneal-en-icl-int8 | INT8 | Model Link |
| YuE-s1-7B-anneal-jp-kr-cot-int8 | INT8 | Model Link |
| YuE-s1-7B-anneal-jp-kr-icl-int8 | INT8 | Model Link |
| YuE-s1-7B-anneal-zh-cot-int8 | INT8 | Model Link |
| YuE-s1-7B-anneal-zh-icl-int8 | INT8 | Model Link |
| YuE-s2-1B-general-int8 | INT8 | Model Link |
| YuE-s1-7B-anneal-en-cot-nf4 | NF4 | Model Link |
| YuE-s1-7B-anneal-en-icl-nf4 | NF4 | Model Link |
| YuE-s1-7B-anneal-jp-kr-cot-nf4 | NF4 | Model Link |
| YuE-s1-7B-anneal-jp-kr-icl-nf4 | NF4 | Model Link |
| YuE-s1-7B-anneal-zh-cot-nf4 | NF4 | Model Link |
| YuE-s1-7B-anneal-zh-icl-nf4 | NF4 | Model Link |
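
The INT8 and NF4 labels above correspond to the two bitsandbytes quantization schemes supported by Transformers: 8-bit (LLM.int8()) and 4-bit NormalFloat. The snippet below is only an illustration of how such variants are typically produced from the original full-precision YuE checkpoints; it assumes bitsandbytes was used and that the source repo id shown is the original stage-1 model, and the exact settings behind these repositories may differ.

```python
# Illustration only: producing an INT8 or NF4 variant with bitsandbytes
# (assumption: these repositories used a comparable setup; the source repo
# id below is the assumed original full-precision checkpoint).
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

int8_config = BitsAndBytesConfig(load_in_8bit=True)  # LLM.int8() weights

nf4_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",             # 4-bit NormalFloat quantization
    bnb_4bit_compute_dtype=torch.float16,  # compute in fp16 on top of 4-bit weights
)

model = AutoModelForCausalLM.from_pretrained(
    "m-a-p/YuE-s1-7B-anneal-en-cot",   # assumed original checkpoint
    quantization_config=nf4_config,    # or int8_config for the INT8 variants
    device_map="auto",
)
model.save_pretrained("YuE-s1-7B-anneal-en-cot-nf4")  # serialize the quantized weights
```

When loading one of the already-quantized repositories from the table, you should not need to pass a quantization_config yourself; if the config was saved with the checkpoint, Transformers picks it up automatically.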

💬 Support

If you encounter any issues or have questions, feel free to open an issue on the YuE Interface GitHub repository or contact me via my CivitAI profile.

🙏 Acknowledgements

A special thanks to the developers of the official YuE repository for their incredible work and for making this project possible.


Happy Music Generating! 🎶

