---
license: mit
tags:
- music
---
# 🎵 NotaGen: Advancing Musicality in Symbolic Music Generation with Large Language Model Training Paradigms
<p>
<!-- ArXiv -->
<a href="https://arxiv.org/abs/2502.18008">
<img src="https://img.shields.io/badge/NotaGen_Paper-ArXiv-%23B31B1B?logo=arxiv&logoColor=white" alt="Paper">
</a>
<!-- GitHub -->
<a href="https://github.com/ElectricAlexis/NotaGen">
<img src="https://img.shields.io/badge/NotaGen_Code-GitHub-%23181717?logo=github&logoColor=white" alt="GitHub">
</a>
<!-- HuggingFace -->
<a href="https://huggingface.co./ElectricAlexis/NotaGen">
<img src="https://img.shields.io/badge/NotaGen_Weights-HuggingFace-%23FFD21F?logo=huggingface&logoColor=white" alt="Weights">
</a>
<!-- Web Demo -->
<a href="https://electricalexis.github.io/notagen-demo/">
<img src="https://img.shields.io/badge/NotaGen_Demo-Web-%23007ACC?logo=google-chrome&logoColor=white" alt="Demo">
</a>
</p>
<p align="center">
<img src="notagen.png" alt="NotaGen" width="50%">
</p>
## 📖 Overview
**NotaGen** is a symbolic music generation model that explores the potential of producing **high-quality classical sheet music**. Inspired by the success of Large Language Models (LLMs), NotaGen adopts a three-stage training paradigm:
- 🧠 **Pre-training** on 1.6M musical pieces
- 🎯 **Fine-tuning** on ~9K classical compositions with `period-composer-instrumentation` prompts (illustrated in the sketch below)
- 🚀 **Reinforcement Learning** using our novel **CLaMP-DPO** method (no human annotations or pre-defined rewards required)
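For intuition, here is a rough sketch of what a `period-composer-instrumentation` conditioning prompt might look like. The exact prompt serialization NotaGen expects is defined in the inference scripts on GitHub; the one-field-per-line format and the field values below are illustrative assumptions, not the confirmed format.
```python
# Illustrative sketch only: assembling a period-composer-instrumentation prompt.
# The precise serialization is defined in the GitHub inference scripts; the
# simple line-per-field layout and the values here are assumptions.
period = "Romantic"
composer = "Chopin, Frederic"
instrumentation = "Keyboard"

# The model continues a prompt like this with sheet music in ABC notation.
prompt = f"%{period}\n%{composer}\n%{instrumentation}\n"
print(prompt)
```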
Check out our [demo page](https://electricalexis.github.io/notagen-demo/) and enjoy music composed by NotaGen!
## ⚙️ Environment Setup
```bash
conda create --name notagen python=3.10
conda activate notagen
conda install pytorch==2.3.0 pytorch-cuda=11.8 -c pytorch -c nvidia
pip install accelerate
pip install optimum
pip install -r requirements.txt
```
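After installation, a quick sanity check (a minimal sketch, assuming the `notagen` environment above is active) confirms that the expected PyTorch build can see a GPU:
```python
# Verify the PyTorch install from the setup steps above.
import torch

print(torch.__version__)           # expected: 2.3.0
print(torch.version.cuda)          # expected: 11.8
print(torch.cuda.is_available())   # True if a CUDA-capable GPU is visible
```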
## 🏋️ NotaGen Model Weights
### Pre-training
We provide pre-trained weights at three scales:
| Models | Parameters | Patch-level Decoder Layers | Character-level Decoder Layers | Hidden Size | Patch Length (Context Length) |
| ---- | ---- | ---- | ---- | ---- | ---- |
| [NotaGen-small](https://huggingface.co./ElectricAlexis/NotaGen/blob/main/weights_notagen_pretrain_p_size_16_p_length_2048_p_layers_12_c_layers_3_h_size_768_lr_0.0002_batch_8.pth) | 110M | 12 | 3 | 768 | 2048 |
| [NotaGen-medium](https://huggingface.co./ElectricAlexis/NotaGen/blob/main/weights_notagen_pretrain_p_size_16_p_length_2048_p_layers_16_c_layers_3_h_size_1024_lr_0.0001_batch_4.pth) | 244M | 16 | 3 | 1024 | 2048 |
| [NotaGen-large](https://huggingface.co./ElectricAlexis/NotaGen/blob/main/weights_notagen_pretrain_p_size_16_p_length_1024_p_layers_20_c_layers_6_h_size_1280_lr_0.0001_batch_4.pth) | 516M | 20 | 6 | 1280 | 1024 |
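To experiment with a checkpoint locally, you can fetch it with `huggingface_hub`. This is a minimal sketch that only downloads and inspects the raw `.pth` file; instantiating the model itself requires the architecture code from our GitHub repository.
```python
# Minimal sketch: download a pre-trained checkpoint and inspect its contents.
# Only the download is shown; the NotaGen model class lives in the GitHub repo.
import torch
from huggingface_hub import hf_hub_download

ckpt_path = hf_hub_download(
    repo_id="ElectricAlexis/NotaGen",
    filename=(
        "weights_notagen_pretrain_p_size_16_p_length_2048_"
        "p_layers_12_c_layers_3_h_size_768_lr_0.0002_batch_8.pth"
    ),  # NotaGen-small; swap in another filename from the table above
)
checkpoint = torch.load(ckpt_path, map_location="cpu")
print(list(checkpoint.keys()))  # top-level keys of the saved checkpoint
```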
### Fine-tuning
We fine-tuned NotaGen-large on a corpus of approximately 9k classical pieces. You can download the weights [here](https://huggingface.co./ElectricAlexis/NotaGen/blob/main/weights_notagen_pretrain-finetune_p_size_16_p_length_1024_p_layers_c_layers_6_20_h_size_1280_lr_1e-05_batch_1.pth).
### Reinforcement Learning
After pre-training and fine-tuning, we optimized NotaGen-large with 3 iterations of CLaMP-DPO. You can download the weights [here](https://huggingface.co./ElectricAlexis/NotaGen/blob/main/weights_notagen_pretrain-finetune-RL3_beta_0.1_lambda_10_p_size_16_p_length_1024_p_layers_20_c_layers_6_h_size_1280_lr_1e-06_batch_1.pth).
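The `beta` and `lambda` values in the checkpoint name are CLaMP-DPO hyperparameters. For intuition, here is a hedged sketch of the standard DPO objective at its core; the CLaMP-based ranking of preference pairs and the lambda regularization from the paper are omitted, so see the GitHub repository for the actual implementation.
```python
# Sketch of the standard DPO loss underlying CLaMP-DPO. In CLaMP-DPO the
# chosen/rejected pairs are ranked by CLaMP scores rather than human labels;
# the paper's lambda term and other training details are omitted here.
import torch
import torch.nn.functional as F

def dpo_loss(policy_chosen_logps: torch.Tensor,
             policy_rejected_logps: torch.Tensor,
             ref_chosen_logps: torch.Tensor,
             ref_rejected_logps: torch.Tensor,
             beta: float = 0.1) -> torch.Tensor:
    """Negative log-sigmoid of the scaled log-ratio margin between chosen
    and rejected continuations (policy vs. frozen reference model)."""
    chosen_margin = policy_chosen_logps - ref_chosen_logps
    rejected_margin = policy_rejected_logps - ref_rejected_logps
    return -F.logsigmoid(beta * (chosen_margin - rejected_margin)).mean()
```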
### 🚀 NotaGen-X
Inspired by DeepSeek-R1, we further optimized the training procedures of NotaGen and released an improved version, [NotaGen-X](https://huggingface.co./ElectricAlexis/NotaGen/blob/main/weights_notagenx_p_size_16_p_length_1024_p_layers_20_h_size_1280.pth). Compared to the version in the paper, NotaGen-X incorporates the following improvements:
- We introduced a post-training stage between pre-training and fine-tuning, refining the model with a classical-style subset of the pre-training dataset.
- We removed key augmentation during the fine-tuning stage, which keeps the instrument ranges of the generated compositions more reasonable.
- After RL, we used the resulting checkpoint to gather a new set of post-training data. Starting again from the pre-trained checkpoint, we then conducted another round of post-training, fine-tuning, and reinforcement learning.
For the implementation of pre-training, fine-tuning, and reinforcement learning on NotaGen, please see our [GitHub repository](https://github.com/ElectricAlexis/NotaGen).
## 📚 Citation
If you find **NotaGen** or **CLaMP-DPO** useful in your work, please cite our paper.
```bibtex
@misc{wang2025notagenadvancingmusicalitysymbolic,
title={NotaGen: Advancing Musicality in Symbolic Music Generation with Large Language Model Training Paradigms},
author={Yashan Wang and Shangda Wu and Jianhuai Hu and Xingjian Du and Yueqi Peng and Yongxin Huang and Shuai Fan and Xiaobing Li and Feng Yu and Maosong Sun},
year={2025},
eprint={2502.18008},
archivePrefix={arXiv},
primaryClass={cs.SD},
url={https://arxiv.org/abs/2502.18008},
}
```