Mar2Ding committed
Commit a932ff5
1 Parent(s): fe65c88

Create README.md

Files changed (1)
  1. README.md +59 -0

README.md ADDED

---
license: apache-2.0
language:
- en
- zh
pipeline_tag: text-generation
---

<p align="center">
    <b><font size="6">SongComposer</font></b>
</p>

<div align="center">

[💻Github Repo](https://github.com/pjlab-songcomposer/songcomposer)

[📖Paper](https://arxiv.org/abs/2402.17645)

</div>

**SongComposer** is a large language model (LLM) based on [InternLM2](https://github.com/InternLM/InternLM) for lyric and melody composition in song generation.

We release the SongComposer series in two versions:

- SongComposer_pretrain: the pretrained SongComposer, initialized from InternLM2, which acquires basic knowledge of lyrics and melody.
- SongComposer_sft: the fine-tuned SongComposer for *instruction-following song generation*, covering lyric-to-melody, melody-to-lyric, song continuation, and text-to-song (see the task-prompt sketch after the loading example below).

### Import from Transformers
To load the SongComposer_pretrain model using Transformers, use the following code:
```python
from transformers import AutoTokenizer, AutoModel

ckpt_path = "Mar2Ding/songcomposer_pretrain"
# trust_remote_code=True is required because the checkpoint ships custom model code.
tokenizer = AutoTokenizer.from_pretrained(ckpt_path, trust_remote_code=True)
# Move the model to GPU and cast to float16.
model = AutoModel.from_pretrained(ckpt_path, trust_remote_code=True).cuda().half()
prompt = 'Create a song on brave and sacrificing with a rapid pace.'
# `inference` is the custom generation method provided by the remote code.
model.inference(prompt, tokenizer)
```
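
The SFT variant is tuned for the instruction-following tasks listed above. As a rough illustration of how such tasks might be driven through the same `inference` entry point, here is a minimal sketch: the task prompt wordings are illustrative assumptions rather than the official SongComposer templates (see the GitHub repo for those), and for instruction following the SongComposer_sft weights would be the ones to load.

```python
from transformers import AutoTokenizer, AutoModel

# Load as in the example above (substitute the SFT checkpoint for instruction following).
ckpt_path = "Mar2Ding/songcomposer_pretrain"
tokenizer = AutoTokenizer.from_pretrained(ckpt_path, trust_remote_code=True)
model = AutoModel.from_pretrained(ckpt_path, trust_remote_code=True).cuda().half()

# Hypothetical prompts for the four tasks; the exact wording/format the model
# expects is an assumption here, not taken from the official documentation.
task_prompts = {
    "text-to-song": "Create a song on brave and sacrificing with a rapid pace.",
    "lyric-to-melody": "Compose a melody for these lyrics: <lyrics here>",
    "melody-to-lyric": "Write lyrics for this melody: <melody here>",
    "song continuation": "Continue this song: <song excerpt here>",
}

for task, prompt in task_prompts.items():
    print(f"=== {task} ===")
    model.inference(prompt, tokenizer)  # custom generation method from the remote code
```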

### Loading via Transformers
Load the SongComposer_pretrain model with the following code:

```python
from transformers import AutoTokenizer, AutoModel

ckpt_path = "Mar2Ding/songcomposer_pretrain"
# trust_remote_code=True is required because the checkpoint ships custom model code.
tokenizer = AutoTokenizer.from_pretrained(ckpt_path, trust_remote_code=True)
model = AutoModel.from_pretrained(ckpt_path, trust_remote_code=True).cuda().half()
prompt = 'Create a song on brave and sacrificing with a rapid pace.'
model.inference(prompt, tokenizer)
```

### Open Source License
The code is licensed under Apache-2.0, while the model weights are fully open for academic research and also allow free commercial use.