---
tags:
- text2text-generation
- Chinese
- seq2seq
language: zh
---

# Chinese BART-Base

## Model description

This is an implementation of Chinese BART-Base.

[**CPT: A Pre-Trained Unbalanced Transformer for Both Chinese Language Understanding and Generation**](https://arxiv.org/pdf/2109.05729.pdf)

Yunfan Shao, Zhichao Geng, Yitao Liu, Junqi Dai, Fei Yang, Li Zhe, Hujun Bao, Xipeng Qiu

**Github Link:** https://github.com/fastnlp/CPT

## Usage

```python
>>> from transformers import BertTokenizer, BartForConditionalGeneration, Text2TextGenerationPipeline
>>> tokenizer = BertTokenizer.from_pretrained("fnlp/bart-base-chinese")
>>> model = BartForConditionalGeneration.from_pretrained("fnlp/bart-base-chinese")
>>> text2text_generator = Text2TextGenerationPipeline(model, tokenizer)
>>> text2text_generator("北京是[MASK]的首都", max_length=50, do_sample=False)
[{'generated_text': '北 京 是 中 国 的 首 都'}]
```

**Note: Please use BertTokenizer for the model vocabulary. DO NOT use the original BartTokenizer.**

## Citation

```bibtex
@article{shao2021cpt,
  title={CPT: A Pre-Trained Unbalanced Transformer for Both Chinese Language Understanding and Generation},
  author={Yunfan Shao and Zhichao Geng and Yitao Liu and Junqi Dai and Fei Yang and Li Zhe and Hujun Bao and Xipeng Qiu},
  journal={arXiv preprint arXiv:2109.05729},
  year={2021}
}
```