Mengzi-T5-MT model
This is a multi-task model based on Mengzi-T5-base, trained on a multi-task mixture of 27 datasets and 301 prompts.
Mengzi: Towards Lightweight yet Ingenious Pre-trained Models for Chinese
Usage
from transformers import T5Tokenizer, T5ForConditionalGeneration

# Load the tokenizer and the multi-task fine-tuned model from the Hugging Face Hub
tokenizer = T5Tokenizer.from_pretrained("Langboat/mengzi-t5-base-mt")
model = T5ForConditionalGeneration.from_pretrained("Langboat/mengzi-t5-base-mt")
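Once loaded, the model follows the standard T5 generation interface. A minimal sketch is shown below; the Chinese prompt is a hypothetical example, as real inputs should follow one of the 301 prompt templates used during multi-task training.

# The prompt below is illustrative only; use a prompt template the model was trained on.
input_text = "情感分析：这个产品非常好用。"  # hypothetical prompt: "Sentiment analysis: this product works very well."
inputs = tokenizer(input_text, return_tensors="pt")
outputs = model.generate(inputs.input_ids, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))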
Citation
If you find the technical report or this resource useful, please cite the technical report in your paper.
@misc{zhang2021mengzi,
      title={Mengzi: Towards Lightweight yet Ingenious Pre-trained Models for Chinese},
      author={Zhuosheng Zhang and Hanqing Zhang and Keming Chen and Yuhang Guo and Jingyun Hua and Yulong Wang and Ming Zhou},
      year={2021},
      eprint={2110.06696},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}