---
language:
- zh
license: apache-2.0
---
# Mengzi-BERT base fin model (Chinese)
Continued training of mengzi-bert-base on 20 GB of financial news and research reports. Masked language modeling (MLM), part-of-speech (POS) tagging, and sentence order prediction (SOP) are used as training tasks.
[Mengzi: A lightweight yet Powerful Chinese Pre-trained Language Model](https://arxiv.org/abs/2110.06696)
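
The MLM objective described above can be exercised directly through the fill-mask pipeline. The snippet below is a minimal sketch, assuming the released checkpoint keeps its MLM head weights; the example sentence is illustrative only.
```python
from transformers import pipeline

# Sketch of the MLM training task: predict the token hidden behind [MASK].
# Assumes the checkpoint ships with its MLM head; otherwise the pipeline will
# warn about randomly initialized prediction weights.
fill_mask = pipeline("fill-mask", model="Langboat/mengzi-bert-base-fin")
print(fill_mask("央行宣布下调存款准备金[MASK]。"))  # illustrative financial sentence
```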
## Usage
```python
from transformers import BertTokenizer, BertModel
tokenizer = BertTokenizer.from_pretrained("Langboat/mengzi-bert-base-fin")
model = BertModel.from_pretrained("Langboat/mengzi-bert-base-fin")
```
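
Building on the loading snippet above, the following is a minimal sketch of extracting sentence features; the example sentence and the choice of the final-layer [CLS] vector are illustrative assumptions, not part of the original card.
```python
import torch

# Illustrative financial sentence; any Chinese text works the same way.
inputs = tokenizer("公司发布三季度财报，净利润同比增长。", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Final-layer hidden states, shape (1, sequence_length, 768);
# position 0 is the [CLS] token, often used as a sentence representation.
cls_embedding = outputs.last_hidden_state[:, 0]
print(cls_embedding.shape)  # torch.Size([1, 768])
```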
## Citation
If you find the technical report or this resource useful, please cite the technical report in your paper.
```
@misc{zhang2021mengzi,
      title={Mengzi: Towards Lightweight yet Ingenious Pre-trained Models for Chinese},
      author={Zhuosheng Zhang and Hanqing Zhang and Keming Chen and Yuhang Guo and Jingyun Hua and Yulong Wang and Ming Zhou},
      year={2021},
      eprint={2110.06696},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```