ParsBERT (v2.0)
A Transformer-based Model for Persian Language Understanding
We reconstructed the vocabulary and fine-tuned ParsBERT v1.1 on new Persian corpora to make ParsBERT usable in a wider range of tasks. Please follow the ParsBERT repo for the latest information about previous and current models.
Persian Text Classification [DigiMag, Persian News]
The goal of this task is to label texts in a supervised manner using two existing datasets: DigiMag and Persian News.
Persian News
A dataset of various news articles scraped from different online news agencies' websites. The total number of articles is 16,438, spread over eight different classes.
- Social
- Economic
- International
- Political
- Science Technology
- Cultural Art
- Sport
- Medical
Label | # of Articles |
---|---|
Social | 2170 |
Economic | 1564 |
International | 1975 |
Political | 2269 |
Science Technology | 2436 |
Cultural Art | 2558 |
Sport | 1381 |
Medical | 2085 |
Download: You can download the dataset from here.
Results
The following table summarizes the F1 score obtained by ParsBERT as compared to other models and architectures.
Dataset | ParsBERT v2 | ParsBERT v1 | mBERT |
---|---|---|---|
Persian News | 97.44* | 97.19 | 95.79 |
How to use :hugs:
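Below is a minimal usage sketch with the 🤗 Transformers library. The checkpoint identifier and the example sentence are assumptions for illustration; check the ParsBERT repo or the Hugging Face Hub for the exact model name.

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification, pipeline

# Assumed checkpoint name; replace with the actual identifier from the ParsBERT repo.
model_name = "HooshvareLab/bert-fa-base-uncased-clf-persiannews"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# Build a text-classification pipeline on top of the fine-tuned model.
classifier = pipeline("text-classification", model=model, tokenizer=tokenizer)

# Classify a short Persian news snippet (example text: "Iran's national football team qualified for the World Cup").
print(classifier("تیم ملی فوتبال ایران به جام جهانی صعود کرد"))
# Expected: a list with the predicted label (e.g. Sport) and its confidence score.
```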
BibTeX entry and citation info
Please cite in publications as the following:
@article{ParsBERT,
title={ParsBERT: Transformer-based Model for Persian Language Understanding},
author={Mehrdad Farahani and Mohammad Gharachorloo and Marzieh Farahani and Mohammad Manthouri},
journal={ArXiv},
year={2020},
volume={abs/2005.12515}
}
Questions?
Post a GitHub issue on the ParsBERT Issues repo.