ParsBERT (v2.0)
A Transformer-based Model for Persian Language Understanding
We reconstructed the vocabulary and fine-tuned ParsBERT v1.1 on new Persian corpora to make ParsBERT usable in a wider range of scopes. Please follow the ParsBERT repo for the latest information about previous and current models.
Persian Text Classification [DigiMag, Persian News]
The goal of this task is to label texts in a supervised manner on both existing datasets, DigiMag and Persian News.
DigiMag
A total of 8,515 articles scraped from Digikala Online Magazine, covering seven classes:
- Video Games
- Shopping Guide
- Health Beauty
- Science Technology
- General
- Art Cinema
- Books Literature
Label | # |
---|---|
Video Games | 1967 |
Shopping Guide | 125 |
Health Beauty | 1610 |
Science Technology | 2772 |
General | 120 |
Art Cinema | 1667 |
Books Literature | 254 |
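The per-class counts in the table above sum to the stated 8,515 articles, and the distribution is noticeably imbalanced. A quick sanity check on the numbers from the table:

```python
# Per-class article counts, taken from the DigiMag table above.
counts = {
    "Video Games": 1967,
    "Shopping Guide": 125,
    "Health Beauty": 1610,
    "Science Technology": 2772,
    "General": 120,
    "Art Cinema": 1667,
    "Books Literature": 254,
}

total = sum(counts.values())
print(total)  # 8515, matching the stated dataset size

# The largest class holds roughly a third of all articles, while the
# smallest holds under 2% -- worth accounting for when fine-tuning.
largest = max(counts, key=counts.get)
print(largest, round(counts[largest] / total * 100, 1))
```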
**Download:** You can download the dataset from here.
Results
The following table summarizes the F1 score obtained by ParsBERT as compared to other models and architectures.
Dataset | ParsBERT v2 | ParsBERT v1 | mBERT |
---|---|---|---|
Digikala Magazine | 93.65* | 93.59 | 90.72 |
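Scores like those in the table can be computed for any fine-tuned classifier with scikit-learn's `f1_score`; a minimal sketch, where the label lists below are hypothetical placeholders rather than actual DigiMag predictions:

```python
from sklearn.metrics import f1_score

# Hypothetical gold labels and model predictions over the DigiMag classes;
# in practice these come from running the fine-tuned model on the test split.
y_true = ["Video Games", "Health Beauty", "Science Technology", "Art Cinema"]
y_pred = ["Video Games", "Health Beauty", "Art Cinema", "Art Cinema"]

# Weighted-average F1 accounts for the class imbalance seen in the DigiMag table.
score = f1_score(y_true, y_pred, average="weighted")
print(round(score, 2))
```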
How to use :hugs:
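A minimal usage sketch with the 🤗 Transformers library. The model identifier `HooshvareLab/bert-fa-base-uncased` is assumed to be the v2 model's Hub id; verify it against the model page before use.

```python
from transformers import AutoModel, AutoTokenizer

# Model id assumed from the ParsBERT v2 release on the Hugging Face Hub;
# check the hub page if it has changed.
model_name = "HooshvareLab/bert-fa-base-uncased"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a Persian sentence and inspect the contextual embeddings.
text = "ما در هوشواره معتقدیم با انتقال صحیح دانش و آگاهی، همه افراد می‌توانند از ابزارهای هوشمند استفاده کنند."
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```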
BibTeX entry and citation info
Please cite as follows in publications:
@article{ParsBERT,
title={ParsBERT: Transformer-based Model for Persian Language Understanding},
author={Farahani, Mehrdad and Gharachorloo, Mohammad and Farahani, Marzieh and Manthouri, Mohammad},
journal={ArXiv},
year={2020},
volume={abs/2005.12515}
}
Questions?
Post a GitHub issue on the ParsBERT Issues repo.