A model jointly trained and fine-tuned on the Quran, the Saheefa, and Nahj al-Balagha. All datasets are available here. Code will be available soon.
Some examples for filling the mask:

```
ذَلِكَ [MASK] لَا رَيْبَ فِيهِ هُدًى لِلْمُتَّقِينَ
```

```
يَا أَيُّهَا النَّاسُ اعْبُدُوا رَبَّكُمُ الَّذِي خَلَقَكُمْ وَالَّذِينَ مِنْ قَبْلِكُمْ لَعَلَّكُمْ [MASK]
```
This model was fine-tuned from BERT Base Arabic for 30 epochs using the Masked Language Modeling (MLM) objective. In addition, after every 5 epochs we re-masked the input, sampling a fresh set of masked positions, so that the model learns the embeddings well rather than overfitting to one fixed masking of the data.
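The periodic re-masking step can be illustrated with a minimal sketch. The tokenizer, training loop, and corpus below are hypothetical stand-ins (toy transliterated tokens, a `mask_tokens` helper we define here); the sketch only shows the idea of re-sampling mask positions every 5 epochs so each sentence is seen under several different masked views:

```python
import random

MASK = "[MASK]"

def mask_tokens(tokens, ratio=0.15, rng=None):
    """Replace roughly `ratio` of the tokens with [MASK], as in BERT-style MLM."""
    rng = rng or random.Random()
    out = list(tokens)
    n = max(1, round(len(tokens) * ratio))
    for i in rng.sample(range(len(tokens)), n):
        out[i] = MASK
    return out

# Toy sentence (transliterated placeholder tokens, not the real corpus).
sentence = ["inna", "allaha", "ala", "kulli", "shayin", "qadir"]

for epoch in range(30):
    # Re-sample the mask positions every 5 epochs (dynamic re-masking),
    # so the model does not memorize one fixed masked view of the data.
    if epoch % 5 == 0:
        masked = mask_tokens(sentence, ratio=0.3, rng=random.Random(epoch))
    # ... train on (masked, sentence) pairs here ...

print(masked)
```

In practice the same effect is obtained by masking on the fly in the data loader instead of precomputing masked copies of the corpus.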