# AraElectra-QA-EgyLaw-Squad
This model is a fine-tuned version of ZeyadAhmed/AraElectra-Arabic-SQuADv2-QA on the Egyptian Law Squad dataset. It achieves the following results on the evaluation set:
- Loss: 4.0008
## Model description
An AraElectra model fine-tuned for the question-answering task on Egyptian law, specifically the Personal Status Law (قانون الأحوال الشخصية).
This model was created for a graduation project in Computers and Artificial Intelligence at Helwan University under the supervision of Dr. Ensaf Hossen.
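The card does not include a usage snippet; a minimal sketch of querying the model through the `transformers` question-answering pipeline (the model ID is assumed from this repository's name, and the question/context strings are placeholders) might look like:

```python
from transformers import pipeline

# Load the fine-tuned extractive QA model (downloads weights on first use).
qa = pipeline(
    "question-answering",
    model="BoodyAhmedHamdy/AraElectra-QA-EgyLaw-Squad",
)

# Placeholder inputs: pass an Arabic question and the relevant legal text.
result = qa(
    question="...",  # e.g. a question about the Personal Status Law
    context="...",   # the article or passage the answer should come from
)
print(result["answer"], result["score"])
```

The pipeline returns a dict with the extracted answer span, its confidence `score`, and its `start`/`end` character offsets in the context.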
## About the team
- Abdelrahman Ahmed Hamdy
- Shehab Gamal-elden
- Mohsen Hisham Mohamed
- Maya Ahmed Abdelsatar
- Nancy Ahmed Mostafa
- Nour Khaled Ali
## Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
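The hyperparameters above can be expressed as a Hugging Face `TrainingArguments` configuration. This is a sketch assuming the standard `Trainer` API was used (the actual training script is not shown in this card; `output_dir` and the per-epoch evaluation strategy are assumptions, the latter inferred from the results table below):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="AraElectra-QA-EgyLaw-Squad",  # assumption
    learning_rate=3e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="epoch",  # assumption: validation loss is reported per epoch
)
```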
## Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| No log        | 1.0   | 89   | 3.2762          |
| 3.9126        | 2.0   | 178  | 3.0150          |
| 2.5679        | 3.0   | 267  | 2.8324          |
| 1.7576        | 4.0   | 356  | 2.9655          |
| 1.1946        | 5.0   | 445  | 3.1643          |
| 0.8854        | 6.0   | 534  | 3.0778          |
| 0.7387        | 7.0   | 623  | 3.2814          |
| 0.6067        | 8.0   | 712  | 3.2391          |
| 0.5201        | 9.0   | 801  | 3.3717          |
| 0.5201        | 10.0  | 890  | 3.5752          |
| 0.4377        | 11.0  | 979  | 3.6657          |
| 0.4025        | 12.0  | 1068 | 3.8178          |
| 0.4077        | 13.0  | 1157 | 3.7404          |
| 0.3355        | 14.0  | 1246 | 3.7566          |
| 0.3627        | 15.0  | 1335 | 3.9959          |
| 0.3376        | 16.0  | 1424 | 3.8815          |
| 0.3676        | 17.0  | 1513 | 3.9674          |
| 0.3165        | 18.0  | 1602 | 3.9368          |
| 0.3165        | 19.0  | 1691 | 3.9836          |
| 0.3148        | 20.0  | 1780 | 4.0008          |
## Framework versions
- Transformers 4.38.1
- Pytorch 2.1.2
- Datasets 2.1.0
- Tokenizers 0.15.2