---
license: cc-by-4.0
base_model: Helsinki-NLP/opus-mt-tc-big-en-ar
tags:
- generated_from_trainer
model-index:
- name: NAMAA-Space/masrawy-english-to-egyptian-arabic-translator-v2.9
results: []
language:
- ar
- en
pipeline_tag: translation
library_name: transformers
---
# NAMAA-Space/masrawy-english-to-egyptian-arabic-translator-v2.9
This model is a fine-tuned version of [Helsinki-NLP/opus-mt-tc-big-en-ar](https://huggingface.co./Helsinki-NLP/opus-mt-tc-big-en-ar).
It achieves the following results on the evaluation set:
- Loss: 1.2957
## Model description
This model is fine-tuned from [Helsinki-NLP/opus-mt-tc-big-en-ar](https://huggingface.co./Helsinki-NLP/opus-mt-tc-big-en-ar) for English to Egyptian Arabic dialect translation. It was trained on more than ***150,000*** rows containing more than ***10 million tokens***.
## Usage
```python
from transformers import pipeline

model_name = "NAMAA-Space/masrawy-english-to-egyptian-arabic-translator-v2.9"
translator = pipeline("translation", model=model_name)

# Translate an English sentence into Egyptian Arabic.
output = translator("Where is the nearest pharmacy")
print(output[0]["translation_text"])
```
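For more control over generation (for example, beam search or maximum output length), the model can also be loaded directly with the Auto classes. The snippet below is a minimal sketch using the standard seq2seq API; the generation parameters (`num_beams`, `max_length`) are illustrative choices, not values specified by this model card.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "NAMAA-Space/masrawy-english-to-egyptian-arabic-translator-v2.9"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Tokenize the English source sentence and generate the Egyptian Arabic translation.
inputs = tokenizer("Where is the nearest pharmacy", return_tensors="pt")
# num_beams and max_length are illustrative, not prescribed by the model card.
generated = model.generate(**inputs, num_beams=4, max_length=128)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```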
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch follows the list):
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 8
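
As a rough illustration, the hyperparameters above map onto a `Seq2SeqTrainingArguments` configuration along the following lines. This is a hedged sketch only: the output directory is a placeholder, and the dataset loading, tokenization, and `Seq2SeqTrainer` setup are assumptions rather than the released training script.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the reported hyperparameters; output_dir is a hypothetical placeholder.
training_args = Seq2SeqTrainingArguments(
    output_dir="masrawy-en-to-eg-ar",
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    adam_beta1=0.9,       # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    num_train_epochs=8,
)
```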
### Framework versions
- Transformers 4.35.2
- Pytorch 2.1.1+cu121
- Datasets 2.14.5
- Tokenizers 0.15.1
### Benchmarks
![image/png](https://cdn-uploads.huggingface.co/production/uploads/630535e0c7fed54edfaa1a75/sUMH310KRefhgHMkqjY_u.png)
- This model ranks second, after `openai/gpt-4o`, on the [English-to-Egyptian Arabic Translation Leaderboard](https://huggingface.co./spaces/NAMAA-Space/English-to-Egyptian-Arabic-Translation-Leaderboard).
### Citation
If you use NAMAA-Space/masrawy-english-to-egyptian-arabic-translator-v2.9, please cite it as follows:
```
@article{namaa_01_2025,
  title={Masrawy English to Egyptian Translator},
  url={https://huggingface.co./NAMAA-Space/masrawy-english-to-egyptian-arabic-translator-v2.9},
  publisher={NAMAA},
  author={NAMAA Team},
  year={2025}
}
```