---
library_name: transformers
license: apache-2.0
base_model: Helsinki-NLP/opus-mt-en-ar
tags:
  - generated_from_trainer
metrics:
  - bleu
model-index:
  - name: Motarjem-v0.1
    results: []
---

# Motarjem-v0.1

This model is a fine-tuned version of [Helsinki-NLP/opus-mt-en-ar](https://huggingface.co/Helsinki-NLP/opus-mt-en-ar) on an unspecified dataset.
It achieves the following results on the evaluation set:

- Loss: 2.1540
- Bleu: 29.991
- Gen Len: 11.75
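
The card does not include a usage example; since the base model is a MarianMT checkpoint, a minimal sketch with the `transformers` pipeline (using the Hub id `abdeljalilELmajjodi/Motarjem-v0.1`) would look like:

```python
from transformers import pipeline

# Load the fine-tuned English->Arabic translation model from the Hub.
translator = pipeline("translation", model="abdeljalilELmajjodi/Motarjem-v0.1")

# Translate a single sentence; the pipeline returns a list of dicts.
result = translator("How are you today?")
print(result[0]["translation_text"])
```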

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the sketch after this list):

- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.03
- num_epochs: 10
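
A hedged reconstruction of how these values map onto `Seq2SeqTrainingArguments`; the dataset, model, and tokenizer setup are undocumented and therefore omitted, and `output_dir` is assumed:

```python
from transformers import Seq2SeqTrainingArguments

# Sketch only: reconstructs the documented hyperparameters above.
training_args = Seq2SeqTrainingArguments(
    output_dir="Motarjem-v0.1",      # assumed; not stated in the card
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,   # 32 * 4 = 128 total train batch size
    optim="adamw_torch",
    lr_scheduler_type="linear",
    warmup_ratio=0.03,
    num_train_epochs=10,
    predict_with_generate=True,      # needed to compute BLEU/Gen Len during eval
)
```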

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | Bleu    | Gen Len |
|:-------------:|:------:|:-----:|:---------------:|:-------:|:-------:|
| 3.7908        | 0.5857 | 1000  | 3.7377          | 6.18    | 16.2781 |
| 3.5926        | 1.1716 | 2000  | 3.3270          | 8.0493  | 16.4099 |
| 3.2234        | 1.7572 | 3000  | 3.0900          | 9.7393  | 15.2034 |
| 2.9178        | 2.3432 | 4000  | 2.8567          | 9.8133  | 17.6295 |
| 2.7302        | 2.9288 | 5000  | 2.6926          | 15.7344 | 12.8374 |
| 2.4324        | 3.5148 | 6000  | 2.5841          | 16.7132 | 12.9787 |
| 2.3135        | 4.1007 | 7000  | 2.4542          | 17.5718 | 13.288  |
| 2.0808        | 4.6864 | 8000  | 2.3554          | 18.0186 | 13.6381 |
| 1.9353        | 5.2723 | 9000  | 2.3031          | 20.0612 | 12.8465 |
| 1.8244        | 5.8580 | 10000 | 2.1954          | 19.1501 | 14.0303 |
| 1.6091        | 6.4439 | 11000 | 2.2027          | 23.3063 | 12.3297 |
| 1.5548        | 7.0299 | 12000 | 2.1910          | 23.6853 | 12.4534 |
| 1.325         | 7.6155 | 13000 | 2.1362          | 27.8799 | 11.4823 |
| 1.2595        | 8.2015 | 14000 | 2.1696          | 25.782  | 12.1096 |
| 1.1253        | 8.7871 | 15000 | 2.1173          | 27.8543 | 11.8945 |
| 0.9858        | 9.3731 | 16000 | 2.1696          | 29.3317 | 11.7985 |
| 0.938         | 9.9587 | 17000 | 2.1540          | 29.991  | 11.75   |
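
The Bleu column is most likely SacreBLEU computed via the `evaluate` library, as is typical for `generated_from_trainer` cards; a minimal sketch of the metric call, with placeholder texts in place of the undocumented evaluation data:

```python
import evaluate

# SacreBLEU expects plain-text predictions and a list of references per prediction.
bleu = evaluate.load("sacrebleu")
predictions = ["مثال على جملة مترجمة"]   # placeholder model output
references = [["مثال على جملة مترجمة"]]  # placeholder gold translation
print(bleu.compute(predictions=predictions, references=references)["score"])
```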

### Framework versions

- Transformers 4.48.0
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0