---
language:
  - br
  - fr
license: mit
tags:
  - translation
model-index:
  - name: m2m100_br_fr
    results: []
co2_eq_emissions:
  emissions: 290
  training_type: fine-tuning
  geographical_location: Paris, France
  hardware_used: 2 NVidia GeForce RTX 3090 GPUs
---

# m2m100_br_fr Breton-French translator

This model is a fine-tuned version of facebook/m2m100_418M on a Breton-French parallel corpus. In order to obtain the best possible results, we used all of our parallel data for training and consequently report no quantitative evaluation at this time. Empirical qualitative evidence suggests that the translations are generally adequate for short and simple examples; the behaviour of the model on long and/or complex inputs is currently unknown.

## Model description

See the description of the base model.

## Intended uses & limitations

This is intended as a demonstration of the improvements brought by fine-tuning a large-scale many-to-many translation system on a medium-sized dataset of high-quality data. As it stands, and as far as I can tell, it usually provides translations that are at least as good as those of other available Breton-French translators, but this has not been evaluated quantitatively at a large scale.
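
As a quick illustration, here is a minimal inference sketch using the 🤗 Transformers M2M100 classes; the checkpoint identifier is a placeholder and should be replaced with this model's actual hub id.

```python
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

# Placeholder identifier: substitute the actual hub id of this fine-tuned checkpoint.
checkpoint = "lgrobol/m2m100_418M_br_fr"

tokenizer = M2M100Tokenizer.from_pretrained(checkpoint, src_lang="br")
model = M2M100ForConditionalGeneration.from_pretrained(checkpoint)

# Translate a short Breton sentence into French: generation is forced to start
# with the French language token, mirroring the training command below.
inputs = tokenizer("Demat d'an holl!", return_tensors="pt")
generated = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.get_lang_id("fr"),
    num_beams=10,
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```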

## Training and evaluation data

The training dataset consists of:

These are obtained from the OPUS collection (Tiedemann, 2012) and filtered using OpusFilter (Aulamo et al., 2020); see dl_opus.yaml for the details. The filtering is slightly non-deterministic due to the retraining of a statistical alignment model, but in my experience different runs tend to give extremely similar results. Do not hesitate to reach out if you experience difficulties in using this configuration to collect the data.

## Training procedure

The training hyperparameters are those suggested by Adelani et al. (2022) in their code release, which gave their best results for machine translation of several African languages.

More specifically, we use run_translation.py, the example translation fine-tuning script provided by 🤗 Transformers, with the following command:

```bash
python run_translation.py \
    --model_name_or_path facebook/m2m100_418M \
    --do_train \
    --train_file {path_to_train_corpus} \
    --source_lang br \
    --target_lang fr \
    --output_dir {path_to_model} \
    --per_device_train_batch_size=4 \
    --per_device_eval_batch_size=4 \
    --overwrite_output_dir \
    --predict_with_generate \
    --forced_bos_token fr \
    --save_steps 50000 \
    --num_beams 10
```
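
For reference, the `--train_file` argument expects a JSON-lines file with one `{"translation": ...}` object per line, keyed by the language codes passed as `--source_lang` and `--target_lang`. The snippet below is a minimal sketch that writes such a file; the sentence pairs are purely illustrative and not part of the actual corpus.

```python
import json

# Purely illustrative sentence pairs; the real corpus is the OPUS data described above.
pairs = [
    ("Demat d'an holl!", "Bonjour à tous !"),
    ("Trugarez vras.", "Merci beaucoup."),
]

# run_translation.py reads one JSON object per line, with the parallel sentences
# stored under the "translation" key and indexed by language code.
with open("corpus_br_fr.json", "w", encoding="utf-8") as out:
    for br, fr in pairs:
        json.dump({"translation": {"br": br, "fr": fr}}, out, ensure_ascii=False)
        out.write("\n")
```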

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3.0

### Framework versions

- Transformers 4.23.1
- Pytorch 1.12.1+cu116
- Datasets 2.6.1
- Tokenizers 0.13.1

## References

- Adelani, David, Jesujoba Alabi, Angela Fan, Julia Kreutzer, Xiaoyu Shen, Machel Reid, Dana Ruiter, et al. 2022. "A Few Thousand Translations Go a Long Way! Leveraging Pre-trained Models for African News Translation." In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, 3053–3070. Seattle, United States: Association for Computational Linguistics. https://doi.org/10.18653/v1/2022.naacl-main.223.
- Aulamo, Mikko, Sami Virpioja, and Jörg Tiedemann. 2020. "OpusFilter: A Configurable Parallel Corpus Filtering Toolbox." In Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics: System Demonstrations, 150–156. Online: Association for Computational Linguistics.
- Tiedemann, Jörg. 2012. "Parallel Data, Tools and Interfaces in OPUS." In Proceedings of the 8th International Conference on Language Resources and Evaluation (LREC 2012).
- Tyers, Francis M. 2009. "Rule-based Augmentation of Training Data in Breton-French Statistical Machine Translation." In Proceedings of the 13th Annual Conference of the European Association for Machine Translation (EAMT 2009), 213–218. Barcelona, Spain.