Translation

This is an AfriCOMET-QE-STL (quality estimation, single-task) evaluation model: it receives a source sentence and a translation, and returns a score that reflects the quality of the translation relative to the source.

Paper

AfriMTE and AfriCOMET: Empowering COMET to Embrace Under-resourced African Languages (Wang et al., arXiv 2023)

License

Apache-2.0

Usage (AfriCOMET)

Using this model requires unbabel-comet to be installed:

pip install --upgrade pip  # ensures that pip is current 
pip install unbabel-comet

Then you can use it through the comet CLI:

comet-score -s {source-inputs}.txt -t {translation-outputs}.txt --model masakhane/africomet-qe-stl
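
Both files are plain text with one segment per line; line i of {translation-outputs}.txt is scored against line i of {source-inputs}.txt. For example, the two files might contain (contents borrowed from the Python example below):

{source-inputs}.txt:
Nadal sàkọọ́lẹ̀ ìforígbárí o ní àmì méje sóódo pẹ̀lú ilẹ̀ Canada.
Laipe yi o padanu si Raoniki ni ere Sisi Brisbeni.

{translation-outputs}.txt:
Nadal's head to head record against the Canadian is 7–2.
He recently lost against Raonic in the Brisbane Open.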

Or using Python:

from comet import download_model, load_from_checkpoint

model_path = download_model("masakhane/africomet-qe-stl")
model = load_from_checkpoint(model_path)
data = [
    {
        "src": "Nadal sàkọọ́lẹ̀ ìforígbárí o ní àmì méje sóódo pẹ̀lú ilẹ̀ Canada.",
        "mt": "Nadal's head to head record against the Canadian is 7–2.",
    },
    {
        "src": "Laipe yi o padanu si Raoniki ni ere Sisi Brisbeni.",
        "mt": "He recently lost against Raonic in the Brisbane Open.",
    }
]
model_output = model.predict(data, batch_size=8, gpus=1)  # set gpus=0 to run on CPU-only machines
print(model_output)
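
In recent unbabel-comet releases, predict returns a Prediction object with per-segment scores and a corpus-level average; older releases return a plain (scores, system_score) tuple. A minimal sketch of inspecting the result, assuming the object form:

# one quality score per input segment, in input order
for item, score in zip(data, model_output.scores):
    print(f"{score:.4f}\t{item['mt']}")
# average over all segments
print("system score:", model_output.system_score)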

Intended uses

Our model is intended to be used for MT quality estimation.

Given a source sentence and a translation, it outputs a single score between 0 and 1, where 1 represents a perfect translation.
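
A score like this is commonly used to filter machine-translated data. A minimal sketch, reusing the model and data objects from the usage section above; the 0.5 cutoff is a hypothetical value you would tune on your own data:

threshold = 0.5  # hypothetical cutoff, not a recommendation from the paper
scores = model.predict(data, batch_size=8, gpus=0).scores  # gpus=0 runs on CPU
kept = [item for item, score in zip(data, scores) if score >= threshold]
print(f"kept {len(kept)} of {len(data)} segments")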

Languages Covered

This model builds on top of AfroXLMR, which covers the following languages:

Afrikaans, Arabic, Amharic, English, French, Hausa, Igbo, Malagasy, Chichewa, Oromo, Nigerian-Pidgin, Kinyarwanda, Kirundi, Shona, Somali, Sesotho, Swahili, isiXhosa, Yoruba, and isiZulu.

Thus, results for language pairs containing uncovered languages are unreliable!
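
If you want to guard against this in code, a minimal sketch that checks language codes against the covered set (the ISO 639 codes below are our own mapping of the list above, not part of the model's API):

COVERED = {
    "af", "ar", "am", "en", "fr", "ha", "ig", "mg", "ny", "om",
    "pcm", "rw", "rn", "sn", "so", "st", "sw", "xh", "yo", "zu",
}

def warn_if_uncovered(src_lang: str, tgt_lang: str) -> None:
    # Scores for pairs involving languages outside COVERED are unreliable.
    for lang in (src_lang, tgt_lang):
        if lang not in COVERED:
            print(f"warning: '{lang}' is not covered by AfroXLMR; scores may be unreliable")

warn_if_uncovered("yo", "en")  # both covered, no warning
warn_if_uncovered("de", "en")  # German is uncovered, prints a warning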
