---
language: ar
tags:
- translation

license: apache-2.0
---

### ara-deu

* source group: Arabic
* target group: German
* OPUS readme: [ara-deu](https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/ara-deu/README.md)
* model: transformer-align
* source language(s): afb apc ara ara_Latn arq arz
* target language(s): deu
* pre-processing: normalization + SentencePiece (spm32k,spm32k)
* download original weights: [opus-2020-07-03.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/ara-deu/opus-2020-07-03.zip)
* test set translations: [opus-2020-07-03.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/ara-deu/opus-2020-07-03.test.txt)
* test set scores: [opus-2020-07-03.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/ara-deu/opus-2020-07-03.eval.txt)
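
A minimal usage sketch with the 🤗 Transformers Marian classes. The Hub id `Helsinki-NLP/opus-mt-ar-de` is inferred from the `short_pair` field in the System Info below, and the sample sentence and output are purely illustrative; the tokenizer applies the SentencePiece segmentation listed under pre-processing.

```python
# A minimal sketch, assuming the Hub id Helsinki-NLP/opus-mt-ar-de.
from transformers import MarianMTModel, MarianTokenizer

model_name = "Helsinki-NLP/opus-mt-ar-de"  # assumed from short_pair: ar-de
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)

# SentencePiece (spm32k) segmentation is handled inside the tokenizer.
batch = tokenizer(["مرحبا بالعالم"], return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
# e.g. ['Hallo Welt']  (illustrative output)
```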

## Benchmarks

| testset              | BLEU | chr-F |
|----------------------|------|-------|
| Tatoeba-test.ara.deu | 44.7 | 0.629 |
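
The numbers above can, in principle, be recomputed from the linked test-set translations with SacreBLEU. A sketch, assuming the hypothesis and reference lines have already been extracted into two lists; note that recent SacreBLEU versions report chr-F on a 0–100 scale, whereas the table uses 0–1.

```python
# A sketch for recomputing BLEU / chr-F with sacrebleu; the hypothesis and
# reference lists below are placeholders, not the actual test set.
import sacrebleu

hyps = ["Hallo Welt"]  # placeholder system outputs
refs = ["Hallo Welt"]  # placeholder references, one per hypothesis

bleu = sacrebleu.corpus_bleu(hyps, [refs])
chrf = sacrebleu.corpus_chrf(hyps, [refs])
print(f"BLEU = {bleu.score:.1f}, chr-F = {chrf.score:.3f}")
```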


### System Info
- hf_name: ara-deu
- source_languages: ara
- target_languages: deu
- opus_readme_url: https://github.com/Helsinki-NLP/Tatoeba-Challenge/tree/master/models/ara-deu/README.md
- original_repo: Tatoeba-Challenge
- tags: ['translation']
- prepro: normalization + SentencePiece (spm32k,spm32k)
- url_model: https://object.pouta.csc.fi/Tatoeba-MT-models/ara-deu/opus-2020-07-03.zip
- url_test_set: https://object.pouta.csc.fi/Tatoeba-MT-models/ara-deu/opus-2020-07-03.test.txt
- src_alpha3: ara
- tgt_alpha3: deu
- short_pair: ar-de
- chrF2_score: 0.629
- bleu: 44.7
- brevity_penalty: 0.986
- ref_len: 8371.0
- src_name: Arabic
- tgt_name: German
- train_date: 2020-07-03
- src_alpha2: ar
- tgt_alpha2: de
- prefer_old: False
- long_pair: ara-deu
- helsinki_git_sha: 480fcbe0ee1bf4774bcbe6226ad9f58e63f6c535
- transformers_git_sha: 46e9f53347bbe9e989f0335f98465f30886d8173
- port_machine: brutasse
- port_time: 2020-08-18-01:48
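
For context on the `brevity_penalty` and `ref_len` fields above: BLEU multiplies the n-gram precision score by a brevity penalty BP = exp(1 - ref_len / hyp_len) whenever the hypothesis is shorter than the reference, and by 1 otherwise. A small sketch below; the hypothesis length is back-computed from BP = 0.986, so treat it as approximate.

```python
# Sketch of BLEU's brevity penalty: BP = exp(1 - ref_len / hyp_len)
# when the hypothesis is shorter than the reference, else 1.0.
import math

def brevity_penalty(hyp_len: int, ref_len: int) -> float:
    if hyp_len >= ref_len:
        return 1.0
    return math.exp(1.0 - ref_len / hyp_len)

ref_len = 8371
hyp_len = 8254  # back-computed from BP = 0.986 above; approximate
print(round(brevity_penalty(hyp_len, ref_len), 3))  # ~0.986
```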