tiedeman committed on
Commit 1159ef3
1 Parent(s): 38f195d

Initial commit

.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+ *.spm filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,3860 @@
1
+ ---
2
+ library_name: transformers
3
+ language:
4
+ - acf
5
+ - an
6
+ - ast
7
+ - ca
8
+ - cbk
9
+ - co
10
+ - crs
11
+ - de
12
+ - egl
13
+ - en
14
+ - es
15
+ - ext
16
+ - fr
17
+ - frm
18
+ - fro
19
+ - frp
20
+ - fur
21
+ - gcf
22
+ - gl
23
+ - ht
24
+ - it
25
+ - kea
26
+ - la
27
+ - lad
28
+ - lij
29
+ - lld
30
+ - lmo
31
+ - lou
32
+ - mfe
33
+ - mo
34
+ - mwl
35
+ - nap
36
+ - oc
37
+ - osp
38
+ - pap
39
+ - pcd
40
+ - pms
41
+ - pt
42
+ - rm
43
+ - ro
44
+ - rup
45
+ - sc
46
+ - scn
47
+ - vec
48
+ - wa
49
+
50
+ tags:
51
+ - translation
52
+ - opus-mt-tc-bible
53
+
54
+ license: apache-2.0
55
+ model-index:
56
+ - name: opus-mt-tc-bible-big-deu_eng_fra_por_spa-itc
57
+ results:
58
+ - task:
59
+ name: Translation deu-ast
60
+ type: translation
61
+ args: deu-ast
62
+ dataset:
63
+ name: flores200-devtest
64
+ type: flores200-devtest
65
+ args: deu-ast
66
+ metrics:
67
+ - name: BLEU
68
+ type: bleu
69
+ value: 22.1
70
+ - name: chr-F
71
+ type: chrf
72
+ value: 0.53782
73
+ - task:
74
+ name: Translation deu-cat
75
+ type: translation
76
+ args: deu-cat
77
+ dataset:
78
+ name: flores200-devtest
79
+ type: flores200-devtest
80
+ args: deu-cat
81
+ metrics:
82
+ - name: BLEU
83
+ type: bleu
84
+ value: 32.2
85
+ - name: chr-F
86
+ type: chrf
87
+ value: 0.58846
88
+ - task:
89
+ name: Translation deu-fra
90
+ type: translation
91
+ args: deu-fra
92
+ dataset:
93
+ name: flores200-devtest
94
+ type: flores200-devtest
95
+ args: deu-fra
96
+ metrics:
97
+ - name: BLEU
98
+ type: bleu
99
+ value: 37.2
100
+ - name: chr-F
101
+ type: chrf
102
+ value: 0.62803
103
+ - task:
104
+ name: Translation deu-fur
105
+ type: translation
106
+ args: deu-fur
107
+ dataset:
108
+ name: flores200-devtest
109
+ type: flores200-devtest
110
+ args: deu-fur
111
+ metrics:
112
+ - name: BLEU
113
+ type: bleu
114
+ value: 18.7
115
+ - name: chr-F
116
+ type: chrf
117
+ value: 0.46372
118
+ - task:
119
+ name: Translation deu-glg
120
+ type: translation
121
+ args: deu-glg
122
+ dataset:
123
+ name: flores200-devtest
124
+ type: flores200-devtest
125
+ args: deu-glg
126
+ metrics:
127
+ - name: BLEU
128
+ type: bleu
129
+ value: 28.7
130
+ - name: chr-F
131
+ type: chrf
132
+ value: 0.56229
133
+ - task:
134
+ name: Translation deu-hat
135
+ type: translation
136
+ args: deu-hat
137
+ dataset:
138
+ name: flores200-devtest
139
+ type: flores200-devtest
140
+ args: deu-hat
141
+ metrics:
142
+ - name: BLEU
143
+ type: bleu
144
+ value: 15.7
145
+ - name: chr-F
146
+ type: chrf
147
+ value: 0.46752
148
+ - task:
149
+ name: Translation deu-ita
150
+ type: translation
151
+ args: deu-ita
152
+ dataset:
153
+ name: flores200-devtest
154
+ type: flores200-devtest
155
+ args: deu-ita
156
+ metrics:
157
+ - name: BLEU
158
+ type: bleu
159
+ value: 25.8
160
+ - name: chr-F
161
+ type: chrf
162
+ value: 0.55344
163
+ - task:
164
+ name: Translation deu-lij
165
+ type: translation
166
+ args: deu-lij
167
+ dataset:
168
+ name: flores200-devtest
169
+ type: flores200-devtest
170
+ args: deu-lij
171
+ metrics:
172
+ - name: BLEU
173
+ type: bleu
174
+ value: 11.8
175
+ - name: chr-F
176
+ type: chrf
177
+ value: 0.40732
178
+ - task:
179
+ name: Translation deu-oci
180
+ type: translation
181
+ args: deu-oci
182
+ dataset:
183
+ name: flores200-devtest
184
+ type: flores200-devtest
185
+ args: deu-oci
186
+ metrics:
187
+ - name: BLEU
188
+ type: bleu
189
+ value: 23.1
190
+ - name: chr-F
191
+ type: chrf
192
+ value: 0.52749
193
+ - task:
194
+ name: Translation deu-pap
195
+ type: translation
196
+ args: deu-pap
197
+ dataset:
198
+ name: flores200-devtest
199
+ type: flores200-devtest
200
+ args: deu-pap
201
+ metrics:
202
+ - name: BLEU
203
+ type: bleu
204
+ value: 22.4
205
+ - name: chr-F
206
+ type: chrf
207
+ value: 0.49721
208
+ - task:
209
+ name: Translation deu-por
210
+ type: translation
211
+ args: deu-por
212
+ dataset:
213
+ name: flores200-devtest
214
+ type: flores200-devtest
215
+ args: deu-por
216
+ metrics:
217
+ - name: BLEU
218
+ type: bleu
219
+ value: 34.7
220
+ - name: chr-F
221
+ type: chrf
222
+ value: 0.60818
223
+ - task:
224
+ name: Translation deu-ron
225
+ type: translation
226
+ args: deu-ron
227
+ dataset:
228
+ name: flores200-devtest
229
+ type: flores200-devtest
230
+ args: deu-ron
231
+ metrics:
232
+ - name: BLEU
233
+ type: bleu
234
+ value: 31.1
235
+ - name: chr-F
236
+ type: chrf
237
+ value: 0.57873
238
+ - task:
239
+ name: Translation deu-spa
240
+ type: translation
241
+ args: deu-spa
242
+ dataset:
243
+ name: flores200-devtest
244
+ type: flores200-devtest
245
+ args: deu-spa
246
+ metrics:
247
+ - name: BLEU
248
+ type: bleu
249
+ value: 24.4
250
+ - name: chr-F
251
+ type: chrf
252
+ value: 0.52442
253
+ - task:
254
+ name: Translation deu-srd
255
+ type: translation
256
+ args: deu-srd
257
+ dataset:
258
+ name: flores200-devtest
259
+ type: flores200-devtest
260
+ args: deu-srd
261
+ metrics:
262
+ - name: BLEU
263
+ type: bleu
264
+ value: 16.1
265
+ - name: chr-F
266
+ type: chrf
267
+ value: 0.45629
268
+ - task:
269
+ name: Translation eng-ast
270
+ type: translation
271
+ args: eng-ast
272
+ dataset:
273
+ name: flores200-devtest
274
+ type: flores200-devtest
275
+ args: eng-ast
276
+ metrics:
277
+ - name: BLEU
278
+ type: bleu
279
+ value: 27.8
280
+ - name: chr-F
281
+ type: chrf
282
+ value: 0.59255
283
+ - task:
284
+ name: Translation eng-cat
285
+ type: translation
286
+ args: eng-cat
287
+ dataset:
288
+ name: flores200-devtest
289
+ type: flores200-devtest
290
+ args: eng-cat
291
+ metrics:
292
+ - name: BLEU
293
+ type: bleu
294
+ value: 42.8
295
+ - name: chr-F
296
+ type: chrf
297
+ value: 0.66809
298
+ - task:
299
+ name: Translation eng-fra
300
+ type: translation
301
+ args: eng-fra
302
+ dataset:
303
+ name: flores200-devtest
304
+ type: flores200-devtest
305
+ args: eng-fra
306
+ metrics:
307
+ - name: BLEU
308
+ type: bleu
309
+ value: 49.5
310
+ - name: chr-F
311
+ type: chrf
312
+ value: 0.71001
313
+ - task:
314
+ name: Translation eng-fur
315
+ type: translation
316
+ args: eng-fur
317
+ dataset:
318
+ name: flores200-devtest
319
+ type: flores200-devtest
320
+ args: eng-fur
321
+ metrics:
322
+ - name: BLEU
323
+ type: bleu
324
+ value: 23.0
325
+ - name: chr-F
326
+ type: chrf
327
+ value: 0.49164
328
+ - task:
329
+ name: Translation eng-glg
330
+ type: translation
331
+ args: eng-glg
332
+ dataset:
333
+ name: flores200-devtest
334
+ type: flores200-devtest
335
+ args: eng-glg
336
+ metrics:
337
+ - name: BLEU
338
+ type: bleu
339
+ value: 36.1
340
+ - name: chr-F
341
+ type: chrf
342
+ value: 0.62349
343
+ - task:
344
+ name: Translation eng-hat
345
+ type: translation
346
+ args: eng-hat
347
+ dataset:
348
+ name: flores200-devtest
349
+ type: flores200-devtest
350
+ args: eng-hat
351
+ metrics:
352
+ - name: BLEU
353
+ type: bleu
354
+ value: 21.3
355
+ - name: chr-F
356
+ type: chrf
357
+ value: 0.51720
358
+ - task:
359
+ name: Translation eng-ita
360
+ type: translation
361
+ args: eng-ita
362
+ dataset:
363
+ name: flores200-devtest
364
+ type: flores200-devtest
365
+ args: eng-ita
366
+ metrics:
367
+ - name: BLEU
368
+ type: bleu
369
+ value: 29.7
370
+ - name: chr-F
371
+ type: chrf
372
+ value: 0.58898
373
+ - task:
374
+ name: Translation eng-kea
375
+ type: translation
376
+ args: eng-kea
377
+ dataset:
378
+ name: flores200-devtest
379
+ type: flores200-devtest
380
+ args: eng-kea
381
+ metrics:
382
+ - name: BLEU
383
+ type: bleu
384
+ value: 11.0
385
+ - name: chr-F
386
+ type: chrf
387
+ value: 0.34963
388
+ - task:
389
+ name: Translation eng-lij
390
+ type: translation
391
+ args: eng-lij
392
+ dataset:
393
+ name: flores200-devtest
394
+ type: flores200-devtest
395
+ args: eng-lij
396
+ metrics:
397
+ - name: BLEU
398
+ type: bleu
399
+ value: 14.8
400
+ - name: chr-F
401
+ type: chrf
402
+ value: 0.43644
403
+ - task:
404
+ name: Translation eng-oci
405
+ type: translation
406
+ args: eng-oci
407
+ dataset:
408
+ name: flores200-devtest
409
+ type: flores200-devtest
410
+ args: eng-oci
411
+ metrics:
412
+ - name: BLEU
413
+ type: bleu
414
+ value: 35.2
415
+ - name: chr-F
416
+ type: chrf
417
+ value: 0.63245
418
+ - task:
419
+ name: Translation eng-pap
420
+ type: translation
421
+ args: eng-pap
422
+ dataset:
423
+ name: flores200-devtest
424
+ type: flores200-devtest
425
+ args: eng-pap
426
+ metrics:
427
+ - name: BLEU
428
+ type: bleu
429
+ value: 30.4
430
+ - name: chr-F
431
+ type: chrf
432
+ value: 0.56775
433
+ - task:
434
+ name: Translation eng-por
435
+ type: translation
436
+ args: eng-por
437
+ dataset:
438
+ name: flores200-devtest
439
+ type: flores200-devtest
440
+ args: eng-por
441
+ metrics:
442
+ - name: BLEU
443
+ type: bleu
444
+ value: 50.0
445
+ - name: chr-F
446
+ type: chrf
447
+ value: 0.71438
448
+ - task:
449
+ name: Translation eng-ron
450
+ type: translation
451
+ args: eng-ron
452
+ dataset:
453
+ name: flores200-devtest
454
+ type: flores200-devtest
455
+ args: eng-ron
456
+ metrics:
457
+ - name: BLEU
458
+ type: bleu
459
+ value: 41.2
460
+ - name: chr-F
461
+ type: chrf
462
+ value: 0.65373
463
+ - task:
464
+ name: Translation eng-spa
465
+ type: translation
466
+ args: eng-spa
467
+ dataset:
468
+ name: flores200-devtest
469
+ type: flores200-devtest
470
+ args: eng-spa
471
+ metrics:
472
+ - name: BLEU
473
+ type: bleu
474
+ value: 27.6
475
+ - name: chr-F
476
+ type: chrf
477
+ value: 0.55784
478
+ - task:
479
+ name: Translation eng-srd
480
+ type: translation
481
+ args: eng-srd
482
+ dataset:
483
+ name: flores200-devtest
484
+ type: flores200-devtest
485
+ args: eng-srd
486
+ metrics:
487
+ - name: BLEU
488
+ type: bleu
489
+ value: 21.0
490
+ - name: chr-F
491
+ type: chrf
492
+ value: 0.49876
493
+ - task:
494
+ name: Translation fra-ast
495
+ type: translation
496
+ args: fra-ast
497
+ dataset:
498
+ name: flores200-devtest
499
+ type: flores200-devtest
500
+ args: fra-ast
501
+ metrics:
502
+ - name: BLEU
503
+ type: bleu
504
+ value: 22.0
505
+ - name: chr-F
506
+ type: chrf
507
+ value: 0.53904
508
+ - task:
509
+ name: Translation fra-cat
510
+ type: translation
511
+ args: fra-cat
512
+ dataset:
513
+ name: flores200-devtest
514
+ type: flores200-devtest
515
+ args: fra-cat
516
+ metrics:
517
+ - name: BLEU
518
+ type: bleu
519
+ value: 34.5
520
+ - name: chr-F
521
+ type: chrf
522
+ value: 0.60549
523
+ - task:
524
+ name: Translation fra-fur
525
+ type: translation
526
+ args: fra-fur
527
+ dataset:
528
+ name: flores200-devtest
529
+ type: flores200-devtest
530
+ args: fra-fur
531
+ metrics:
532
+ - name: BLEU
533
+ type: bleu
534
+ value: 21.4
535
+ - name: chr-F
536
+ type: chrf
537
+ value: 0.49119
538
+ - task:
539
+ name: Translation fra-glg
540
+ type: translation
541
+ args: fra-glg
542
+ dataset:
543
+ name: flores200-devtest
544
+ type: flores200-devtest
545
+ args: fra-glg
546
+ metrics:
547
+ - name: BLEU
548
+ type: bleu
549
+ value: 31.3
550
+ - name: chr-F
551
+ type: chrf
552
+ value: 0.57998
553
+ - task:
554
+ name: Translation fra-hat
555
+ type: translation
556
+ args: fra-hat
557
+ dataset:
558
+ name: flores200-devtest
559
+ type: flores200-devtest
560
+ args: fra-hat
561
+ metrics:
562
+ - name: BLEU
563
+ type: bleu
564
+ value: 20.7
565
+ - name: chr-F
566
+ type: chrf
567
+ value: 0.52018
568
+ - task:
569
+ name: Translation fra-ita
570
+ type: translation
571
+ args: fra-ita
572
+ dataset:
573
+ name: flores200-devtest
574
+ type: flores200-devtest
575
+ args: fra-ita
576
+ metrics:
577
+ - name: BLEU
578
+ type: bleu
579
+ value: 27.0
580
+ - name: chr-F
581
+ type: chrf
582
+ value: 0.56470
583
+ - task:
584
+ name: Translation fra-kea
585
+ type: translation
586
+ args: fra-kea
587
+ dataset:
588
+ name: flores200-devtest
589
+ type: flores200-devtest
590
+ args: fra-kea
591
+ metrics:
592
+ - name: BLEU
593
+ type: bleu
594
+ value: 11.2
595
+ - name: chr-F
596
+ type: chrf
597
+ value: 0.38741
598
+ - task:
599
+ name: Translation fra-lij
600
+ type: translation
601
+ args: fra-lij
602
+ dataset:
603
+ name: flores200-devtest
604
+ type: flores200-devtest
605
+ args: fra-lij
606
+ metrics:
607
+ - name: BLEU
608
+ type: bleu
609
+ value: 13.6
610
+ - name: chr-F
611
+ type: chrf
612
+ value: 0.43180
613
+ - task:
614
+ name: Translation fra-oci
615
+ type: translation
616
+ args: fra-oci
617
+ dataset:
618
+ name: flores200-devtest
619
+ type: flores200-devtest
620
+ args: fra-oci
621
+ metrics:
622
+ - name: BLEU
623
+ type: bleu
624
+ value: 29.2
625
+ - name: chr-F
626
+ type: chrf
627
+ value: 0.58268
628
+ - task:
629
+ name: Translation fra-pap
630
+ type: translation
631
+ args: fra-pap
632
+ dataset:
633
+ name: flores200-devtest
634
+ type: flores200-devtest
635
+ args: fra-pap
636
+ metrics:
637
+ - name: BLEU
638
+ type: bleu
639
+ value: 23.6
640
+ - name: chr-F
641
+ type: chrf
642
+ value: 0.51029
643
+ - task:
644
+ name: Translation fra-por
645
+ type: translation
646
+ args: fra-por
647
+ dataset:
648
+ name: flores200-devtest
649
+ type: flores200-devtest
650
+ args: fra-por
651
+ metrics:
652
+ - name: BLEU
653
+ type: bleu
654
+ value: 37.5
655
+ - name: chr-F
656
+ type: chrf
657
+ value: 0.62540
658
+ - task:
659
+ name: Translation fra-ron
660
+ type: translation
661
+ args: fra-ron
662
+ dataset:
663
+ name: flores200-devtest
664
+ type: flores200-devtest
665
+ args: fra-ron
666
+ metrics:
667
+ - name: BLEU
668
+ type: bleu
669
+ value: 32.7
670
+ - name: chr-F
671
+ type: chrf
672
+ value: 0.59255
673
+ - task:
674
+ name: Translation fra-spa
675
+ type: translation
676
+ args: fra-spa
677
+ dataset:
678
+ name: flores200-devtest
679
+ type: flores200-devtest
680
+ args: fra-spa
681
+ metrics:
682
+ - name: BLEU
683
+ type: bleu
684
+ value: 24.4
685
+ - name: chr-F
686
+ type: chrf
687
+ value: 0.53001
688
+ - task:
689
+ name: Translation fra-srd
690
+ type: translation
691
+ args: fra-srd
692
+ dataset:
693
+ name: flores200-devtest
694
+ type: flores200-devtest
695
+ args: fra-srd
696
+ metrics:
697
+ - name: BLEU
698
+ type: bleu
699
+ value: 17.9
700
+ - name: chr-F
701
+ type: chrf
702
+ value: 0.47645
703
+ - task:
704
+ name: Translation por-ast
705
+ type: translation
706
+ args: por-ast
707
+ dataset:
708
+ name: flores200-devtest
709
+ type: flores200-devtest
710
+ args: por-ast
711
+ metrics:
712
+ - name: BLEU
713
+ type: bleu
714
+ value: 23.9
715
+ - name: chr-F
716
+ type: chrf
717
+ value: 0.55369
718
+ - task:
719
+ name: Translation por-cat
720
+ type: translation
721
+ args: por-cat
722
+ dataset:
723
+ name: flores200-devtest
724
+ type: flores200-devtest
725
+ args: por-cat
726
+ metrics:
727
+ - name: BLEU
728
+ type: bleu
729
+ value: 36.4
730
+ - name: chr-F
731
+ type: chrf
732
+ value: 0.61981
733
+ - task:
734
+ name: Translation por-fra
735
+ type: translation
736
+ args: por-fra
737
+ dataset:
738
+ name: flores200-devtest
739
+ type: flores200-devtest
740
+ args: por-fra
741
+ metrics:
742
+ - name: BLEU
743
+ type: bleu
744
+ value: 40.4
745
+ - name: chr-F
746
+ type: chrf
747
+ value: 0.64654
748
+ - task:
749
+ name: Translation por-fur
750
+ type: translation
751
+ args: por-fur
752
+ dataset:
753
+ name: flores200-devtest
754
+ type: flores200-devtest
755
+ args: por-fur
756
+ metrics:
757
+ - name: BLEU
758
+ type: bleu
759
+ value: 22.1
760
+ - name: chr-F
761
+ type: chrf
762
+ value: 0.50078
763
+ - task:
764
+ name: Translation por-glg
765
+ type: translation
766
+ args: por-glg
767
+ dataset:
768
+ name: flores200-devtest
769
+ type: flores200-devtest
770
+ args: por-glg
771
+ metrics:
772
+ - name: BLEU
773
+ type: bleu
774
+ value: 31.1
775
+ - name: chr-F
776
+ type: chrf
777
+ value: 0.58336
778
+ - task:
779
+ name: Translation por-hat
780
+ type: translation
781
+ args: por-hat
782
+ dataset:
783
+ name: flores200-devtest
784
+ type: flores200-devtest
785
+ args: por-hat
786
+ metrics:
787
+ - name: BLEU
788
+ type: bleu
789
+ value: 18.0
790
+ - name: chr-F
791
+ type: chrf
792
+ value: 0.48834
793
+ - task:
794
+ name: Translation por-ita
795
+ type: translation
796
+ args: por-ita
797
+ dataset:
798
+ name: flores200-devtest
799
+ type: flores200-devtest
800
+ args: por-ita
801
+ metrics:
802
+ - name: BLEU
803
+ type: bleu
804
+ value: 26.7
805
+ - name: chr-F
806
+ type: chrf
807
+ value: 0.56077
808
+ - task:
809
+ name: Translation por-kea
810
+ type: translation
811
+ args: por-kea
812
+ dataset:
813
+ name: flores200-devtest
814
+ type: flores200-devtest
815
+ args: por-kea
816
+ metrics:
817
+ - name: BLEU
818
+ type: bleu
819
+ value: 13.6
820
+ - name: chr-F
821
+ type: chrf
822
+ value: 0.42451
823
+ - task:
824
+ name: Translation por-lij
825
+ type: translation
826
+ args: por-lij
827
+ dataset:
828
+ name: flores200-devtest
829
+ type: flores200-devtest
830
+ args: por-lij
831
+ metrics:
832
+ - name: BLEU
833
+ type: bleu
834
+ value: 13.4
835
+ - name: chr-F
836
+ type: chrf
837
+ value: 0.43715
838
+ - task:
839
+ name: Translation por-oci
840
+ type: translation
841
+ args: por-oci
842
+ dataset:
843
+ name: flores200-devtest
844
+ type: flores200-devtest
845
+ args: por-oci
846
+ metrics:
847
+ - name: BLEU
848
+ type: bleu
849
+ value: 28.1
850
+ - name: chr-F
851
+ type: chrf
852
+ value: 0.57143
853
+ - task:
854
+ name: Translation por-pap
855
+ type: translation
856
+ args: por-pap
857
+ dataset:
858
+ name: flores200-devtest
859
+ type: flores200-devtest
860
+ args: por-pap
861
+ metrics:
862
+ - name: BLEU
863
+ type: bleu
864
+ value: 25.0
865
+ - name: chr-F
866
+ type: chrf
867
+ value: 0.52192
868
+ - task:
869
+ name: Translation por-ron
870
+ type: translation
871
+ args: por-ron
872
+ dataset:
873
+ name: flores200-devtest
874
+ type: flores200-devtest
875
+ args: por-ron
876
+ metrics:
877
+ - name: BLEU
878
+ type: bleu
879
+ value: 34.2
880
+ - name: chr-F
881
+ type: chrf
882
+ value: 0.59962
883
+ - task:
884
+ name: Translation por-spa
885
+ type: translation
886
+ args: por-spa
887
+ dataset:
888
+ name: flores200-devtest
889
+ type: flores200-devtest
890
+ args: por-spa
891
+ metrics:
892
+ - name: BLEU
893
+ type: bleu
894
+ value: 25.6
895
+ - name: chr-F
896
+ type: chrf
897
+ value: 0.53772
898
+ - task:
899
+ name: Translation por-srd
900
+ type: translation
901
+ args: por-srd
902
+ dataset:
903
+ name: flores200-devtest
904
+ type: flores200-devtest
905
+ args: por-srd
906
+ metrics:
907
+ - name: BLEU
908
+ type: bleu
909
+ value: 18.8
910
+ - name: chr-F
911
+ type: chrf
912
+ value: 0.48882
913
+ - task:
914
+ name: Translation spa-ast
915
+ type: translation
916
+ args: spa-ast
917
+ dataset:
918
+ name: flores200-devtest
919
+ type: flores200-devtest
920
+ args: spa-ast
921
+ metrics:
922
+ - name: BLEU
923
+ type: bleu
924
+ value: 16.3
925
+ - name: chr-F
926
+ type: chrf
927
+ value: 0.49512
928
+ - task:
929
+ name: Translation spa-cat
930
+ type: translation
931
+ args: spa-cat
932
+ dataset:
933
+ name: flores200-devtest
934
+ type: flores200-devtest
935
+ args: spa-cat
936
+ metrics:
937
+ - name: BLEU
938
+ type: bleu
939
+ value: 23.1
940
+ - name: chr-F
941
+ type: chrf
942
+ value: 0.53968
943
+ - task:
944
+ name: Translation spa-fra
945
+ type: translation
946
+ args: spa-fra
947
+ dataset:
948
+ name: flores200-devtest
949
+ type: flores200-devtest
950
+ args: spa-fra
951
+ metrics:
952
+ - name: BLEU
953
+ type: bleu
954
+ value: 27.9
955
+ - name: chr-F
956
+ type: chrf
957
+ value: 0.57461
958
+ - task:
959
+ name: Translation spa-fur
960
+ type: translation
961
+ args: spa-fur
962
+ dataset:
963
+ name: flores200-devtest
964
+ type: flores200-devtest
965
+ args: spa-fur
966
+ metrics:
967
+ - name: BLEU
968
+ type: bleu
969
+ value: 16.1
970
+ - name: chr-F
971
+ type: chrf
972
+ value: 0.45785
973
+ - task:
974
+ name: Translation spa-glg
975
+ type: translation
976
+ args: spa-glg
977
+ dataset:
978
+ name: flores200-devtest
979
+ type: flores200-devtest
980
+ args: spa-glg
981
+ metrics:
982
+ - name: BLEU
983
+ type: bleu
984
+ value: 22.2
985
+ - name: chr-F
986
+ type: chrf
987
+ value: 0.52933
988
+ - task:
989
+ name: Translation spa-hat
990
+ type: translation
991
+ args: spa-hat
992
+ dataset:
993
+ name: flores200-devtest
994
+ type: flores200-devtest
995
+ args: spa-hat
996
+ metrics:
997
+ - name: BLEU
998
+ type: bleu
999
+ value: 13.0
1000
+ - name: chr-F
1001
+ type: chrf
1002
+ value: 0.44627
1003
+ - task:
1004
+ name: Translation spa-ita
1005
+ type: translation
1006
+ args: spa-ita
1007
+ dataset:
1008
+ name: flores200-devtest
1009
+ type: flores200-devtest
1010
+ args: spa-ita
1011
+ metrics:
1012
+ - name: BLEU
1013
+ type: bleu
1014
+ value: 22.4
1015
+ - name: chr-F
1016
+ type: chrf
1017
+ value: 0.53063
1018
+ - task:
1019
+ name: Translation spa-lij
1020
+ type: translation
1021
+ args: spa-lij
1022
+ dataset:
1023
+ name: flores200-devtest
1024
+ type: flores200-devtest
1025
+ args: spa-lij
1026
+ metrics:
1027
+ - name: BLEU
1028
+ type: bleu
1029
+ value: 10.2
1030
+ - name: chr-F
1031
+ type: chrf
1032
+ value: 0.39784
1033
+ - task:
1034
+ name: Translation spa-oci
1035
+ type: translation
1036
+ args: spa-oci
1037
+ dataset:
1038
+ name: flores200-devtest
1039
+ type: flores200-devtest
1040
+ args: spa-oci
1041
+ metrics:
1042
+ - name: BLEU
1043
+ type: bleu
1044
+ value: 17.4
1045
+ - name: chr-F
1046
+ type: chrf
1047
+ value: 0.49293
1048
+ - task:
1049
+ name: Translation spa-pap
1050
+ type: translation
1051
+ args: spa-pap
1052
+ dataset:
1053
+ name: flores200-devtest
1054
+ type: flores200-devtest
1055
+ args: spa-pap
1056
+ metrics:
1057
+ - name: BLEU
1058
+ type: bleu
1059
+ value: 17.7
1060
+ - name: chr-F
1061
+ type: chrf
1062
+ value: 0.46595
1063
+ - task:
1064
+ name: Translation spa-por
1065
+ type: translation
1066
+ args: spa-por
1067
+ dataset:
1068
+ name: flores200-devtest
1069
+ type: flores200-devtest
1070
+ args: spa-por
1071
+ metrics:
1072
+ - name: BLEU
1073
+ type: bleu
1074
+ value: 25.9
1075
+ - name: chr-F
1076
+ type: chrf
1077
+ value: 0.56138
1078
+ - task:
1079
+ name: Translation spa-ron
1080
+ type: translation
1081
+ args: spa-ron
1082
+ dataset:
1083
+ name: flores200-devtest
1084
+ type: flores200-devtest
1085
+ args: spa-ron
1086
+ metrics:
1087
+ - name: BLEU
1088
+ type: bleu
1089
+ value: 23.8
1090
+ - name: chr-F
1091
+ type: chrf
1092
+ value: 0.53609
1093
+ - task:
1094
+ name: Translation spa-srd
1095
+ type: translation
1096
+ args: spa-srd
1097
+ dataset:
1098
+ name: flores200-devtest
1099
+ type: flores200-devtest
1100
+ args: spa-srd
1101
+ metrics:
1102
+ - name: BLEU
1103
+ type: bleu
1104
+ value: 13.3
1105
+ - name: chr-F
1106
+ type: chrf
1107
+ value: 0.44898
1108
+   - task:
+       name: Translation deu-ast
+       type: translation
+       args: deu-ast
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: deu ast devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 21.5
+     - name: chr-F
+       type: chrf
+       value: 0.53230
+   - task:
+       name: Translation deu-cat
+       type: translation
+       args: deu-cat
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: deu cat devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 31.6
+     - name: chr-F
+       type: chrf
+       value: 0.58466
+   - task:
+       name: Translation deu-fra
+       type: translation
+       args: deu-fra
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: deu fra devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 36.5
+     - name: chr-F
+       type: chrf
+       value: 0.62370
+   - task:
+       name: Translation deu-glg
+       type: translation
+       args: deu-glg
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: deu glg devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 28.0
+     - name: chr-F
+       type: chrf
+       value: 0.55693
+   - task:
+       name: Translation deu-oci
+       type: translation
+       args: deu-oci
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: deu oci devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 22.3
+     - name: chr-F
+       type: chrf
+       value: 0.52253
+   - task:
+       name: Translation deu-por
+       type: translation
+       args: deu-por
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: deu por devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 34.8
+     - name: chr-F
+       type: chrf
+       value: 0.60688
+   - task:
+       name: Translation deu-ron
+       type: translation
+       args: deu-ron
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: deu ron devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 30.3
+     - name: chr-F
+       type: chrf
+       value: 0.57333
+   - task:
+       name: Translation eng-cat
+       type: translation
+       args: eng-cat
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: eng cat devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 42.5
+     - name: chr-F
+       type: chrf
+       value: 0.66607
+   - task:
+       name: Translation eng-fra
+       type: translation
+       args: eng-fra
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: eng fra devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 48.8
+     - name: chr-F
+       type: chrf
+       value: 0.70492
+   - task:
+       name: Translation eng-kea
+       type: translation
+       args: eng-kea
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: eng kea devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 10.7
+     - name: chr-F
+       type: chrf
+       value: 0.34867
+   - task:
+       name: Translation eng-por
+       type: translation
+       args: eng-por
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: eng por devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 49.3
+     - name: chr-F
+       type: chrf
+       value: 0.71112
+   - task:
+       name: Translation eng-ron
+       type: translation
+       args: eng-ron
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: eng ron devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 40.3
+     - name: chr-F
+       type: chrf
+       value: 0.64856
+   - task:
+       name: Translation fra-oci
+       type: translation
+       args: fra-oci
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: fra oci devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 29.2
+     - name: chr-F
+       type: chrf
+       value: 0.58559
+   - task:
+       name: Translation fra-ron
+       type: translation
+       args: fra-ron
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: fra ron devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 32.1
+     - name: chr-F
+       type: chrf
+       value: 0.58922
+   - task:
+       name: Translation por-kea
+       type: translation
+       args: por-kea
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: por kea devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 12.8
+     - name: chr-F
+       type: chrf
+       value: 0.40779
+   - task:
+       name: Translation por-oci
+       type: translation
+       args: por-oci
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: por oci devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 27.5
+     - name: chr-F
+       type: chrf
+       value: 0.57016
+   - task:
+       name: Translation spa-ast
+       type: translation
+       args: spa-ast
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: spa ast devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 16.3
+     - name: chr-F
+       type: chrf
+       value: 0.49666
+   - task:
+       name: Translation spa-cat
+       type: translation
+       args: spa-cat
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: spa cat devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 23.2
+     - name: chr-F
+       type: chrf
+       value: 0.54015
+   - task:
+       name: Translation spa-glg
+       type: translation
+       args: spa-glg
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: spa glg devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 22.1
+     - name: chr-F
+       type: chrf
+       value: 0.52923
+   - task:
+       name: Translation spa-oci
+       type: translation
+       args: spa-oci
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: spa oci devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 17.2
+     - name: chr-F
+       type: chrf
+       value: 0.49285
+   - task:
+       name: Translation spa-por
+       type: translation
+       args: spa-por
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: spa por devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 25.7
+     - name: chr-F
+       type: chrf
+       value: 0.55944
+   - task:
+       name: Translation spa-ron
+       type: translation
+       args: spa-ron
+     dataset:
+       name: flores101-devtest
+       type: flores_101
+       args: spa ron devtest
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 23.3
+     - name: chr-F
+       type: chrf
+       value: 0.53282
+   - task:
+       name: Translation deu-fra
+       type: translation
+       args: deu-fra
+     dataset:
+       name: generaltest2022
+       type: generaltest2022
+       args: deu-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 37.4
+     - name: chr-F
+       type: chrf
+       value: 0.60634
+   - task:
+       name: Translation deu-fra
+       type: translation
+       args: deu-fra
+     dataset:
+       name: multi30k_test_2016_flickr
+       type: multi30k-2016_flickr
+       args: deu-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 38.5
+     - name: chr-F
+       type: chrf
+       value: 0.62595
+   - task:
+       name: Translation eng-fra
+       type: translation
+       args: eng-fra
+     dataset:
+       name: multi30k_test_2016_flickr
+       type: multi30k-2016_flickr
+       args: eng-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 51.4
+     - name: chr-F
+       type: chrf
+       value: 0.71630
+   - task:
+       name: Translation deu-fra
+       type: translation
+       args: deu-fra
+     dataset:
+       name: multi30k_test_2017_flickr
+       type: multi30k-2017_flickr
+       args: deu-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 37.3
+     - name: chr-F
+       type: chrf
+       value: 0.62733
+   - task:
+       name: Translation eng-fra
+       type: translation
+       args: eng-fra
+     dataset:
+       name: multi30k_test_2017_flickr
+       type: multi30k-2017_flickr
+       args: eng-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 50.8
+     - name: chr-F
+       type: chrf
+       value: 0.71850
+   - task:
+       name: Translation deu-fra
+       type: translation
+       args: deu-fra
+     dataset:
+       name: multi30k_test_2017_mscoco
+       type: multi30k-2017_mscoco
+       args: deu-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 33.8
+     - name: chr-F
+       type: chrf
+       value: 0.59089
+   - task:
+       name: Translation eng-fra
+       type: translation
+       args: eng-fra
+     dataset:
+       name: multi30k_test_2017_mscoco
+       type: multi30k-2017_mscoco
+       args: eng-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 54.1
+     - name: chr-F
+       type: chrf
+       value: 0.73129
+   - task:
+       name: Translation deu-fra
+       type: translation
+       args: deu-fra
+     dataset:
+       name: multi30k_test_2018_flickr
+       type: multi30k-2018_flickr
+       args: deu-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 30.9
+     - name: chr-F
+       type: chrf
+       value: 0.57155
+   - task:
+       name: Translation eng-fra
+       type: translation
+       args: eng-fra
+     dataset:
+       name: multi30k_test_2018_flickr
+       type: multi30k-2018_flickr
+       args: eng-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 41.9
+     - name: chr-F
+       type: chrf
+       value: 0.65461
+   - task:
+       name: Translation eng-fra
+       type: translation
+       args: eng-fra
+     dataset:
+       name: newsdiscusstest2015
+       type: newsdiscusstest2015
+       args: eng-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 38.5
+     - name: chr-F
+       type: chrf
+       value: 0.63660
+   - task:
+       name: Translation deu-cat
+       type: translation
+       args: deu-cat
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: deu-cat
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 28.2
+     - name: chr-F
+       type: chrf
+       value: 0.55033
+   - task:
+       name: Translation deu-fra
+       type: translation
+       args: deu-fra
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: deu-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 28.5
+     - name: chr-F
+       type: chrf
+       value: 0.55854
+   - task:
+       name: Translation deu-glg
+       type: translation
+       args: deu-glg
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: deu-glg
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 27.8
+     - name: chr-F
+       type: chrf
+       value: 0.55034
+   - task:
+       name: Translation deu-ita
+       type: translation
+       args: deu-ita
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: deu-ita
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 26.6
+     - name: chr-F
+       type: chrf
+       value: 0.55733
+   - task:
+       name: Translation deu-por
+       type: translation
+       args: deu-por
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: deu-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 26.0
+     - name: chr-F
+       type: chrf
+       value: 0.54208
+   - task:
+       name: Translation deu-ron
+       type: translation
+       args: deu-ron
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: deu-ron
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 26.6
+     - name: chr-F
+       type: chrf
+       value: 0.52839
+   - task:
+       name: Translation deu-spa
+       type: translation
+       args: deu-spa
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: deu-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 30.8
+     - name: chr-F
+       type: chrf
+       value: 0.56966
+   - task:
+       name: Translation eng-cat
+       type: translation
+       args: eng-cat
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: eng-cat
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 36.3
+     - name: chr-F
+       type: chrf
+       value: 0.61431
+   - task:
+       name: Translation eng-fra
+       type: translation
+       args: eng-fra
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: eng-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 35.5
+     - name: chr-F
+       type: chrf
+       value: 0.61695
+   - task:
+       name: Translation eng-glg
+       type: translation
+       args: eng-glg
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: eng-glg
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 37.2
+     - name: chr-F
+       type: chrf
+       value: 0.62390
+   - task:
+       name: Translation eng-ita
+       type: translation
+       args: eng-ita
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: eng-ita
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 36.1
+     - name: chr-F
+       type: chrf
+       value: 0.62209
+   - task:
+       name: Translation eng-por
+       type: translation
+       args: eng-por
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: eng-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 33.5
+     - name: chr-F
+       type: chrf
+       value: 0.59859
+   - task:
+       name: Translation eng-ron
+       type: translation
+       args: eng-ron
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: eng-ron
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 33.4
+     - name: chr-F
+       type: chrf
+       value: 0.58128
+   - task:
+       name: Translation eng-spa
+       type: translation
+       args: eng-spa
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: eng-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 40.3
+     - name: chr-F
+       type: chrf
+       value: 0.64099
+   - task:
+       name: Translation fra-cat
+       type: translation
+       args: fra-cat
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: fra-cat
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 28.1
+     - name: chr-F
+       type: chrf
+       value: 0.55093
+   - task:
+       name: Translation fra-glg
+       type: translation
+       args: fra-glg
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: fra-glg
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 28.0
+     - name: chr-F
+       type: chrf
+       value: 0.55325
+   - task:
+       name: Translation fra-ita
+       type: translation
+       args: fra-ita
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: fra-ita
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 27.4
+     - name: chr-F
+       type: chrf
+       value: 0.56188
+   - task:
+       name: Translation fra-por
+       type: translation
+       args: fra-por
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: fra-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 25.6
+     - name: chr-F
+       type: chrf
+       value: 0.54001
+   - task:
+       name: Translation fra-ron
+       type: translation
+       args: fra-ron
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: fra-ron
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 24.8
+     - name: chr-F
+       type: chrf
+       value: 0.51853
+   - task:
+       name: Translation fra-spa
+       type: translation
+       args: fra-spa
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: fra-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 31.0
+     - name: chr-F
+       type: chrf
+       value: 0.57116
+   - task:
+       name: Translation por-cat
+       type: translation
+       args: por-cat
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: por-cat
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 31.6
+     - name: chr-F
+       type: chrf
+       value: 0.57962
+   - task:
+       name: Translation por-fra
+       type: translation
+       args: por-fra
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: por-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 28.9
+     - name: chr-F
+       type: chrf
+       value: 0.56910
+   - task:
+       name: Translation por-glg
+       type: translation
+       args: por-glg
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: por-glg
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 30.3
+     - name: chr-F
+       type: chrf
+       value: 0.57389
+   - task:
+       name: Translation por-ita
+       type: translation
+       args: por-ita
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: por-ita
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 30.6
+     - name: chr-F
+       type: chrf
+       value: 0.58788
+   - task:
+       name: Translation por-ron
+       type: translation
+       args: por-ron
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: por-ron
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 28.0
+     - name: chr-F
+       type: chrf
+       value: 0.54276
+   - task:
+       name: Translation por-spa
+       type: translation
+       args: por-spa
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: por-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 34.2
+     - name: chr-F
+       type: chrf
+       value: 0.59565
+   - task:
+       name: Translation spa-cat
+       type: translation
+       args: spa-cat
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: spa-cat
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 34.0
+     - name: chr-F
+       type: chrf
+       value: 0.60605
+   - task:
+       name: Translation spa-fra
+       type: translation
+       args: spa-fra
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: spa-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 29.6
+     - name: chr-F
+       type: chrf
+       value: 0.57501
+   - task:
+       name: Translation spa-glg
+       type: translation
+       args: spa-glg
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: spa-glg
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 34.4
+     - name: chr-F
+       type: chrf
+       value: 0.61300
+   - task:
+       name: Translation spa-ita
+       type: translation
+       args: spa-ita
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: spa-ita
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 28.9
+     - name: chr-F
+       type: chrf
+       value: 0.57868
+   - task:
+       name: Translation spa-por
+       type: translation
+       args: spa-por
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: spa-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 29.1
+     - name: chr-F
+       type: chrf
+       value: 0.56730
+   - task:
+       name: Translation spa-ron
+       type: translation
+       args: spa-ron
+     dataset:
+       name: ntrex128
+       type: ntrex128
+       args: spa-ron
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 27.9
+     - name: chr-F
+       type: chrf
+       value: 0.54222
+   - task:
+       name: Translation deu-cat
+       type: translation
+       args: deu-cat
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: deu-cat
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 44.3
+     - name: chr-F
+       type: chrf
+       value: 0.63465
+   - task:
+       name: Translation deu-fra
+       type: translation
+       args: deu-fra
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: deu-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 50.7
+     - name: chr-F
+       type: chrf
+       value: 0.68258
+   - task:
+       name: Translation deu-ita
+       type: translation
+       args: deu-ita
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: deu-ita
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 47.4
+     - name: chr-F
+       type: chrf
+       value: 0.68502
+   - task:
+       name: Translation deu-lad
+       type: translation
+       args: deu-lad
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: deu-lad
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 22.0
+     - name: chr-F
+       type: chrf
+       value: 0.38047
+   - task:
+       name: Translation deu-por
+       type: translation
+       args: deu-por
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: deu-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 43.1
+     - name: chr-F
+       type: chrf
+       value: 0.63684
+   - task:
+       name: Translation deu-ron
+       type: translation
+       args: deu-ron
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: deu-ron
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 42.6
+     - name: chr-F
+       type: chrf
+       value: 0.64207
+   - task:
+       name: Translation deu-spa
+       type: translation
+       args: deu-spa
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: deu-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 49.4
+     - name: chr-F
+       type: chrf
+       value: 0.68333
+   - task:
+       name: Translation eng-cat
+       type: translation
+       args: eng-cat
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: eng-cat
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 49.1
+     - name: chr-F
+       type: chrf
+       value: 0.67724
+   - task:
+       name: Translation eng-fra
+       type: translation
+       args: eng-fra
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: eng-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 51.6
+     - name: chr-F
+       type: chrf
+       value: 0.68777
+   - task:
+       name: Translation eng-glg
+       type: translation
+       args: eng-glg
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: eng-glg
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 45.2
+     - name: chr-F
+       type: chrf
+       value: 0.64530
+   - task:
+       name: Translation eng-ita
+       type: translation
+       args: eng-ita
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: eng-ita
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 53.3
+     - name: chr-F
+       type: chrf
+       value: 0.72115
+   - task:
+       name: Translation eng-lad
+       type: translation
+       args: eng-lad
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: eng-lad
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 24.2
+     - name: chr-F
+       type: chrf
+       value: 0.43857
+   - task:
+       name: Translation eng-lad_Latn
+       type: translation
+       args: eng-lad_Latn
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: eng-lad_Latn
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 27.6
+     - name: chr-F
+       type: chrf
+       value: 0.50848
+   - task:
+       name: Translation eng-lat
+       type: translation
+       args: eng-lat
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: eng-lat
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 20.0
+     - name: chr-F
+       type: chrf
+       value: 0.45710
+   - task:
+       name: Translation eng-por
+       type: translation
+       args: eng-por
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: eng-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 53.4
+     - name: chr-F
+       type: chrf
+       value: 0.72159
+   - task:
+       name: Translation eng-ron
+       type: translation
+       args: eng-ron
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: eng-ron
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 47.1
+     - name: chr-F
+       type: chrf
+       value: 0.67835
+   - task:
+       name: Translation eng-spa
+       type: translation
+       args: eng-spa
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: eng-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 55.8
+     - name: chr-F
+       type: chrf
+       value: 0.72875
+   - task:
+       name: Translation fra-cat
+       type: translation
+       args: fra-cat
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: fra-cat
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 44.6
+     - name: chr-F
+       type: chrf
+       value: 0.65547
+   - task:
+       name: Translation fra-fra
+       type: translation
+       args: fra-fra
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: fra-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 39.9
+     - name: chr-F
+       type: chrf
+       value: 0.61650
+   - task:
+       name: Translation fra-ita
+       type: translation
+       args: fra-ita
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: fra-ita
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 53.5
+     - name: chr-F
+       type: chrf
+       value: 0.72739
+   - task:
+       name: Translation fra-por
+       type: translation
+       args: fra-por
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: fra-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 52.0
+     - name: chr-F
+       type: chrf
+       value: 0.70655
+   - task:
+       name: Translation fra-ron
+       type: translation
+       args: fra-ron
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: fra-ron
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 43.7
+     - name: chr-F
+       type: chrf
+       value: 0.65399
+   - task:
+       name: Translation fra-spa
+       type: translation
+       args: fra-spa
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: fra-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 54.8
+     - name: chr-F
+       type: chrf
+       value: 0.72083
+   - task:
+       name: Translation multi-multi
+       type: translation
+       args: multi-multi
+     dataset:
+       name: tatoeba-test-v2020-07-28-v2023-09-26
+       type: tatoeba_mt
+       args: multi-multi
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 49.7
+     - name: chr-F
+       type: chrf
+       value: 0.67768
+   - task:
+       name: Translation por-cat
+       type: translation
+       args: por-cat
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: por-cat
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 52.0
+     - name: chr-F
+       type: chrf
+       value: 0.71178
+   - task:
+       name: Translation por-fra
+       type: translation
+       args: por-fra
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: por-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 60.4
+     - name: chr-F
+       type: chrf
+       value: 0.75691
+   - task:
+       name: Translation por-glg
+       type: translation
+       args: por-glg
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: por-glg
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 57.6
+     - name: chr-F
+       type: chrf
+       value: 0.74818
+   - task:
+       name: Translation por-ita
+       type: translation
+       args: por-ita
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: por-ita
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 58.7
+     - name: chr-F
+       type: chrf
+       value: 0.76899
+   - task:
+       name: Translation por-por
+       type: translation
+       args: por-por
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: por-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 51.0
+     - name: chr-F
+       type: chrf
+       value: 0.71775
+   - task:
+       name: Translation por-ron
+       type: translation
+       args: por-ron
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: por-ron
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 47.8
+     - name: chr-F
+       type: chrf
+       value: 0.69517
+   - task:
+       name: Translation por-spa
+       type: translation
+       args: por-spa
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: por-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 64.9
+     - name: chr-F
+       type: chrf
+       value: 0.79442
+   - task:
+       name: Translation spa-cat
+       type: translation
+       args: spa-cat
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: spa-cat
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 66.3
+     - name: chr-F
+       type: chrf
+       value: 0.81845
+   - task:
+       name: Translation spa-fra
+       type: translation
+       args: spa-fra
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: spa-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 57.4
+     - name: chr-F
+       type: chrf
+       value: 0.73277
+   - task:
+       name: Translation spa-glg
+       type: translation
+       args: spa-glg
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: spa-glg
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 61.5
+     - name: chr-F
+       type: chrf
+       value: 0.76118
+   - task:
+       name: Translation spa-ita
+       type: translation
+       args: spa-ita
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: spa-ita
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 59.5
+     - name: chr-F
+       type: chrf
+       value: 0.76742
+   - task:
+       name: Translation spa-lad
+       type: translation
+       args: spa-lad
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: spa-lad
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 23.4
+     - name: chr-F
+       type: chrf
+       value: 0.43064
+   - task:
+       name: Translation spa-lad_Latn
+       type: translation
+       args: spa-lad_Latn
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: spa-lad_Latn
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 27.1
+     - name: chr-F
+       type: chrf
+       value: 0.50795
+   - task:
+       name: Translation spa-por
+       type: translation
+       args: spa-por
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: spa-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 60.7
+     - name: chr-F
+       type: chrf
+       value: 0.76951
+   - task:
+       name: Translation spa-ron
+       type: translation
+       args: spa-ron
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: spa-ron
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 45.9
+     - name: chr-F
+       type: chrf
+       value: 0.67782
+   - task:
+       name: Translation spa-spa
+       type: translation
+       args: spa-spa
+     dataset:
+       name: tatoeba-test-v2021-08-07
+       type: tatoeba_mt
+       args: spa-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 49.6
+     - name: chr-F
+       type: chrf
+       value: 0.67346
  - task:
      name: Translation eng-fra
      type: translation
      args: eng-fra
    dataset:
      name: tico19-test
      type: tico19-test
      args: eng-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 40.1
    - name: chr-F
      type: chrf
      value: 0.62989
  - task:
      name: Translation eng-por
      type: translation
      args: eng-por
    dataset:
      name: tico19-test
      type: tico19-test
      args: eng-por
    metrics:
    - name: BLEU
      type: bleu
      value: 50.0
    - name: chr-F
      type: chrf
      value: 0.72708
  - task:
      name: Translation eng-spa
      type: translation
      args: eng-spa
    dataset:
      name: tico19-test
      type: tico19-test
      args: eng-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 52.0
    - name: chr-F
      type: chrf
      value: 0.73154
  - task:
      name: Translation fra-por
      type: translation
      args: fra-por
    dataset:
      name: tico19-test
      type: tico19-test
      args: fra-por
    metrics:
    - name: BLEU
      type: bleu
      value: 34.1
    - name: chr-F
      type: chrf
      value: 0.58383
  - task:
      name: Translation fra-spa
      type: translation
      args: fra-spa
    dataset:
      name: tico19-test
      type: tico19-test
      args: fra-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 37.0
    - name: chr-F
      type: chrf
      value: 0.59581
  - task:
      name: Translation por-fra
      type: translation
      args: por-fra
    dataset:
      name: tico19-test
      type: tico19-test
      args: por-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 34.4
    - name: chr-F
      type: chrf
      value: 0.59798
  - task:
      name: Translation por-spa
      type: translation
      args: por-spa
    dataset:
      name: tico19-test
      type: tico19-test
      args: por-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 45.4
    - name: chr-F
      type: chrf
      value: 0.68332
  - task:
      name: Translation spa-fra
      type: translation
      args: spa-fra
    dataset:
      name: tico19-test
      type: tico19-test
      args: spa-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 35.5
    - name: chr-F
      type: chrf
      value: 0.60469
  - task:
      name: Translation spa-por
      type: translation
      args: spa-por
    dataset:
      name: tico19-test
      type: tico19-test
      args: spa-por
    metrics:
    - name: BLEU
      type: bleu
      value: 42.8
    - name: chr-F
      type: chrf
      value: 0.67898
2803
  - task:
      name: Translation deu-fra
      type: translation
      args: deu-fra
    dataset:
      name: newstest2008
      type: wmt-2008-news
      args: deu-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 26.3
    - name: chr-F
      type: chrf
      value: 0.54926
  - task:
      name: Translation deu-spa
      type: translation
      args: deu-spa
    dataset:
      name: newstest2008
      type: wmt-2008-news
      args: deu-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 25.5
    - name: chr-F
      type: chrf
      value: 0.53902
  - task:
      name: Translation eng-fra
      type: translation
      args: eng-fra
    dataset:
      name: newstest2008
      type: wmt-2008-news
      args: eng-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 26.8
    - name: chr-F
      type: chrf
      value: 0.55358
  - task:
      name: Translation eng-spa
      type: translation
      args: eng-spa
    dataset:
      name: newstest2008
      type: wmt-2008-news
      args: eng-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 29.5
    - name: chr-F
      type: chrf
      value: 0.56491
  - task:
      name: Translation fra-spa
      type: translation
      args: fra-spa
    dataset:
      name: newstest2008
      type: wmt-2008-news
      args: fra-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 33.0
    - name: chr-F
      type: chrf
      value: 0.58764
  - task:
      name: Translation spa-fra
      type: translation
      args: spa-fra
    dataset:
      name: newstest2008
      type: wmt-2008-news
      args: spa-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 32.4
    - name: chr-F
      type: chrf
      value: 0.58848
2893
  - task:
      name: Translation deu-fra
      type: translation
      args: deu-fra
    dataset:
      name: newstest2009
      type: wmt-2009-news
      args: deu-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 25.4
    - name: chr-F
      type: chrf
      value: 0.53870
  - task:
      name: Translation deu-ita
      type: translation
      args: deu-ita
    dataset:
      name: newstest2009
      type: wmt-2009-news
      args: deu-ita
    metrics:
    - name: BLEU
      type: bleu
      value: 24.4
    - name: chr-F
      type: chrf
      value: 0.54509
  - task:
      name: Translation deu-spa
      type: translation
      args: deu-spa
    dataset:
      name: newstest2009
      type: wmt-2009-news
      args: deu-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 25.7
    - name: chr-F
      type: chrf
      value: 0.53769
  - task:
      name: Translation eng-fra
      type: translation
      args: eng-fra
    dataset:
      name: newstest2009
      type: wmt-2009-news
      args: eng-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 29.3
    - name: chr-F
      type: chrf
      value: 0.57566
  - task:
      name: Translation eng-ita
      type: translation
      args: eng-ita
    dataset:
      name: newstest2009
      type: wmt-2009-news
      args: eng-ita
    metrics:
    - name: BLEU
      type: bleu
      value: 31.4
    - name: chr-F
      type: chrf
      value: 0.60372
  - task:
      name: Translation eng-spa
      type: translation
      args: eng-spa
    dataset:
      name: newstest2009
      type: wmt-2009-news
      args: eng-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 30.0
    - name: chr-F
      type: chrf
      value: 0.57913
  - task:
      name: Translation fra-ita
      type: translation
      args: fra-ita
    dataset:
      name: newstest2009
      type: wmt-2009-news
      args: fra-ita
    metrics:
    - name: BLEU
      type: bleu
      value: 30.5
    - name: chr-F
      type: chrf
      value: 0.59749
  - task:
      name: Translation fra-spa
      type: translation
      args: fra-spa
    dataset:
      name: newstest2009
      type: wmt-2009-news
      args: fra-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 32.1
    - name: chr-F
      type: chrf
      value: 0.58921
  - task:
      name: Translation spa-fra
      type: translation
      args: spa-fra
    dataset:
      name: newstest2009
      type: wmt-2009-news
      args: spa-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 32.3
    - name: chr-F
      type: chrf
      value: 0.59195
  - task:
      name: Translation spa-ita
      type: translation
      args: spa-ita
    dataset:
      name: newstest2009
      type: wmt-2009-news
      args: spa-ita
    metrics:
    - name: BLEU
      type: bleu
      value: 33.0
    - name: chr-F
      type: chrf
      value: 0.61007
3043
  - task:
      name: Translation deu-fra
      type: translation
      args: deu-fra
    dataset:
      name: newstest2010
      type: wmt-2010-news
      args: deu-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 29.5
    - name: chr-F
      type: chrf
      value: 0.57888
  - task:
      name: Translation deu-spa
      type: translation
      args: deu-spa
    dataset:
      name: newstest2010
      type: wmt-2010-news
      args: deu-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 32.7
    - name: chr-F
      type: chrf
      value: 0.59408
  - task:
      name: Translation eng-fra
      type: translation
      args: eng-fra
    dataset:
      name: newstest2010
      type: wmt-2010-news
      args: eng-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 32.4
    - name: chr-F
      type: chrf
      value: 0.59588
  - task:
      name: Translation eng-spa
      type: translation
      args: eng-spa
    dataset:
      name: newstest2010
      type: wmt-2010-news
      args: eng-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 36.6
    - name: chr-F
      type: chrf
      value: 0.61978
  - task:
      name: Translation fra-spa
      type: translation
      args: fra-spa
    dataset:
      name: newstest2010
      type: wmt-2010-news
      args: fra-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 37.7
    - name: chr-F
      type: chrf
      value: 0.62513
  - task:
      name: Translation spa-fra
      type: translation
      args: spa-fra
    dataset:
      name: newstest2010
      type: wmt-2010-news
      args: spa-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 36.1
    - name: chr-F
      type: chrf
      value: 0.62193
3133
  - task:
      name: Translation deu-fra
      type: translation
      args: deu-fra
    dataset:
      name: newstest2011
      type: wmt-2011-news
      args: deu-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 27.5
    - name: chr-F
      type: chrf
      value: 0.55704
  - task:
      name: Translation deu-spa
      type: translation
      args: deu-spa
    dataset:
      name: newstest2011
      type: wmt-2011-news
      args: deu-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 30.4
    - name: chr-F
      type: chrf
      value: 0.56696
  - task:
      name: Translation eng-fra
      type: translation
      args: eng-fra
    dataset:
      name: newstest2011
      type: wmt-2011-news
      args: eng-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 34.3
    - name: chr-F
      type: chrf
      value: 0.61071
  - task:
      name: Translation eng-spa
      type: translation
      args: eng-spa
    dataset:
      name: newstest2011
      type: wmt-2011-news
      args: eng-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 38.7
    - name: chr-F
      type: chrf
      value: 0.62126
  - task:
      name: Translation fra-spa
      type: translation
      args: fra-spa
    dataset:
      name: newstest2011
      type: wmt-2011-news
      args: fra-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 40.0
    - name: chr-F
      type: chrf
      value: 0.63139
  - task:
      name: Translation spa-fra
      type: translation
      args: spa-fra
    dataset:
      name: newstest2011
      type: wmt-2011-news
      args: spa-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 35.2
    - name: chr-F
      type: chrf
      value: 0.61258
3223
  - task:
      name: Translation deu-fra
      type: translation
      args: deu-fra
    dataset:
      name: newstest2012
      type: wmt-2012-news
      args: deu-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 27.6
    - name: chr-F
      type: chrf
      value: 0.56034
  - task:
      name: Translation deu-spa
      type: translation
      args: deu-spa
    dataset:
      name: newstest2012
      type: wmt-2012-news
      args: deu-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 31.6
    - name: chr-F
      type: chrf
      value: 0.57336
  - task:
      name: Translation eng-fra
      type: translation
      args: eng-fra
    dataset:
      name: newstest2012
      type: wmt-2012-news
      args: eng-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 31.9
    - name: chr-F
      type: chrf
      value: 0.59264
  - task:
      name: Translation eng-spa
      type: translation
      args: eng-spa
    dataset:
      name: newstest2012
      type: wmt-2012-news
      args: eng-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 39.1
    - name: chr-F
      type: chrf
      value: 0.62568
  - task:
      name: Translation fra-spa
      type: translation
      args: fra-spa
    dataset:
      name: newstest2012
      type: wmt-2012-news
      args: fra-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 39.5
    - name: chr-F
      type: chrf
      value: 0.62725
  - task:
      name: Translation spa-fra
      type: translation
      args: spa-fra
    dataset:
      name: newstest2012
      type: wmt-2012-news
      args: spa-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 34.2
    - name: chr-F
      type: chrf
      value: 0.61177
3313
  - task:
      name: Translation deu-fra
      type: translation
      args: deu-fra
    dataset:
      name: newstest2013
      type: wmt-2013-news
      args: deu-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 29.9
    - name: chr-F
      type: chrf
      value: 0.56475
  - task:
      name: Translation deu-spa
      type: translation
      args: deu-spa
    dataset:
      name: newstest2013
      type: wmt-2013-news
      args: deu-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 31.9
    - name: chr-F
      type: chrf
      value: 0.57187
  - task:
      name: Translation eng-fra
      type: translation
      args: eng-fra
    dataset:
      name: newstest2013
      type: wmt-2013-news
      args: eng-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 33.3
    - name: chr-F
      type: chrf
      value: 0.58938
  - task:
      name: Translation eng-spa
      type: translation
      args: eng-spa
    dataset:
      name: newstest2013
      type: wmt-2013-news
      args: eng-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 35.2
    - name: chr-F
      type: chrf
      value: 0.59817
  - task:
      name: Translation fra-spa
      type: translation
      args: fra-spa
    dataset:
      name: newstest2013
      type: wmt-2013-news
      args: fra-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 35.1
    - name: chr-F
      type: chrf
      value: 0.59482
  - task:
      name: Translation spa-fra
      type: translation
      args: spa-fra
    dataset:
      name: newstest2013
      type: wmt-2013-news
      args: spa-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 33.9
    - name: chr-F
      type: chrf
      value: 0.59825
3403
  - task:
      name: Translation eng-fra
      type: translation
      args: eng-fra
    dataset:
      name: newstest2014
      type: wmt-2014-news
      args: eng-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 40.2
    - name: chr-F
      type: chrf
      value: 0.65438
  - task:
      name: Translation eng-ron
      type: translation
      args: eng-ron
    dataset:
      name: newstest2016
      type: wmt-2016-news
      args: eng-ron
    metrics:
    - name: BLEU
      type: bleu
      value: 32.2
    - name: chr-F
      type: chrf
      value: 0.59473
  - task:
      name: Translation deu-fra
      type: translation
      args: deu-fra
    dataset:
      name: newstest2019
      type: wmt-2019-news
      args: deu-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 35.9
    - name: chr-F
      type: chrf
      value: 0.62831
  - task:
      name: Translation deu-fra
      type: translation
      args: deu-fra
    dataset:
      name: newstest2020
      type: wmt-2020-news
      args: deu-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 33.0
    - name: chr-F
      type: chrf
      value: 0.60408
  - task:
      name: Translation deu-fra
      type: translation
      args: deu-fra
    dataset:
      name: newstest2021
      type: wmt-2021-news
      args: deu-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 31.3
    - name: chr-F
      type: chrf
      value: 0.58913
---
3479
# opus-mt-tc-bible-big-deu_eng_fra_por_spa-itc

## Table of Contents
- [Model Details](#model-details)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [How to Get Started With the Model](#how-to-get-started-with-the-model)
- [Training](#training)
- [Evaluation](#evaluation)
- [Citation Information](#citation-information)
- [Acknowledgements](#acknowledgements)

3491
## Model Details

Neural machine translation model for translating from German, English, French, Portuguese and Spanish (deu+eng+fra+por+spa) to Italic languages (itc).

This model is part of the [OPUS-MT project](https://github.com/Helsinki-NLP/Opus-MT), an effort to make neural machine translation models widely available and accessible for many languages in the world. All models were originally trained with [Marian NMT](https://marian-nmt.github.io/), an efficient NMT implementation written in pure C++, and have been converted to PyTorch using the Hugging Face transformers library. Training data is taken from [OPUS](https://opus.nlpl.eu/) and the training pipelines follow the procedures of [OPUS-MT-train](https://github.com/Helsinki-NLP/Opus-MT-train).
3496
**Model Description:**
- **Developed by:** Language Technology Research Group at the University of Helsinki
- **Model Type:** Translation (transformer-big)
- **Release**: 2024-05-30
- **License:** Apache-2.0
- **Language(s):**
  - Source Language(s): deu eng fra por spa
  - Target Language(s): acf arg ast cat cbk cos crs egl ext fra frm fro frp fur gcf glg hat ita kea lad lat lij lld lmo lou mfe mol mwl nap oci osp pap pcd pms por roh ron rup scn spa srd vec wln
  - Valid Target Language Labels: >>acf<< >>aoa<< >>arg<< >>ast<< >>cat<< >>cbk<< >>cbk_Latn<< >>ccd<< >>cks<< >>cos<< >>cri<< >>crs<< >>dlm<< >>drc<< >>egl<< >>ext<< >>fab<< >>fax<< >>fra<< >>frc<< >>frm<< >>frm_Latn<< >>fro<< >>fro_Latn<< >>frp<< >>fur<< >>gcf<< >>gcf_Latn<< >>gcr<< >>glg<< >>hat<< >>idb<< >>ist<< >>ita<< >>itk<< >>kea<< >>kmv<< >>lad<< >>lad_Latn<< >>lat<< >>lat_Latn<< >>lij<< >>lld<< >>lld_Latn<< >>lmo<< >>lou<< >>lou_Latn<< >>mcm<< >>mfe<< >>mol<< >>mwl<< >>mxi<< >>mzs<< >>nap<< >>nrf<< >>oci<< >>osc<< >>osp<< >>osp_Latn<< >>pap<< >>pcd<< >>pln<< >>pms<< >>por<< >>pov<< >>pre<< >>pro<< >>rcf<< >>rgn<< >>roh<< >>ron<< >>ruo<< >>rup<< >>ruq<< >>scf<< >>scn<< >>spa<< >>spq<< >>spx<< >>srd<< >>tmg<< >>tvy<< >>vec<< >>vkp<< >>wln<< >>xfa<< >>xum<< >>xxx<<
- **Original Model**: [opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/deu+eng+fra+por+spa-itc/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30.zip)
- **Resources for more information:**
  - [OPUS-MT dashboard](https://opus.nlpl.eu/dashboard/index.php?pkg=opusmt&test=all&scoreslang=all&chart=standard&model=Tatoeba-MT-models/deu%2Beng%2Bfra%2Bpor%2Bspa-itc/opusTCv20230926max50%2Bbt%2Bjhubc_transformer-big_2024-05-30)
  - [OPUS-MT-train GitHub Repo](https://github.com/Helsinki-NLP/OPUS-MT-train)
  - [More information about MarianNMT models in the transformers library](https://huggingface.co/docs/transformers/model_doc/marian)
  - [Tatoeba Translation Challenge](https://github.com/Helsinki-NLP/Tatoeba-Challenge/)
  - [HPLT bilingual data v1 (as part of the Tatoeba Translation Challenge dataset)](https://hplt-project.org/datasets/v1)
  - [A massively parallel Bible corpus](https://aclanthology.org/L14-1215/)

This is a multilingual translation model with multiple target languages. A sentence-initial language token is required in the form `>>id<<` (id = valid target language ID), e.g. `>>acf<<`.

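For example, prefixing a batch of source sentences with the Walloon label could look like the following (`add_target_token` is a hypothetical helper for illustration, not part of the model or the transformers library):

```python
def add_target_token(sentences, lang_id):
    # Prepend the >>id<< target-language label expected by this multilingual
    # model; lang_id must be one of the valid target language labels above.
    return [f">>{lang_id}<< {s}" for s in sentences]

print(add_target_token(["This is the second sentence."], "wln"))
# → ['>>wln<< This is the second sentence.']
```
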
3516
## Uses

This model can be used for translation and text-to-text generation.

## Risks, Limitations and Biases

**CONTENT WARNING: Readers should be aware that the model is trained on various public data sets that may contain content that is disturbing, offensive, and can propagate historical and current stereotypes.**

Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).

## How to Get Started With the Model

A short example code:

3530
```python
from transformers import MarianMTModel, MarianTokenizer

src_text = [
    ">>acf<< Replace this with text in an accepted source language.",
    ">>wln<< This is the second sentence."
]

model_name = "pytorch-models/opus-mt-tc-bible-big-deu_eng_fra_por_spa-itc"
tokenizer = MarianTokenizer.from_pretrained(model_name)
model = MarianMTModel.from_pretrained(model_name)
translated = model.generate(**tokenizer(src_text, return_tensors="pt", padding=True))

for t in translated:
    print(tokenizer.decode(t, skip_special_tokens=True))
```
3546

You can also use OPUS-MT models with the transformers pipelines, for example:

```python
from transformers import pipeline
pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-bible-big-deu_eng_fra_por_spa-itc")
print(pipe(">>acf<< Replace this with text in an accepted source language."))
```

3555
## Training

- **Data**: opusTCv20230926max50+bt+jhubc ([source](https://github.com/Helsinki-NLP/Tatoeba-Challenge))
- **Pre-processing**: SentencePiece (spm32k,spm32k)
- **Model Type:** transformer-big
- **Original MarianNMT Model**: [opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/deu+eng+fra+por+spa-itc/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30.zip)
- **Training Scripts**: [GitHub Repo](https://github.com/Helsinki-NLP/OPUS-MT-train)

## Evaluation

* [Model scores at the OPUS-MT dashboard](https://opus.nlpl.eu/dashboard/index.php?pkg=opusmt&test=all&scoreslang=all&chart=standard&model=Tatoeba-MT-models/deu%2Beng%2Bfra%2Bpor%2Bspa-itc/opusTCv20230926max50%2Bbt%2Bjhubc_transformer-big_2024-05-30)
* test set translations: [opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-29.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/deu+eng+fra+por+spa-itc/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-29.test.txt)
* test set scores: [opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-29.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/deu+eng+fra+por+spa-itc/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-29.eval.txt)
* benchmark results: [benchmark_results.txt](benchmark_results.txt)
* benchmark output: [benchmark_translations.zip](benchmark_translations.zip)

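The chr-F column below reports character n-gram F-scores (chrF). As a rough illustration of what the metric computes, here is a simplified sketch; the scores in the table were produced with the standard evaluation tooling (sacreBLEU), which differs in details such as word n-grams and whitespace handling, and the function name and defaults here are illustrative:

```python
from collections import Counter

def chrf(hypothesis: str, reference: str, max_n: int = 6, beta: float = 2.0) -> float:
    """Simplified character n-gram F-score: average n-gram precision and
    recall over n = 1..max_n, combined with an F-beta weighting of recall."""
    hyp = hypothesis.replace(" ", "")
    ref = reference.replace(" ", "")
    precisions, recalls = [], []
    for n in range(1, max_n + 1):
        hyp_ngrams = Counter(hyp[i:i + n] for i in range(len(hyp) - n + 1))
        ref_ngrams = Counter(ref[i:i + n] for i in range(len(ref) - n + 1))
        overlap = sum((hyp_ngrams & ref_ngrams).values())  # clipped matches
        precisions.append(overlap / max(sum(hyp_ngrams.values()), 1))
        recalls.append(overlap / max(sum(ref_ngrams.values()), 1))
    p = sum(precisions) / max_n
    r = sum(recalls) / max_n
    if p + r == 0:
        return 0.0
    return (1 + beta ** 2) * p * r / (beta ** 2 * p + r)

print(round(chrf("el gato negro", "el gato negro"), 5))  # → 1.0
```
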
3571
| langpair | testset | chr-F | BLEU | #sent | #words |
|----------|---------|-------|-------|-------|--------|
| deu-cat | tatoeba-test-v2021-08-07 | 0.63465 | 44.3 | 723 | 5539 |
| deu-fra | tatoeba-test-v2021-08-07 | 0.68258 | 50.7 | 12418 | 102721 |
| deu-ita | tatoeba-test-v2021-08-07 | 0.68502 | 47.4 | 10094 | 75504 |
| deu-lad | tatoeba-test-v2021-08-07 | 0.38047 | 22.0 | 220 | 1130 |
| deu-lat | tatoeba-test-v2021-08-07 | 0.42567 | 16.2 | 2016 | 10538 |
| deu-por | tatoeba-test-v2021-08-07 | 0.63684 | 43.1 | 10000 | 81482 |
| deu-ron | tatoeba-test-v2021-08-07 | 0.64207 | 42.6 | 1141 | 7432 |
| deu-spa | tatoeba-test-v2021-08-07 | 0.68333 | 49.4 | 10521 | 82570 |
| eng-cat | tatoeba-test-v2021-08-07 | 0.67724 | 49.1 | 1631 | 12344 |
| eng-fra | tatoeba-test-v2021-08-07 | 0.68777 | 51.6 | 12681 | 106378 |
| eng-glg | tatoeba-test-v2021-08-07 | 0.64530 | 45.2 | 1015 | 7881 |
| eng-ita | tatoeba-test-v2021-08-07 | 0.72115 | 53.3 | 17320 | 116336 |
| eng-lad | tatoeba-test-v2021-08-07 | 0.43857 | 24.2 | 768 | 4105 |
| eng-lad_Latn | tatoeba-test-v2021-08-07 | 0.50848 | 27.6 | 672 | 3580 |
| eng-lat | tatoeba-test-v2021-08-07 | 0.45710 | 20.0 | 10298 | 76510 |
| eng-por | tatoeba-test-v2021-08-07 | 0.72159 | 53.4 | 13222 | 105265 |
| eng-ron | tatoeba-test-v2021-08-07 | 0.67835 | 47.1 | 5508 | 40367 |
| eng-spa | tatoeba-test-v2021-08-07 | 0.72875 | 55.8 | 16583 | 134710 |
| fra-cat | tatoeba-test-v2021-08-07 | 0.65547 | 44.6 | 700 | 5342 |
| fra-fra | tatoeba-test-v2021-08-07 | 0.61650 | 39.9 | 1000 | 7757 |
| fra-ita | tatoeba-test-v2021-08-07 | 0.72739 | 53.5 | 10091 | 62060 |
| fra-por | tatoeba-test-v2021-08-07 | 0.70655 | 52.0 | 10518 | 77650 |
| fra-ron | tatoeba-test-v2021-08-07 | 0.65399 | 43.7 | 1925 | 12252 |
| fra-spa | tatoeba-test-v2021-08-07 | 0.72083 | 54.8 | 10294 | 78406 |
| por-cat | tatoeba-test-v2021-08-07 | 0.71178 | 52.0 | 747 | 6149 |
| por-fra | tatoeba-test-v2021-08-07 | 0.75691 | 60.4 | 10518 | 80459 |
| por-glg | tatoeba-test-v2021-08-07 | 0.74818 | 57.6 | 433 | 3016 |
| por-ita | tatoeba-test-v2021-08-07 | 0.76899 | 58.7 | 3066 | 24897 |
| por-por | tatoeba-test-v2021-08-07 | 0.71775 | 51.0 | 2500 | 19220 |
| por-ron | tatoeba-test-v2021-08-07 | 0.69517 | 47.8 | 681 | 4521 |
| por-spa | tatoeba-test-v2021-08-07 | 0.79442 | 64.9 | 10947 | 87335 |
| spa-cat | tatoeba-test-v2021-08-07 | 0.81845 | 66.3 | 1534 | 12343 |
| spa-fra | tatoeba-test-v2021-08-07 | 0.73277 | 57.4 | 10294 | 83501 |
| spa-glg | tatoeba-test-v2021-08-07 | 0.76118 | 61.5 | 2121 | 16581 |
| spa-ita | tatoeba-test-v2021-08-07 | 0.76742 | 59.5 | 5000 | 34515 |
| spa-lad | tatoeba-test-v2021-08-07 | 0.43064 | 23.4 | 276 | 1464 |
| spa-lad_Latn | tatoeba-test-v2021-08-07 | 0.50795 | 27.1 | 239 | 1254 |
| spa-lat | tatoeba-test-v2021-08-07 | 0.44044 | 18.8 | 3129 | 27685 |
| spa-por | tatoeba-test-v2021-08-07 | 0.76951 | 60.7 | 10947 | 87610 |
| spa-ron | tatoeba-test-v2021-08-07 | 0.67782 | 45.9 | 1959 | 12503 |
| spa-spa | tatoeba-test-v2021-08-07 | 0.67346 | 49.6 | 2500 | 21469 |
| deu-ast | flores101-devtest | 0.53230 | 21.5 | 1012 | 24572 |
| deu-cat | flores101-devtest | 0.58466 | 31.6 | 1012 | 27304 |
| deu-fra | flores101-devtest | 0.62370 | 36.5 | 1012 | 28343 |
| deu-glg | flores101-devtest | 0.55693 | 28.0 | 1012 | 26582 |
| deu-oci | flores101-devtest | 0.52253 | 22.3 | 1012 | 27305 |
| deu-por | flores101-devtest | 0.60688 | 34.8 | 1012 | 26519 |
| deu-ron | flores101-devtest | 0.57333 | 30.3 | 1012 | 26799 |
| eng-cat | flores101-devtest | 0.66607 | 42.5 | 1012 | 27304 |
| eng-fra | flores101-devtest | 0.70492 | 48.8 | 1012 | 28343 |
| eng-por | flores101-devtest | 0.71112 | 49.3 | 1012 | 26519 |
| eng-ron | flores101-devtest | 0.64856 | 40.3 | 1012 | 26799 |
| fra-oci | flores101-devtest | 0.58559 | 29.2 | 1012 | 27305 |
| fra-ron | flores101-devtest | 0.58922 | 32.1 | 1012 | 26799 |
| por-kea | flores101-devtest | 0.40779 | 12.8 | 1012 | 25540 |
| por-oci | flores101-devtest | 0.57016 | 27.5 | 1012 | 27305 |
| spa-ast | flores101-devtest | 0.49666 | 16.3 | 1012 | 24572 |
| spa-cat | flores101-devtest | 0.54015 | 23.2 | 1012 | 27304 |
| spa-glg | flores101-devtest | 0.52923 | 22.1 | 1012 | 26582 |
| spa-oci | flores101-devtest | 0.49285 | 17.2 | 1012 | 27305 |
| spa-por | flores101-devtest | 0.55944 | 25.7 | 1012 | 26519 |
| spa-ron | flores101-devtest | 0.53282 | 23.3 | 1012 | 26799 |
| deu-ast | flores200-devtest | 0.53782 | 22.1 | 1012 | 24572 |
| deu-cat | flores200-devtest | 0.58846 | 32.2 | 1012 | 27304 |
| deu-fra | flores200-devtest | 0.62803 | 37.2 | 1012 | 28343 |
| deu-fur | flores200-devtest | 0.46372 | 18.7 | 1012 | 29171 |
| deu-glg | flores200-devtest | 0.56229 | 28.7 | 1012 | 26582 |
| deu-hat | flores200-devtest | 0.46752 | 15.7 | 1012 | 25833 |
| deu-ita | flores200-devtest | 0.55344 | 25.8 | 1012 | 27306 |
| deu-lij | flores200-devtest | 0.40732 | 11.8 | 1012 | 28625 |
| deu-oci | flores200-devtest | 0.52749 | 23.1 | 1012 | 27305 |
| deu-pap | flores200-devtest | 0.49721 | 22.4 | 1012 | 28016 |
| deu-por | flores200-devtest | 0.60818 | 34.7 | 1012 | 26519 |
| deu-ron | flores200-devtest | 0.57873 | 31.1 | 1012 | 26799 |
| deu-spa | flores200-devtest | 0.52442 | 24.4 | 1012 | 29199 |
| deu-srd | flores200-devtest | 0.45629 | 16.1 | 1012 | 28322 |
| eng-ast | flores200-devtest | 0.59255 | 27.8 | 1012 | 24572 |
| eng-cat | flores200-devtest | 0.66809 | 42.8 | 1012 | 27304 |
| eng-fra | flores200-devtest | 0.71001 | 49.5 | 1012 | 28343 |
| eng-fur | flores200-devtest | 0.49164 | 23.0 | 1012 | 29171 |
| eng-glg | flores200-devtest | 0.62349 | 36.1 | 1012 | 26582 |
| eng-hat | flores200-devtest | 0.51720 | 21.3 | 1012 | 25833 |
| eng-ita | flores200-devtest | 0.58898 | 29.7 | 1012 | 27306 |
| eng-lij | flores200-devtest | 0.43644 | 14.8 | 1012 | 28625 |
| eng-oci | flores200-devtest | 0.63245 | 35.2 | 1012 | 27305 |
| eng-pap | flores200-devtest | 0.56775 | 30.4 | 1012 | 28016 |
| eng-por | flores200-devtest | 0.71438 | 50.0 | 1012 | 26519 |
| eng-ron | flores200-devtest | 0.65373 | 41.2 | 1012 | 26799 |
| eng-spa | flores200-devtest | 0.55784 | 27.6 | 1012 | 29199 |
| eng-srd | flores200-devtest | 0.49876 | 21.0 | 1012 | 28322 |
| fra-ast | flores200-devtest | 0.53904 | 22.0 | 1012 | 24572 |
| fra-cat | flores200-devtest | 0.60549 | 34.5 | 1012 | 27304 |
| fra-fur | flores200-devtest | 0.49119 | 21.4 | 1012 | 29171 |
| fra-glg | flores200-devtest | 0.57998 | 31.3 | 1012 | 26582 |
| fra-hat | flores200-devtest | 0.52018 | 20.7 | 1012 | 25833 |
| fra-ita | flores200-devtest | 0.56470 | 27.0 | 1012 | 27306 |
| fra-lij | flores200-devtest | 0.43180 | 13.6 | 1012 | 28625 |
| fra-oci | flores200-devtest | 0.58268 | 29.2 | 1012 | 27305 |
| fra-pap | flores200-devtest | 0.51029 | 23.6 | 1012 | 28016 |
| fra-por | flores200-devtest | 0.62540 | 37.5 | 1012 | 26519 |
| fra-ron | flores200-devtest | 0.59255 | 32.7 | 1012 | 26799 |
| fra-spa | flores200-devtest | 0.53001 | 24.4 | 1012 | 29199 |
| fra-srd | flores200-devtest | 0.47645 | 17.9 | 1012 | 28322 |
| por-ast | flores200-devtest | 0.55369 | 23.9 | 1012 | 24572 |
| por-cat | flores200-devtest | 0.61981 | 36.4 | 1012 | 27304 |
| por-fra | flores200-devtest | 0.64654 | 40.4 | 1012 | 28343 |
| por-fur | flores200-devtest | 0.50078 | 22.1 | 1012 | 29171 |
| por-glg | flores200-devtest | 0.58336 | 31.1 | 1012 | 26582 |
| por-hat | flores200-devtest | 0.48834 | 18.0 | 1012 | 25833 |
| por-ita | flores200-devtest | 0.56077 | 26.7 | 1012 | 27306 |
| por-kea | flores200-devtest | 0.42451 | 13.6 | 1012 | 25540 |
| por-lij | flores200-devtest | 0.43715 | 13.4 | 1012 | 28625 |
| por-oci | flores200-devtest | 0.57143 | 28.1 | 1012 | 27305 |
| por-pap | flores200-devtest | 0.52192 | 25.0 | 1012 | 28016 |
| por-ron | flores200-devtest | 0.59962 | 34.2 | 1012 | 26799 |
| por-spa | flores200-devtest | 0.53772 | 25.6 | 1012 | 29199 |
| por-srd | flores200-devtest | 0.48882 | 18.8 | 1012 | 28322 |
| spa-ast | flores200-devtest | 0.49512 | 16.3 | 1012 | 24572 |
| spa-cat | flores200-devtest | 0.53968 | 23.1 | 1012 | 27304 |
| spa-fra | flores200-devtest | 0.57461 | 27.9 | 1012 | 28343 |
| spa-fur | flores200-devtest | 0.45785 | 16.1 | 1012 | 29171 |
| spa-glg | flores200-devtest | 0.52933 | 22.2 | 1012 | 26582 |
| spa-hat | flores200-devtest | 0.44627 | 13.0 | 1012 | 25833 |
| spa-ita | flores200-devtest | 0.53063 | 22.4 | 1012 | 27306 |
| spa-oci | flores200-devtest | 0.49293 | 17.4 | 1012 | 27305 |
| spa-pap | flores200-devtest | 0.46595 | 17.7 | 1012 | 28016 |
| spa-por | flores200-devtest | 0.56138 | 25.9 | 1012 | 26519 |
3700
+ | spa-ron | flores200-devtest | 0.53609 | 23.8 | 1012 | 26799 |
3701
+ | spa-srd | flores200-devtest | 0.44898 | 13.3 | 1012 | 28322 |
3702
+ | deu-fra | generaltest2022 | 0.60634 | 37.4 | 1984 | 38276 |
3703
+ | deu-fra | multi30k_test_2016_flickr | 0.62595 | 38.5 | 1000 | 13505 |
3704
+ | eng-fra | multi30k_test_2016_flickr | 0.71630 | 51.4 | 1000 | 13505 |
3705
+ | deu-fra | multi30k_test_2017_flickr | 0.62733 | 37.3 | 1000 | 12118 |
3706
+ | eng-fra | multi30k_test_2017_flickr | 0.71850 | 50.8 | 1000 | 12118 |
3707
+ | deu-fra | multi30k_test_2017_mscoco | 0.59089 | 33.8 | 461 | 5484 |
3708
+ | eng-fra | multi30k_test_2017_mscoco | 0.73129 | 54.1 | 461 | 5484 |
3709
+ | deu-fra | multi30k_test_2018_flickr | 0.57155 | 30.9 | 1071 | 15867 |
3710
+ | eng-fra | multi30k_test_2018_flickr | 0.65461 | 41.9 | 1071 | 15867 |
3711
+ | eng-fra | newsdiscusstest2015 | 0.63660 | 38.5 | 1500 | 27975 |
3712
+ | deu-fra | newssyscomb2009 | 0.56035 | 27.6 | 502 | 12331 |
3713
+ | deu-ita | newssyscomb2009 | 0.55722 | 25.1 | 502 | 11551 |
3714
+ | deu-spa | newssyscomb2009 | 0.55595 | 28.5 | 502 | 12503 |
3715
+ | eng-fra | newssyscomb2009 | 0.58465 | 29.5 | 502 | 12331 |
3716
+ | eng-ita | newssyscomb2009 | 0.60792 | 31.3 | 502 | 11551 |
3717
+ | eng-spa | newssyscomb2009 | 0.58219 | 31.0 | 502 | 12503 |
3718
+ | fra-ita | newssyscomb2009 | 0.61352 | 31.9 | 502 | 11551 |
3719
+ | fra-spa | newssyscomb2009 | 0.60430 | 34.3 | 502 | 12503 |
3720
+ | spa-fra | newssyscomb2009 | 0.61491 | 34.6 | 502 | 12331 |
3721
+ | spa-ita | newssyscomb2009 | 0.61861 | 33.7 | 502 | 11551 |
3722
+ | deu-fra | newstest2008 | 0.54926 | 26.3 | 2051 | 52685 |
3723
+ | deu-spa | newstest2008 | 0.53902 | 25.5 | 2051 | 52586 |
3724
+ | eng-fra | newstest2008 | 0.55358 | 26.8 | 2051 | 52685 |
3725
+ | eng-spa | newstest2008 | 0.56491 | 29.5 | 2051 | 52586 |
3726
+ | fra-spa | newstest2008 | 0.58764 | 33.0 | 2051 | 52586 |
3727
+ | spa-fra | newstest2008 | 0.58848 | 32.4 | 2051 | 52685 |
3728
+ | deu-fra | newstest2009 | 0.53870 | 25.4 | 2525 | 69263 |
3729
+ | deu-ita | newstest2009 | 0.54509 | 24.4 | 2525 | 63466 |
3730
+ | deu-spa | newstest2009 | 0.53769 | 25.7 | 2525 | 68111 |
3731
+ | eng-fra | newstest2009 | 0.57566 | 29.3 | 2525 | 69263 |
3732
+ | eng-ita | newstest2009 | 0.60372 | 31.4 | 2525 | 63466 |
3733
+ | eng-spa | newstest2009 | 0.57913 | 30.0 | 2525 | 68111 |
3734
+ | fra-ita | newstest2009 | 0.59749 | 30.5 | 2525 | 63466 |
3735
+ | fra-spa | newstest2009 | 0.58921 | 32.1 | 2525 | 68111 |
3736
+ | spa-fra | newstest2009 | 0.59195 | 32.3 | 2525 | 69263 |
3737
+ | spa-ita | newstest2009 | 0.61007 | 33.0 | 2525 | 63466 |
3738
+ | deu-fra | newstest2010 | 0.57888 | 29.5 | 2489 | 66022 |
3739
+ | deu-spa | newstest2010 | 0.59408 | 32.7 | 2489 | 65480 |
3740
+ | eng-fra | newstest2010 | 0.59588 | 32.4 | 2489 | 66022 |
3741
+ | eng-spa | newstest2010 | 0.61978 | 36.6 | 2489 | 65480 |
3742
+ | fra-spa | newstest2010 | 0.62513 | 37.7 | 2489 | 65480 |
3743
+ | spa-fra | newstest2010 | 0.62193 | 36.1 | 2489 | 66022 |
3744
+ | deu-fra | newstest2011 | 0.55704 | 27.5 | 3003 | 80626 |
3745
+ | deu-spa | newstest2011 | 0.56696 | 30.4 | 3003 | 79476 |
3746
+ | eng-fra | newstest2011 | 0.61071 | 34.3 | 3003 | 80626 |
3747
+ | eng-spa | newstest2011 | 0.62126 | 38.7 | 3003 | 79476 |
3748
+ | fra-spa | newstest2011 | 0.63139 | 40.0 | 3003 | 79476 |
3749
+ | spa-fra | newstest2011 | 0.61258 | 35.2 | 3003 | 80626 |
3750
+ | deu-fra | newstest2012 | 0.56034 | 27.6 | 3003 | 78011 |
3751
+ | deu-spa | newstest2012 | 0.57336 | 31.6 | 3003 | 79006 |
3752
+ | eng-fra | newstest2012 | 0.59264 | 31.9 | 3003 | 78011 |
3753
+ | eng-spa | newstest2012 | 0.62568 | 39.1 | 3003 | 79006 |
3754
+ | fra-spa | newstest2012 | 0.62725 | 39.5 | 3003 | 79006 |
3755
+ | spa-fra | newstest2012 | 0.61177 | 34.2 | 3003 | 78011 |
3756
+ | deu-fra | newstest2013 | 0.56475 | 29.9 | 3000 | 70037 |
3757
+ | deu-spa | newstest2013 | 0.57187 | 31.9 | 3000 | 70528 |
3758
+ | eng-fra | newstest2013 | 0.58938 | 33.3 | 3000 | 70037 |
3759
+ | eng-spa | newstest2013 | 0.59817 | 35.2 | 3000 | 70528 |
3760
+ | fra-spa | newstest2013 | 0.59482 | 35.1 | 3000 | 70528 |
3761
+ | spa-fra | newstest2013 | 0.59825 | 33.9 | 3000 | 70037 |
3762
+ | eng-fra | newstest2014 | 0.65438 | 40.2 | 3003 | 77306 |
3763
+ | eng-ron | newstest2016 | 0.59473 | 32.2 | 1999 | 48945 |
3764
+ | deu-fra | newstest2019 | 0.62831 | 35.9 | 1701 | 42509 |
3765
+ | deu-fra | newstest2020 | 0.60408 | 33.0 | 1619 | 36890 |
3766
+ | deu-fra | newstest2021 | 0.58913 | 31.3 | 1000 | 23757 |
3767
+ | deu-cat | ntrex128 | 0.55033 | 28.2 | 1997 | 53438 |
3768
+ | deu-fra | ntrex128 | 0.55854 | 28.5 | 1997 | 53481 |
3769
+ | deu-glg | ntrex128 | 0.55034 | 27.8 | 1997 | 50432 |
3770
+ | deu-ita | ntrex128 | 0.55733 | 26.6 | 1997 | 50759 |
3771
+ | deu-por | ntrex128 | 0.54208 | 26.0 | 1997 | 51631 |
3772
+ | deu-ron | ntrex128 | 0.52839 | 26.6 | 1997 | 53498 |
3773
+ | deu-spa | ntrex128 | 0.56966 | 30.8 | 1997 | 54107 |
3774
+ | eng-cat | ntrex128 | 0.61431 | 36.3 | 1997 | 53438 |
3775
+ | eng-fra | ntrex128 | 0.61695 | 35.5 | 1997 | 53481 |
3776
+ | eng-glg | ntrex128 | 0.62390 | 37.2 | 1997 | 50432 |
3777
+ | eng-ita | ntrex128 | 0.62209 | 36.1 | 1997 | 50759 |
3778
+ | eng-por | ntrex128 | 0.59859 | 33.5 | 1997 | 51631 |
3779
+ | eng-ron | ntrex128 | 0.58128 | 33.4 | 1997 | 53498 |
3780
+ | eng-spa | ntrex128 | 0.64099 | 40.3 | 1997 | 54107 |
3781
+ | fra-cat | ntrex128 | 0.55093 | 28.1 | 1997 | 53438 |
3782
+ | fra-glg | ntrex128 | 0.55325 | 28.0 | 1997 | 50432 |
3783
+ | fra-ita | ntrex128 | 0.56188 | 27.4 | 1997 | 50759 |
3784
+ | fra-por | ntrex128 | 0.54001 | 25.6 | 1997 | 51631 |
3785
+ | fra-ron | ntrex128 | 0.51853 | 24.8 | 1997 | 53498 |
3786
+ | fra-spa | ntrex128 | 0.57116 | 31.0 | 1997 | 54107 |
3787
+ | por-cat | ntrex128 | 0.57962 | 31.6 | 1997 | 53438 |
3788
+ | por-fra | ntrex128 | 0.56910 | 28.9 | 1997 | 53481 |
3789
+ | por-glg | ntrex128 | 0.57389 | 30.3 | 1997 | 50432 |
3790
+ | por-ita | ntrex128 | 0.58788 | 30.6 | 1997 | 50759 |
3791
+ | por-ron | ntrex128 | 0.54276 | 28.0 | 1997 | 53498 |
3792
+ | por-spa | ntrex128 | 0.59565 | 34.2 | 1997 | 54107 |
3793
+ | spa-cat | ntrex128 | 0.60605 | 34.0 | 1997 | 53438 |
3794
+ | spa-fra | ntrex128 | 0.57501 | 29.6 | 1997 | 53481 |
3795
+ | spa-glg | ntrex128 | 0.61300 | 34.4 | 1997 | 50432 |
3796
+ | spa-ita | ntrex128 | 0.57868 | 28.9 | 1997 | 50759 |
3797
+ | spa-por | ntrex128 | 0.56730 | 29.1 | 1997 | 51631 |
3798
+ | spa-ron | ntrex128 | 0.54222 | 27.9 | 1997 | 53498 |
3799
+ | eng-fra | tico19-test | 0.62989 | 40.1 | 2100 | 64661 |
3800
+ | eng-por | tico19-test | 0.72708 | 50.0 | 2100 | 62729 |
3801
+ | eng-spa | tico19-test | 0.73154 | 52.0 | 2100 | 66563 |
3802
+ | fra-por | tico19-test | 0.58383 | 34.1 | 2100 | 62729 |
3803
+ | fra-spa | tico19-test | 0.59581 | 37.0 | 2100 | 66563 |
3804
+ | por-fra | tico19-test | 0.59798 | 34.4 | 2100 | 64661 |
3805
+ | por-spa | tico19-test | 0.68332 | 45.4 | 2100 | 66563 |
3806
+ | spa-fra | tico19-test | 0.60469 | 35.5 | 2100 | 64661 |
3807
+ | spa-por | tico19-test | 0.67898 | 42.8 | 2100 | 62729 |
3808
+
3809
+ ## Citation Information
3810
+
+ * Publications: [Democratizing neural machine translation with OPUS-MT](https://doi.org/10.1007/s10579-023-09704-w) and [OPUS-MT – Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/) and [The Tatoeba Translation Challenge – Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (Please cite if you use this model.)
+
+ ```bibtex
+ @article{tiedemann2023democratizing,
+ title={Democratizing neural machine translation with {OPUS-MT}},
+ author={Tiedemann, J{\"o}rg and Aulamo, Mikko and Bakshandaeva, Daria and Boggia, Michele and Gr{\"o}nroos, Stig-Arne and Nieminen, Tommi and Raganato, Alessandro and Scherrer, Yves and Vazquez, Raul and Virpioja, Sami},
+ journal={Language Resources and Evaluation},
+ number={58},
+ pages={713--755},
+ year={2023},
+ publisher={Springer Nature},
+ issn={1574-0218},
+ doi={10.1007/s10579-023-09704-w}
+ }
+
+ @inproceedings{tiedemann-thottingal-2020-opus,
+ title = "{OPUS}-{MT} {--} Building open translation services for the World",
+ author = {Tiedemann, J{\"o}rg and Thottingal, Santhosh},
+ booktitle = "Proceedings of the 22nd Annual Conference of the European Association for Machine Translation",
+ month = nov,
+ year = "2020",
+ address = "Lisboa, Portugal",
+ publisher = "European Association for Machine Translation",
+ url = "https://aclanthology.org/2020.eamt-1.61",
+ pages = "479--480",
+ }
+
+ @inproceedings{tiedemann-2020-tatoeba,
+ title = "The Tatoeba Translation Challenge {--} Realistic Data Sets for Low Resource and Multilingual {MT}",
+ author = {Tiedemann, J{\"o}rg},
+ booktitle = "Proceedings of the Fifth Conference on Machine Translation",
+ month = nov,
+ year = "2020",
+ address = "Online",
+ publisher = "Association for Computational Linguistics",
+ url = "https://aclanthology.org/2020.wmt-1.139",
+ pages = "1174--1182",
+ }
+ ```
+
+ ## Acknowledgements
+
+ The work is supported by the [HPLT project](https://hplt-project.org/), funded by the European Union’s Horizon Europe research and innovation programme under grant agreement No 101070350. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland, and the [EuroHPC supercomputer LUMI](https://www.lumi-supercomputer.eu/).
+
+ ## Model conversion info
+
+ * transformers version: 4.45.1
+ * OPUS-MT git hash: 0882077
+ * port time: Tue Oct 8 10:16:22 EEST 2024
+ * port machine: LM0-400-22516.local
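As a quick sketch of using the converted checkpoint with the `transformers` library: the repository id below is inferred from `_name_or_path` in `config.json`, and the `>>xxx<<` target-language prefix follows the usual convention for multi-target OPUS-MT models (treat both as assumptions and check the model hub page).

```python
# Hypothetical repository id, derived from _name_or_path in config.json.
MODEL_ID = "Helsinki-NLP/opus-mt-tc-bible-big-deu_eng_fra_por_spa-itc"

def with_target_token(texts, target_lang):
    """Prefix each input with the >>lang<< token that multi-target
    OPUS-MT models use to select the output language."""
    return [f">>{target_lang}<< {t}" for t in texts]

def translate(texts, target_lang="spa"):
    # transformers is imported lazily so the helper above stays usable
    # without the (large) dependency installed.
    from transformers import MarianMTModel, MarianTokenizer
    tokenizer = MarianTokenizer.from_pretrained(MODEL_ID)
    model = MarianMTModel.from_pretrained(MODEL_ID)
    batch = tokenizer(with_target_token(texts, target_lang),
                      return_tensors="pt", padding=True)
    # Decoding defaults (num_beams=4, max_length=512) come from
    # generation_config.json in this commit.
    generated = model.generate(**batch)
    return [tokenizer.decode(t, skip_special_tokens=True) for t in generated]
```

For example, `translate(["How are you?"], target_lang="por")` would request a Portuguese output from the multilingual Romance-target decoder.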
benchmark_results.txt ADDED
@@ -0,0 +1,323 @@
+ multi-multi	tatoeba-test-v2020-07-28-v2023-09-26	0.67768	49.7	10000	81875
+ deu-ast	flores101-devtest	0.53230	21.5	1012	24572
+ deu-cat	flores101-devtest	0.58466	31.6	1012	27304
+ deu-fra	flores101-devtest	0.62370	36.5	1012	28343
+ deu-glg	flores101-devtest	0.55693	28.0	1012	26582
+ deu-oci	flores101-devtest	0.52253	22.3	1012	27305
+ deu-por	flores101-devtest	0.60688	34.8	1012	26519
+ deu-ron	flores101-devtest	0.57333	30.3	1012	26799
+ eng-cat	flores101-devtest	0.66607	42.5	1012	27304
+ eng-fra	flores101-devtest	0.70492	48.8	1012	28343
+ eng-kea	flores101-devtest	0.34867	10.7	1012	25540
+ eng-por	flores101-devtest	0.71112	49.3	1012	26519
+ eng-ron	flores101-devtest	0.64856	40.3	1012	26799
+ fra-oci	flores101-devtest	0.58559	29.2	1012	27305
+ fra-ron	flores101-devtest	0.58922	32.1	1012	26799
+ por-kea	flores101-devtest	0.40779	12.8	1012	25540
+ por-oci	flores101-devtest	0.57016	27.5	1012	27305
+ spa-ast	flores101-devtest	0.49666	16.3	1012	24572
+ spa-cat	flores101-devtest	0.54015	23.2	1012	27304
+ spa-glg	flores101-devtest	0.52923	22.1	1012	26582
+ spa-kea	flores101-devtest	0.36479	8.7	1012	25540
+ spa-oci	flores101-devtest	0.49285	17.2	1012	27305
+ spa-por	flores101-devtest	0.55944	25.7	1012	26519
+ spa-ron	flores101-devtest	0.53282	23.3	1012	26799
+ deu-ast	flores200-devtest	0.53782	22.1	1012	24572
+ deu-cat	flores200-devtest	0.58846	32.2	1012	27304
+ deu-fra	flores200-devtest	0.62803	37.2	1012	28343
+ deu-fur	flores200-devtest	0.46372	18.7	1012	29171
+ deu-glg	flores200-devtest	0.56229	28.7	1012	26582
+ deu-hat	flores200-devtest	0.46752	15.7	1012	25833
+ deu-ita	flores200-devtest	0.55344	25.8	1012	27306
+ deu-kea	flores200-devtest	0.34337	9.1	1012	25540
+ deu-lij	flores200-devtest	0.40732	11.8	1012	28625
+ deu-lmo	flores200-devtest	0.27882	4.0	1012	29441
+ deu-oci	flores200-devtest	0.52749	23.1	1012	27305
+ deu-pap	flores200-devtest	0.49721	22.4	1012	28016
+ deu-por	flores200-devtest	0.60818	34.7	1012	26519
+ deu-ron	flores200-devtest	0.57873	31.1	1012	26799
+ deu-scn	flores200-devtest	0.37299	6.7	1012	25160
+ deu-spa	flores200-devtest	0.52442	24.4	1012	29199
+ deu-srd	flores200-devtest	0.45629	16.1	1012	28322
+ deu-vec	flores200-devtest	0.34140	4.3	1012	26257
+ eng-ast	flores200-devtest	0.59255	27.8	1012	24572
+ eng-cat	flores200-devtest	0.66809	42.8	1012	27304
+ eng-fra	flores200-devtest	0.71001	49.5	1012	28343
+ eng-fur	flores200-devtest	0.49164	23.0	1012	29171
+ eng-glg	flores200-devtest	0.62349	36.1	1012	26582
+ eng-hat	flores200-devtest	0.51720	21.3	1012	25833
+ eng-ita	flores200-devtest	0.58898	29.7	1012	27306
+ eng-kea	flores200-devtest	0.34963	11.0	1012	25540
+ eng-lij	flores200-devtest	0.43644	14.8	1012	28625
+ eng-lmo	flores200-devtest	0.28466	4.5	1012	29441
+ eng-oci	flores200-devtest	0.63245	35.2	1012	27305
+ eng-pap	flores200-devtest	0.56775	30.4	1012	28016
+ eng-por	flores200-devtest	0.71438	50.0	1012	26519
+ eng-ron	flores200-devtest	0.65373	41.2	1012	26799
+ eng-scn	flores200-devtest	0.35944	4.9	1012	25160
+ eng-spa	flores200-devtest	0.55784	27.6	1012	29199
+ eng-srd	flores200-devtest	0.49876	21.0	1012	28322
+ eng-vec	flores200-devtest	0.37256	6.0	1012	26257
+ fra-ast	flores200-devtest	0.53904	22.0	1012	24572
+ fra-cat	flores200-devtest	0.60549	34.5	1012	27304
+ fra-fur	flores200-devtest	0.49119	21.4	1012	29171
+ fra-glg	flores200-devtest	0.57998	31.3	1012	26582
+ fra-hat	flores200-devtest	0.52018	20.7	1012	25833
+ fra-ita	flores200-devtest	0.56470	27.0	1012	27306
+ fra-kea	flores200-devtest	0.38741	11.2	1012	25540
+ fra-lij	flores200-devtest	0.43180	13.6	1012	28625
+ fra-lmo	flores200-devtest	0.30675	4.6	1012	29441
+ fra-oci	flores200-devtest	0.58268	29.2	1012	27305
+ fra-pap	flores200-devtest	0.51029	23.6	1012	28016
+ fra-por	flores200-devtest	0.62540	37.5	1012	26519
+ fra-ron	flores200-devtest	0.59255	32.7	1012	26799
+ fra-scn	flores200-devtest	0.34251	3.9	1012	25160
+ fra-spa	flores200-devtest	0.53001	24.4	1012	29199
+ fra-srd	flores200-devtest	0.47645	17.9	1012	28322
+ fra-vec	flores200-devtest	0.38718	5.7	1012	26257
+ por-ast	flores200-devtest	0.55369	23.9	1012	24572
+ por-cat	flores200-devtest	0.61981	36.4	1012	27304
+ por-fra	flores200-devtest	0.64654	40.4	1012	28343
+ por-fur	flores200-devtest	0.50078	22.1	1012	29171
+ por-glg	flores200-devtest	0.58336	31.1	1012	26582
+ por-hat	flores200-devtest	0.48834	18.0	1012	25833
+ por-ita	flores200-devtest	0.56077	26.7	1012	27306
+ por-kea	flores200-devtest	0.42451	13.6	1012	25540
+ por-lij	flores200-devtest	0.43715	13.4	1012	28625
+ por-lmo	flores200-devtest	0.30787	4.6	1012	29441
+ por-oci	flores200-devtest	0.57143	28.1	1012	27305
+ por-pap	flores200-devtest	0.52192	25.0	1012	28016
+ por-ron	flores200-devtest	0.59962	34.2	1012	26799
+ por-scn	flores200-devtest	0.36678	5.4	1012	25160
+ por-spa	flores200-devtest	0.53772	25.6	1012	29199
+ por-srd	flores200-devtest	0.48882	18.8	1012	28322
+ por-vec	flores200-devtest	0.37954	5.8	1012	26257
+ spa-ast	flores200-devtest	0.49512	16.3	1012	24572
+ spa-cat	flores200-devtest	0.53968	23.1	1012	27304
+ spa-fra	flores200-devtest	0.57461	27.9	1012	28343
+ spa-fur	flores200-devtest	0.45785	16.1	1012	29171
+ spa-glg	flores200-devtest	0.52933	22.2	1012	26582
+ spa-hat	flores200-devtest	0.44627	13.0	1012	25833
+ spa-ita	flores200-devtest	0.53063	22.4	1012	27306
+ spa-kea	flores200-devtest	0.37488	9.2	1012	25540
+ spa-lij	flores200-devtest	0.39784	10.2	1012	28625
+ spa-lmo	flores200-devtest	0.28852	3.5	1012	29441
+ spa-oci	flores200-devtest	0.49293	17.4	1012	27305
+ spa-pap	flores200-devtest	0.46595	17.7	1012	28016
+ spa-por	flores200-devtest	0.56138	25.9	1012	26519
+ spa-ron	flores200-devtest	0.53609	23.8	1012	26799
+ spa-scn	flores200-devtest	0.36589	5.3	1012	25160
+ spa-srd	flores200-devtest	0.44898	13.3	1012	28322
+ spa-vec	flores200-devtest	0.35740	4.5	1012	26257
+ deu-fra	generaltest2022	0.60634	37.4	1984	38276
+ deu-fra	multi30k_test_2016_flickr	0.62595	38.5	1000	13505
+ eng-fra	multi30k_test_2016_flickr	0.71630	51.4	1000	13505
+ deu-fra	multi30k_test_2017_flickr	0.62733	37.3	1000	12118
+ eng-fra	multi30k_test_2017_flickr	0.71850	50.8	1000	12118
+ deu-fra	multi30k_test_2017_mscoco	0.59089	33.8	461	5484
+ eng-fra	multi30k_test_2017_mscoco	0.73129	54.1	461	5484
+ deu-fra	multi30k_test_2018_flickr	0.57155	30.9	1071	15867
+ eng-fra	multi30k_test_2018_flickr	0.65461	41.9	1071	15867
+ eng-fra	newsdiscusstest2015	0.63660	38.5	1500	27975
+ deu-fra	newssyscomb2009	0.56035	27.6	502	12331
+ deu-ita	newssyscomb2009	0.55722	25.1	502	11551
+ deu-spa	newssyscomb2009	0.55595	28.5	502	12503
+ eng-fra	newssyscomb2009	0.58465	29.5	502	12331
+ eng-ita	newssyscomb2009	0.60792	31.3	502	11551
+ eng-spa	newssyscomb2009	0.58219	31.0	502	12503
+ fra-ita	newssyscomb2009	0.61352	31.9	502	11551
+ fra-spa	newssyscomb2009	0.60430	34.3	502	12503
+ spa-fra	newssyscomb2009	0.61491	34.6	502	12331
+ spa-ita	newssyscomb2009	0.61861	33.7	502	11551
+ deu-fra	newstest2008	0.54926	26.3	2051	52685
+ deu-spa	newstest2008	0.53902	25.5	2051	52586
+ eng-fra	newstest2008	0.55358	26.8	2051	52685
+ eng-spa	newstest2008	0.56491	29.5	2051	52586
+ fra-spa	newstest2008	0.58764	33.0	2051	52586
+ spa-fra	newstest2008	0.58848	32.4	2051	52685
+ deu-fra	newstest2009	0.53870	25.4	2525	69263
+ deu-ita	newstest2009	0.54509	24.4	2525	63466
+ deu-spa	newstest2009	0.53769	25.7	2525	68111
+ eng-fra	newstest2009	0.57566	29.3	2525	69263
+ eng-ita	newstest2009	0.60372	31.4	2525	63466
+ eng-spa	newstest2009	0.57913	30.0	2525	68111
+ fra-ita	newstest2009	0.59749	30.5	2525	63466
+ fra-spa	newstest2009	0.58921	32.1	2525	68111
+ spa-fra	newstest2009	0.59195	32.3	2525	69263
+ spa-ita	newstest2009	0.61007	33.0	2525	63466
+ deu-fra	newstest2010	0.57888	29.5	2489	66022
+ deu-spa	newstest2010	0.59408	32.7	2489	65480
+ eng-fra	newstest2010	0.59588	32.4	2489	66022
+ eng-spa	newstest2010	0.61978	36.6	2489	65480
+ fra-spa	newstest2010	0.62513	37.7	2489	65480
+ spa-fra	newstest2010	0.62193	36.1	2489	66022
+ deu-fra	newstest2011	0.55704	27.5	3003	80626
+ deu-spa	newstest2011	0.56696	30.4	3003	79476
+ eng-fra	newstest2011	0.61071	34.3	3003	80626
+ eng-spa	newstest2011	0.62126	38.7	3003	79476
+ fra-spa	newstest2011	0.63139	40.0	3003	79476
+ spa-fra	newstest2011	0.61258	35.2	3003	80626
+ deu-fra	newstest2012	0.56034	27.6	3003	78011
+ deu-spa	newstest2012	0.57336	31.6	3003	79006
+ eng-fra	newstest2012	0.59264	31.9	3003	78011
+ eng-spa	newstest2012	0.62568	39.1	3003	79006
+ fra-spa	newstest2012	0.62725	39.5	3003	79006
+ spa-fra	newstest2012	0.61177	34.2	3003	78011
+ deu-fra	newstest2013	0.56475	29.9	3000	70037
+ deu-spa	newstest2013	0.57187	31.9	3000	70528
+ eng-fra	newstest2013	0.58938	33.3	3000	70037
+ eng-spa	newstest2013	0.59817	35.2	3000	70528
+ fra-spa	newstest2013	0.59482	35.1	3000	70528
+ spa-fra	newstest2013	0.59825	33.9	3000	70037
+ eng-fra	newstest2014	0.65438	40.2	3003	77306
+ eng-ron	newstest2016	0.59473	32.2	1999	48945
+ deu-fra	newstest2019	0.62831	35.9	1701	42509
+ deu-fra	newstest2020	0.60408	33.0	1619	36890
+ deu-fra	newstest2021	0.58913	31.3	1000	23757
+ deu-cat	ntrex128	0.55033	28.2	1997	53438
+ deu-fra	ntrex128	0.55854	28.5	1997	53481
+ deu-glg	ntrex128	0.55034	27.8	1997	50432
+ deu-ita	ntrex128	0.55733	26.6	1997	50759
+ deu-por	ntrex128	0.54208	26.0	1997	51631
+ deu-ron	ntrex128	0.52839	26.6	1997	53498
+ deu-spa	ntrex128	0.56966	30.8	1997	54107
+ eng-cat	ntrex128	0.61431	36.3	1997	53438
+ eng-fra	ntrex128	0.61695	35.5	1997	53481
+ eng-glg	ntrex128	0.62390	37.2	1997	50432
+ eng-ita	ntrex128	0.62209	36.1	1997	50759
+ eng-por	ntrex128	0.59859	33.5	1997	51631
+ eng-ron	ntrex128	0.58128	33.4	1997	53498
+ eng-spa	ntrex128	0.64099	40.3	1997	54107
+ fra-cat	ntrex128	0.55093	28.1	1997	53438
+ fra-glg	ntrex128	0.55325	28.0	1997	50432
+ fra-ita	ntrex128	0.56188	27.4	1997	50759
+ fra-por	ntrex128	0.54001	25.6	1997	51631
+ fra-ron	ntrex128	0.51853	24.8	1997	53498
+ fra-spa	ntrex128	0.57116	31.0	1997	54107
+ por-cat	ntrex128	0.57962	31.6	1997	53438
+ por-fra	ntrex128	0.56910	28.9	1997	53481
+ por-glg	ntrex128	0.57389	30.3	1997	50432
+ por-ita	ntrex128	0.58788	30.6	1997	50759
+ por-ron	ntrex128	0.54276	28.0	1997	53498
+ por-spa	ntrex128	0.59565	34.2	1997	54107
+ spa-cat	ntrex128	0.60605	34.0	1997	53438
+ spa-fra	ntrex128	0.57501	29.6	1997	53481
+ spa-glg	ntrex128	0.61300	34.4	1997	50432
+ spa-ita	ntrex128	0.57868	28.9	1997	50759
+ spa-por	ntrex128	0.56730	29.1	1997	51631
+ spa-ron	ntrex128	0.54222	27.9	1997	53498
+ deu-spa	tatoeba-test-v2020-07-28	0.67669	48.5	10000	77529
+ eng-fra	tatoeba-test-v2020-07-28	0.67371	50.0	10000	80769
+ eng-glg	tatoeba-test-v2020-07-28	0.64146	44.9	1008	7830
+ eng-lad	tatoeba-test-v2020-07-28	0.45044	24.8	629	3354
+ eng-lad_Latn	tatoeba-test-v2020-07-28	0.49208	26.7	582	3097
+ eng-lat	tatoeba-test-v2020-07-28	0.44396	17.8	10000	74905
+ eng-por	tatoeba-test-v2020-07-28	0.71571	52.6	10000	75371
+ eng-ron	tatoeba-test-v2020-07-28	0.66732	45.9	5000	36851
+ eng-spa	tatoeba-test-v2020-07-28	0.70968	53.3	10000	77311
+ fra-cat	tatoeba-test-v2020-07-28	0.65364	44.5	686	5214
+ fra-lat	tatoeba-test-v2020-07-28	0.30695	5.7	2917	26768
+ fra-spa	tatoeba-test-v2020-07-28	0.71822	54.4	10000	76002
+ por-fra	tatoeba-test-v2020-07-28	0.75293	59.9	10000	73898
+ por-glg	tatoeba-test-v2020-07-28	0.74574	58.3	430	2989
+ por-ita	tatoeba-test-v2020-07-28	0.75871	56.6	2500	18301
+ por-lat	tatoeba-test-v2020-07-28	0.36888	8.7	5000	49181
+ por-ron	tatoeba-test-v2020-07-28	0.69267	47.8	681	4529
+ por-spa	tatoeba-test-v2020-07-28	0.79367	64.6	10000	77915
+ spa-fra	tatoeba-test-v2020-07-28	0.72889	56.8	10000	80915
+ spa-lad	tatoeba-test-v2020-07-28	0.47372	25.3	207	1090
+ spa-lat	tatoeba-test-v2020-07-28	0.42922	16.9	3131	27725
+ spa-por	tatoeba-test-v2020-07-28	0.77158	60.9	10000	77911
+ spa-ron	tatoeba-test-v2020-07-28	0.67539	45.7	1961	12518
+ deu-cat	tatoeba-test-v2021-03-30	0.62929	43.6	727	5582
+ deu-fra	tatoeba-test-v2021-03-30	0.67633	50.0	11388	93145
+ deu-lad	tatoeba-test-v2021-03-30	0.39524	21.5	229	1183
+ deu-spa	tatoeba-test-v2021-03-30	0.67713	48.6	10138	78787
+ eng-fra	tatoeba-test-v2021-03-30	0.68058	50.8	10892	89269
+ eng-ita	tatoeba-test-v2021-03-30	0.71096	51.4	13443	89836
+ eng-lad	tatoeba-test-v2021-03-30	0.43752	24.3	781	4186
+ eng-lad_Latn	tatoeba-test-v2021-03-30	0.49904	27.3	696	3712
+ eng-lat	tatoeba-test-v2021-03-30	0.44360	17.7	10130	75612
+ eng-pms	tatoeba-test-v2021-03-30	0.38841	15.2	270	2249
+ eng-por	tatoeba-test-v2021-03-30	0.71793	52.9	11574	87572
+ eng-ron	tatoeba-test-v2021-03-30	0.67139	46.5	10019	73774
+ eng-spa	tatoeba-test-v2021-03-30	0.71501	54.1	11940	93423
+ fra-cat	tatoeba-test-v2021-03-30	0.65356	44.3	705	5398
+ fra-gcf	tatoeba-test-v2021-03-30	0.10334	0.1	1166	6131
+ fra-ita	tatoeba-test-v2021-03-30	0.72572	53.2	10041	61568
+ fra-lat	tatoeba-test-v2021-03-30	0.30695	5.7	2917	26768
+ fra-oci	tatoeba-test-v2021-03-30	0.34570	10.0	807	6057
+ fra-pcd	tatoeba-test-v2021-03-30	0.15219	1.2	268	1578
+ fra-por	tatoeba-test-v2021-03-30	0.70473	51.7	10151	72451
+ fra-ron	tatoeba-test-v2021-03-30	0.65512	44.1	1952	12383
+ fra-spa	tatoeba-test-v2021-03-30	0.71879	54.4	10122	77059
+ por-glg	tatoeba-test-v2021-03-30	0.74660	58.2	438	3048
+ por-ita	tatoeba-test-v2021-03-30	0.76153	57.2	5069	37479
+ por-lat_Latn	tatoeba-test-v2021-03-30	0.36907	8.8	5001	49208
+ por-por	tatoeba-test-v2021-03-30	0.71803	50.8	2500	19220
+ por-ron	tatoeba-test-v2021-03-30	0.69222	48.0	715	4725
+ por-spa	tatoeba-test-v2021-03-30	0.79309	64.6	10395	80844
+ spa-fra	tatoeba-test-v2021-03-30	0.72922	56.8	10122	82050
+ spa-lad_Latn	tatoeba-test-v2021-03-30	0.50634	28.1	242	1306
+ spa-lat	tatoeba-test-v2021-03-30	0.42922	16.9	3131	27725
+ spa-por	tatoeba-test-v2021-03-30	0.77034	60.7	10395	81022
+ spa-ron	tatoeba-test-v2021-03-30	0.67539	45.7	1961	12518
+ spa-spa	tatoeba-test-v2021-03-30	0.67994	51.1	2500	21469
+ deu-cat	tatoeba-test-v2021-08-07	0.63465	44.3	723	5539
+ deu-fra	tatoeba-test-v2021-08-07	0.68258	50.7	12418	102721
+ deu-ita	tatoeba-test-v2021-08-07	0.68502	47.4	10094	75504
+ deu-lad	tatoeba-test-v2021-08-07	0.38047	22.0	220	1130
+ deu-lat	tatoeba-test-v2021-08-07	0.42567	16.2	2016	10538
+ deu-por	tatoeba-test-v2021-08-07	0.63684	43.1	10000	81482
+ deu-ron	tatoeba-test-v2021-08-07	0.64207	42.6	1141	7432
+ deu-spa	tatoeba-test-v2021-08-07	0.68333	49.4	10521	82570
+ eng-cat	tatoeba-test-v2021-08-07	0.67724	49.1	1631	12344
+ eng-fra	tatoeba-test-v2021-08-07	0.68777	51.6	12681	106378
+ eng-glg	tatoeba-test-v2021-08-07	0.64530	45.2	1015	7881
+ eng-ita	tatoeba-test-v2021-08-07	0.72115	53.3	17320	116336
+ eng-lad	tatoeba-test-v2021-08-07	0.43857	24.2	768	4105
+ eng-lad_Latn	tatoeba-test-v2021-08-07	0.50848	27.6	672	3580
+ eng-lat	tatoeba-test-v2021-08-07	0.45710	20.0	10298	76510
+ eng-oci	tatoeba-test-v2021-08-07	0.30533	9.2	841	5219
+ eng-pms	tatoeba-test-v2021-08-07	0.38693	14.6	269	2247
+ eng-por	tatoeba-test-v2021-08-07	0.72159	53.4	13222	105265
+ eng-ron	tatoeba-test-v2021-08-07	0.67835	47.1	5508	40367
+ eng-spa	tatoeba-test-v2021-08-07	0.72875	55.8	16583	134710
+ fra-cat	tatoeba-test-v2021-08-07	0.65547	44.6	700	5342
+ fra-fra	tatoeba-test-v2021-08-07	0.61650	39.9	1000	7757
+ fra-gcf	tatoeba-test-v2021-08-07	0.10274	0.1	1164	6118
+ fra-ita	tatoeba-test-v2021-08-07	0.72739	53.5	10091	62060
+ fra-lat	tatoeba-test-v2021-08-07	0.30392	5.5	2915	26763
+ fra-oci	tatoeba-test-v2021-08-07	0.34408	10.2	806	6047
+ fra-pcd	tatoeba-test-v2021-08-07	0.15259	1.0	266	1569
+ fra-por	tatoeba-test-v2021-08-07	0.70655	52.0	10518	77650
+ fra-ron	tatoeba-test-v2021-08-07	0.65399	43.7	1925	12252
+ fra-spa	tatoeba-test-v2021-08-07	0.72083	54.8	10294	78406
+ por-cat	tatoeba-test-v2021-08-07	0.71178	52.0	747	6149
+ por-fra	tatoeba-test-v2021-08-07	0.75691	60.4	10518	80459
+ por-glg	tatoeba-test-v2021-08-07	0.74818	57.6	433	3016
+ por-ita	tatoeba-test-v2021-08-07	0.76899	58.7	3066	24897
+ por-lat	tatoeba-test-v2021-08-07	0.37430	9.7	5001	49190
+ por-lat_Latn	tatoeba-test-v2021-08-07	0.37435	9.7	5000	49182
+ por-por	tatoeba-test-v2021-08-07	0.71775	51.0	2500	19220
+ por-ron	tatoeba-test-v2021-08-07	0.69517	47.8	681	4521
+ por-spa	tatoeba-test-v2021-08-07	0.79442	64.9	10947	87335
+ spa-cat	tatoeba-test-v2021-08-07	0.81845	66.3	1534	12343
+ spa-fra	tatoeba-test-v2021-08-07	0.73277	57.4	10294	83501
+ spa-glg	tatoeba-test-v2021-08-07	0.76118	61.5	2121	16581
+ spa-ita	tatoeba-test-v2021-08-07	0.76742	59.5	5000	34515
+ spa-lad	tatoeba-test-v2021-08-07	0.43064	23.4	276	1464
+ spa-lad_Latn	tatoeba-test-v2021-08-07	0.50795	27.1	239	1254
+ spa-lat	tatoeba-test-v2021-08-07	0.44044	18.8	3129	27685
+ spa-por	tatoeba-test-v2021-08-07	0.76951	60.7	10947	87610
+ spa-ron	tatoeba-test-v2021-08-07	0.67782	45.9	1959	12503
+ spa-spa	tatoeba-test-v2021-08-07	0.67346	49.6	2500	21469
+ eng-fra	tico19-test	0.62989	40.1	2100	64661
+ eng-por	tico19-test	0.72708	50.0	2100	62729
+ eng-spa	tico19-test	0.73154	52.0	2100	66563
+ fra-por	tico19-test	0.58383	34.1	2100	62729
+ fra-spa	tico19-test	0.59581	37.0	2100	66563
+ por-fra	tico19-test	0.59798	34.4	2100	64661
+ por-spa	tico19-test	0.68332	45.4	2100	66563
+ spa-fra	tico19-test	0.60469	35.5	2100	64661
+ spa-por	tico19-test	0.67898	42.8	2100	62729
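The file above is plain whitespace-separated text; its columns are not labelled here, but by comparison with the metric tables in the README they can be read as language pair, test set, chrF, BLEU, number of sentences, and number of reference words (an assumption worth verifying against the model card). A minimal parser under that assumption:

```python
def parse_benchmarks(text):
    """Parse benchmark_results.txt rows into dicts.

    Column order (langpair, testset, chrF, BLEU, #sentences, #words)
    is assumed from the metric tables in the README.
    """
    rows = []
    for line in text.strip().splitlines():
        pair, testset, chrf, bleu, nsent, nwords = line.split()
        rows.append({"langpair": pair, "testset": testset,
                     "chrF": float(chrf), "BLEU": float(bleu),
                     "sentences": int(nsent), "words": int(nwords)})
    return rows

sample = "eng-por\ttico19-test\t0.72708\t50.0\t2100\t62729"
row = parse_benchmarks(sample)[0]
print(row["BLEU"])  # 50.0
```

Because `str.split()` with no argument splits on any run of whitespace, the same parser works whether the file uses tabs or spaces.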
benchmark_translations.zip ADDED
File without changes
config.json ADDED
@@ -0,0 +1,41 @@
+ {
+ "_name_or_path": "pytorch-models/opus-mt-tc-bible-big-deu_eng_fra_por_spa-itc",
+ "activation_dropout": 0.0,
+ "activation_function": "relu",
+ "architectures": [
+ "MarianMTModel"
+ ],
+ "attention_dropout": 0.0,
+ "bos_token_id": 0,
+ "classifier_dropout": 0.0,
+ "d_model": 1024,
+ "decoder_attention_heads": 16,
+ "decoder_ffn_dim": 4096,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 6,
+ "decoder_start_token_id": 45447,
+ "decoder_vocab_size": 45448,
+ "dropout": 0.1,
+ "encoder_attention_heads": 16,
+ "encoder_ffn_dim": 4096,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 6,
+ "eos_token_id": 458,
+ "forced_eos_token_id": null,
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "max_length": null,
+ "max_position_embeddings": 1024,
+ "model_type": "marian",
+ "normalize_embedding": false,
+ "num_beams": null,
+ "num_hidden_layers": 6,
+ "pad_token_id": 45447,
+ "scale_embedding": true,
+ "share_encoder_decoder_embeddings": true,
+ "static_position_embeddings": true,
+ "torch_dtype": "float32",
+ "transformers_version": "4.45.1",
+ "use_cache": true,
+ "vocab_size": 45448
+ }
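The config above pins down a Marian transformer-big: 6 encoder and 6 decoder layers, d_model 1024, 16 heads, FFN width 4096, and a shared 45448-token vocabulary with sinusoidal (non-learned) positions. As a rough sanity check, a stdlib-only parameter count from those numbers lands within bias/header rounding of the ~892 MB float32 checkpoint added below. The layer-norm and bias bookkeeping follows the standard Marian layout, which is an assumption, not something stated in this diff.

```python
# Rough parameter count implied by config.json (sketch; assumes the
# standard Marian layer layout with tied embeddings / lm_head and
# sinusoidal positions, so positions contribute no parameters).
d_model, ffn = 1024, 4096
enc_layers = dec_layers = 6
vocab = 45448

emb = vocab * d_model                        # shared source/target embeddings
attn = 4 * (d_model * d_model + d_model)     # q, k, v, out projections + biases
mlp = 2 * (d_model * ffn) + ffn + d_model    # two FFN matrices + biases
ln = 2 * d_model                             # one LayerNorm (weight + bias)

enc_layer = attn + mlp + 2 * ln              # self-attn + FFN
dec_layer = 2 * attn + mlp + 3 * ln          # self-attn + cross-attn + FFN

total = emb + enc_layers * enc_layer + dec_layers * dec_layer
print(f"{total:,} parameters, ~{total * 4 / 1e6:.0f} MB in float32")
# → 222,896,128 parameters, ~892 MB in float32
```

That is consistent with the 891,795,920-byte `model.safetensors` file below once the final-logits bias and safetensors header are accounted for.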
generation_config.json ADDED
@@ -0,0 +1,16 @@
+ {
+ "_from_model_config": true,
+ "bad_words_ids": [
+ [
+ 45447
+ ]
+ ],
+ "bos_token_id": 0,
+ "decoder_start_token_id": 45447,
+ "eos_token_id": 458,
+ "forced_eos_token_id": 458,
+ "max_length": 512,
+ "num_beams": 4,
+ "pad_token_id": 45447,
+ "transformers_version": "4.45.1"
+ }
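Besides the 4-beam search and the 512-token length cap, the generation defaults above ban the pad id 45447 via `bad_words_ids`, so the decoder can never emit padding mid-sequence. Mechanically, that ban amounts to forcing the banned id's score to -inf before each decoding step. A toy sketch with made-up scores, using id 3 to stand in for 45447:

```python
import math

BANNED = {3}  # stands in for pad id 45447 in the real generation config

def pick_next(scores):
    """One greedy decoding step: mask banned ids to -inf, then argmax."""
    masked = [(-math.inf if i in BANNED else s) for i, s in enumerate(scores)]
    return max(range(len(masked)), key=masked.__getitem__)

# The banned id has the highest raw score but can never be chosen.
print(pick_next([0.1, 0.2, 0.3, 9.9]))  # → 2
```

Beam search applies the same masking to every hypothesis; the scores here are invented purely for illustration.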
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:aacd099df16848b218c2db3558d3ce201d4471fa6b29fa50dfa08767ae86a53a
+ size 891795920
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9681c26060823d28aae2bd305feb16c19b6aeee0164e121167b5de2024358a83
+ size 891847173
source.spm ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:54be9a51c39e97d551ad21e4c7fd753ef18176f36e7a4546092cc038229a86de
+ size 806917
special_tokens_map.json ADDED
@@ -0,0 +1 @@
+ {"eos_token": "</s>", "unk_token": "<unk>", "pad_token": "<pad>"}
target.spm ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:62211bb05219df8623a25ba622e64d96f33fbb0675bf44668f2b0a476af5f895
+ size 805370
tokenizer_config.json ADDED
@@ -0,0 +1 @@
+ {"source_lang": "deu+eng+fra+por+spa", "target_lang": "itc", "unk_token": "<unk>", "eos_token": "</s>", "pad_token": "<pad>", "model_max_length": 512, "sp_model_kwargs": {}, "separate_vocabs": false, "special_tokens_map_file": null, "name_or_path": "marian-models/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30/deu+eng+fra+por+spa-itc", "tokenizer_class": "MarianTokenizer"}
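The tokenizer config above maps five source languages onto the multilingual `itc` (Italic) target group. Following the usual OPUS-MT convention for multi-target models, each input sentence should therefore start with a `>>xxx<<` target-language token; the set of valid tags comes from the model's vocabulary, and the repo id in the commented example is the assumed published name, not something stated in this diff.

```python
def tag(sentence: str, lang: str) -> str:
    """Prepend the OPUS-MT target-language token, e.g. >>ita<< or >>por<<."""
    return f">>{lang}<< {sentence}"

print(tag("How are you?", "ita"))  # → >>ita<< How are you?

# Hypothetical end-to-end use (downloads the ~892 MB checkpoint; repo id assumed):
# from transformers import pipeline
# pipe = pipeline("translation",
#                 model="Helsinki-NLP/opus-mt-tc-bible-big-deu_eng_fra_por_spa-itc")
# pipe(tag("How are you?", "ita"))
```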
vocab.json ADDED
The diff for this file is too large to render.