tiedeman committed
Commit
843954f
1 Parent(s): d3c8cb0

Initial commit

.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zip filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
+ *.spm filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,3782 @@
+ ---
+ library_name: transformers
+ language:
+ - anp
+ - as
+ - awa
+ - bal
+ - bho
+ - bn
+ - bpy
+ - de
+ - diq
+ - dv
+ - en
+ - es
+ - fa
+ - fr
+ - gbm
+ - glk
+ - gu
+ - hi
+ - hif
+ - hne
+ - hns
+ - jdt
+ - kok
+ - ks
+ - ku
+ - lah
+ - lrc
+ - mag
+ - mai
+ - mr
+ - mzn
+ - ne
+ - or
+ - os
+ - pa
+ - pal
+ - pi
+ - ps
+ - pt
+ - rhg
+ - rmy
+ - rom
+ - sa
+ - sd
+ - si
+ - skr
+ - syl
+ - tg
+ - tly
+ - ur
+ - zza
+ language_bcp47:
+ - ku_Latn
+
+ tags:
+ - translation
+ - opus-mt-tc-bible
+
+ license: apache-2.0
+ model-index:
+ - name: opus-mt-tc-bible-big-iir-deu_eng_fra_por_spa
+   results:
+   - task:
+       name: Translation asm-eng
+       type: translation
+       args: asm-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: asm-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 21.5
+     - name: chr-F
+       type: chrf
+       value: 0.48589
+   - task:
+       name: Translation asm-fra
+       type: translation
+       args: asm-fra
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: asm-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 11.7
+     - name: chr-F
+       type: chrf
+       value: 0.35394
+   - task:
+       name: Translation asm-por
+       type: translation
+       args: asm-por
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: asm-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 12.3
+     - name: chr-F
+       type: chrf
+       value: 0.37939
+   - task:
+       name: Translation awa-deu
+       type: translation
+       args: awa-deu
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: awa-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 16.0
+     - name: chr-F
+       type: chrf
+       value: 0.47071
+   - task:
+       name: Translation awa-eng
+       type: translation
+       args: awa-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: awa-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 26.6
+     - name: chr-F
+       type: chrf
+       value: 0.53069
+   - task:
+       name: Translation awa-fra
+       type: translation
+       args: awa-fra
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: awa-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 21.1
+     - name: chr-F
+       type: chrf
+       value: 0.49700
+   - task:
+       name: Translation awa-por
+       type: translation
+       args: awa-por
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: awa-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 21.8
+     - name: chr-F
+       type: chrf
+       value: 0.49950
+   - task:
+       name: Translation awa-spa
+       type: translation
+       args: awa-spa
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: awa-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 15.4
+     - name: chr-F
+       type: chrf
+       value: 0.43831
+   - task:
+       name: Translation ben-deu
+       type: translation
+       args: ben-deu
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: ben-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 17.0
+     - name: chr-F
+       type: chrf
+       value: 0.47434
+   - task:
+       name: Translation ben-eng
+       type: translation
+       args: ben-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: ben-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 31.4
+     - name: chr-F
+       type: chrf
+       value: 0.58408
+   - task:
+       name: Translation ben-fra
+       type: translation
+       args: ben-fra
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: ben-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 23.2
+     - name: chr-F
+       type: chrf
+       value: 0.50930
+   - task:
+       name: Translation ben-por
+       type: translation
+       args: ben-por
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: ben-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 22.4
+     - name: chr-F
+       type: chrf
+       value: 0.50661
+   - task:
+       name: Translation ben-spa
+       type: translation
+       args: ben-spa
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: ben-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 15.7
+     - name: chr-F
+       type: chrf
+       value: 0.44485
+   - task:
+       name: Translation bho-deu
+       type: translation
+       args: bho-deu
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: bho-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 12.8
+     - name: chr-F
+       type: chrf
+       value: 0.42463
+   - task:
+       name: Translation bho-eng
+       type: translation
+       args: bho-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: bho-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 22.6
+     - name: chr-F
+       type: chrf
+       value: 0.50545
+   - task:
+       name: Translation bho-fra
+       type: translation
+       args: bho-fra
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: bho-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 17.4
+     - name: chr-F
+       type: chrf
+       value: 0.45264
+   - task:
+       name: Translation bho-por
+       type: translation
+       args: bho-por
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: bho-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 17.0
+     - name: chr-F
+       type: chrf
+       value: 0.44737
+   - task:
+       name: Translation bho-spa
+       type: translation
+       args: bho-spa
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: bho-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 13.0
+     - name: chr-F
+       type: chrf
+       value: 0.40585
+   - task:
+       name: Translation ckb-deu
+       type: translation
+       args: ckb-deu
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: ckb-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 13.6
+     - name: chr-F
+       type: chrf
+       value: 0.42110
+   - task:
+       name: Translation ckb-eng
+       type: translation
+       args: ckb-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: ckb-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 24.7
+     - name: chr-F
+       type: chrf
+       value: 0.50543
+   - task:
+       name: Translation ckb-fra
+       type: translation
+       args: ckb-fra
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: ckb-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 19.1
+     - name: chr-F
+       type: chrf
+       value: 0.45847
+   - task:
+       name: Translation ckb-por
+       type: translation
+       args: ckb-por
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: ckb-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 17.8
+     - name: chr-F
+       type: chrf
+       value: 0.44567
+   - task:
+       name: Translation ckb-spa
+       type: translation
+       args: ckb-spa
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: ckb-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 13.0
+     - name: chr-F
+       type: chrf
+       value: 0.39955
+   - task:
+       name: Translation guj-deu
+       type: translation
+       args: guj-deu
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: guj-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 17.3
+     - name: chr-F
+       type: chrf
+       value: 0.46758
+   - task:
+       name: Translation guj-eng
+       type: translation
+       args: guj-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: guj-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 34.4
+     - name: chr-F
+       type: chrf
+       value: 0.61139
+   - task:
+       name: Translation guj-fra
+       type: translation
+       args: guj-fra
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: guj-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 22.5
+     - name: chr-F
+       type: chrf
+       value: 0.50349
+   - task:
+       name: Translation guj-por
+       type: translation
+       args: guj-por
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: guj-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 22.4
+     - name: chr-F
+       type: chrf
+       value: 0.49828
+   - task:
+       name: Translation guj-spa
+       type: translation
+       args: guj-spa
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: guj-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 15.5
+     - name: chr-F
+       type: chrf
+       value: 0.44472
+   - task:
+       name: Translation hin-deu
+       type: translation
+       args: hin-deu
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: hin-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 20.8
+     - name: chr-F
+       type: chrf
+       value: 0.50772
+   - task:
+       name: Translation hin-eng
+       type: translation
+       args: hin-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: hin-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 37.3
+     - name: chr-F
+       type: chrf
+       value: 0.63234
+   - task:
+       name: Translation hin-fra
+       type: translation
+       args: hin-fra
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: hin-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 26.5
+     - name: chr-F
+       type: chrf
+       value: 0.53933
+   - task:
+       name: Translation hin-por
+       type: translation
+       args: hin-por
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: hin-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 26.1
+     - name: chr-F
+       type: chrf
+       value: 0.53523
+   - task:
+       name: Translation hin-spa
+       type: translation
+       args: hin-spa
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: hin-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 17.4
+     - name: chr-F
+       type: chrf
+       value: 0.46183
+   - task:
+       name: Translation hne-deu
+       type: translation
+       args: hne-deu
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: hne-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 19.0
+     - name: chr-F
+       type: chrf
+       value: 0.49946
+   - task:
+       name: Translation hne-eng
+       type: translation
+       args: hne-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: hne-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 38.1
+     - name: chr-F
+       type: chrf
+       value: 0.63640
+   - task:
+       name: Translation hne-fra
+       type: translation
+       args: hne-fra
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: hne-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 25.7
+     - name: chr-F
+       type: chrf
+       value: 0.53419
+   - task:
+       name: Translation hne-por
+       type: translation
+       args: hne-por
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: hne-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 25.9
+     - name: chr-F
+       type: chrf
+       value: 0.53735
+   - task:
+       name: Translation hne-spa
+       type: translation
+       args: hne-spa
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: hne-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 16.9
+     - name: chr-F
+       type: chrf
+       value: 0.45610
+   - task:
+       name: Translation kmr-eng
+       type: translation
+       args: kmr-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: kmr-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 12.8
+     - name: chr-F
+       type: chrf
+       value: 0.37717
+   - task:
+       name: Translation mag-deu
+       type: translation
+       args: mag-deu
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: mag-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 20.0
+     - name: chr-F
+       type: chrf
+       value: 0.50681
+   - task:
+       name: Translation mag-eng
+       type: translation
+       args: mag-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: mag-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 38.0
+     - name: chr-F
+       type: chrf
+       value: 0.63966
+   - task:
+       name: Translation mag-fra
+       type: translation
+       args: mag-fra
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: mag-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 25.9
+     - name: chr-F
+       type: chrf
+       value: 0.53810
+   - task:
+       name: Translation mag-por
+       type: translation
+       args: mag-por
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: mag-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 26.6
+     - name: chr-F
+       type: chrf
+       value: 0.54065
+   - task:
+       name: Translation mag-spa
+       type: translation
+       args: mag-spa
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: mag-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 17.1
+     - name: chr-F
+       type: chrf
+       value: 0.46131
+   - task:
+       name: Translation mai-deu
+       type: translation
+       args: mai-deu
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: mai-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 16.8
+     - name: chr-F
+       type: chrf
+       value: 0.47686
+   - task:
+       name: Translation mai-eng
+       type: translation
+       args: mai-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: mai-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 30.2
+     - name: chr-F
+       type: chrf
+       value: 0.57552
+   - task:
+       name: Translation mai-fra
+       type: translation
+       args: mai-fra
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: mai-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 22.4
+     - name: chr-F
+       type: chrf
+       value: 0.50909
+   - task:
+       name: Translation mai-por
+       type: translation
+       args: mai-por
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: mai-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 22.9
+     - name: chr-F
+       type: chrf
+       value: 0.51249
+   - task:
+       name: Translation mai-spa
+       type: translation
+       args: mai-spa
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: mai-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 15.9
+     - name: chr-F
+       type: chrf
+       value: 0.44694
+   - task:
+       name: Translation mar-deu
+       type: translation
+       args: mar-deu
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: mar-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 14.8
+     - name: chr-F
+       type: chrf
+       value: 0.45295
+   - task:
+       name: Translation mar-eng
+       type: translation
+       args: mar-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: mar-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 31.0
+     - name: chr-F
+       type: chrf
+       value: 0.58203
+   - task:
+       name: Translation mar-fra
+       type: translation
+       args: mar-fra
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: mar-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 20.4
+     - name: chr-F
+       type: chrf
+       value: 0.48254
+   - task:
+       name: Translation mar-por
+       type: translation
+       args: mar-por
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: mar-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 20.4
+     - name: chr-F
+       type: chrf
+       value: 0.48368
+   - task:
+       name: Translation mar-spa
+       type: translation
+       args: mar-spa
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: mar-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 14.7
+     - name: chr-F
+       type: chrf
+       value: 0.42799
+   - task:
+       name: Translation npi-deu
+       type: translation
+       args: npi-deu
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: npi-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 17.2
+     - name: chr-F
+       type: chrf
+       value: 0.47267
+   - task:
+       name: Translation npi-eng
+       type: translation
+       args: npi-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: npi-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 32.5
+     - name: chr-F
+       type: chrf
+       value: 0.59559
+   - task:
+       name: Translation npi-fra
+       type: translation
+       args: npi-fra
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: npi-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 22.5
+     - name: chr-F
+       type: chrf
+       value: 0.50869
+   - task:
+       name: Translation npi-por
+       type: translation
+       args: npi-por
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: npi-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 22.5
+     - name: chr-F
+       type: chrf
+       value: 0.50900
+   - task:
+       name: Translation npi-spa
+       type: translation
+       args: npi-spa
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: npi-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 15.6
+     - name: chr-F
+       type: chrf
+       value: 0.44304
+   - task:
+       name: Translation pan-deu
+       type: translation
+       args: pan-deu
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: pan-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 18.6
+     - name: chr-F
+       type: chrf
+       value: 0.48342
+   - task:
+       name: Translation pan-eng
+       type: translation
+       args: pan-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: pan-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 33.4
+     - name: chr-F
+       type: chrf
+       value: 0.60328
+   - task:
+       name: Translation pan-fra
+       type: translation
+       args: pan-fra
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: pan-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 24.4
+     - name: chr-F
+       type: chrf
+       value: 0.51953
+   - task:
+       name: Translation pan-por
+       type: translation
+       args: pan-por
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: pan-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 23.9
+     - name: chr-F
+       type: chrf
+       value: 0.51428
+   - task:
+       name: Translation pan-spa
+       type: translation
+       args: pan-spa
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: pan-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 16.3
+     - name: chr-F
+       type: chrf
+       value: 0.44615
+   - task:
+       name: Translation pes-deu
+       type: translation
+       args: pes-deu
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: pes-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 21.0
+     - name: chr-F
+       type: chrf
+       value: 0.51124
+   - task:
+       name: Translation pes-eng
+       type: translation
+       args: pes-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: pes-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 33.7
+     - name: chr-F
+       type: chrf
+       value: 0.60538
+   - task:
+       name: Translation pes-fra
+       type: translation
+       args: pes-fra
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: pes-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 27.8
+     - name: chr-F
+       type: chrf
+       value: 0.55157
+   - task:
+       name: Translation pes-por
+       type: translation
+       args: pes-por
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: pes-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 26.6
+     - name: chr-F
+       type: chrf
+       value: 0.54372
+   - task:
+       name: Translation pes-spa
+       type: translation
+       args: pes-spa
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: pes-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 18.8
+     - name: chr-F
+       type: chrf
+       value: 0.47561
+   - task:
+       name: Translation prs-deu
+       type: translation
+       args: prs-deu
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: prs-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 20.7
+     - name: chr-F
+       type: chrf
+       value: 0.50273
+   - task:
+       name: Translation prs-eng
+       type: translation
+       args: prs-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: prs-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 34.5
+     - name: chr-F
+       type: chrf
+       value: 0.60144
+   - task:
+       name: Translation prs-fra
+       type: translation
+       args: prs-fra
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: prs-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 27.0
+     - name: chr-F
+       type: chrf
+       value: 0.54241
+   - task:
+       name: Translation prs-por
+       type: translation
+       args: prs-por
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: prs-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 26.6
+     - name: chr-F
+       type: chrf
+       value: 0.53562
+   - task:
+       name: Translation prs-spa
+       type: translation
+       args: prs-spa
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: prs-spa
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 18.1
+     - name: chr-F
+       type: chrf
+       value: 0.46497
+   - task:
+       name: Translation san-eng
+       type: translation
+       args: san-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: san-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 11.5
+     - name: chr-F
+       type: chrf
+       value: 0.36840
+   - task:
+       name: Translation sin-deu
+       type: translation
+       args: sin-deu
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: sin-deu
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 14.7
+     - name: chr-F
+       type: chrf
+       value: 0.45041
+   - task:
+       name: Translation sin-eng
+       type: translation
+       args: sin-eng
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: sin-eng
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 26.3
+     - name: chr-F
+       type: chrf
+       value: 0.54060
+   - task:
+       name: Translation sin-fra
+       type: translation
+       args: sin-fra
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: sin-fra
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 19.9
+     - name: chr-F
+       type: chrf
+       value: 0.48163
+   - task:
+       name: Translation sin-por
+       type: translation
+       args: sin-por
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: sin-por
+     metrics:
+     - name: BLEU
+       type: bleu
+       value: 19.6
+     - name: chr-F
+       type: chrf
+       value: 0.47780
+   - task:
+       name: Translation sin-spa
+       type: translation
+       args: sin-spa
+     dataset:
+       name: flores200-devtest
+       type: flores200-devtest
+       args: sin-spa
+     metrics:
+     - name: BLEU
1261
+ type: bleu
1262
+ value: 14.2
1263
+ - name: chr-F
1264
+ type: chrf
1265
+ value: 0.42546
1266
+ - task:
1267
+ name: Translation tgk-deu
1268
+ type: translation
1269
+ args: tgk-deu
1270
+ dataset:
1271
+ name: flores200-devtest
1272
+ type: flores200-devtest
1273
+ args: tgk-deu
1274
+ metrics:
1275
+ - name: BLEU
1276
+ type: bleu
1277
+ value: 15.6
1278
+ - name: chr-F
1279
+ type: chrf
1280
+ value: 0.45203
1281
+ - task:
1282
+ name: Translation tgk-eng
1283
+ type: translation
1284
+ args: tgk-eng
1285
+ dataset:
1286
+ name: flores200-devtest
1287
+ type: flores200-devtest
1288
+ args: tgk-eng
1289
+ metrics:
1290
+ - name: BLEU
1291
+ type: bleu
1292
+ value: 25.3
1293
+ - name: chr-F
1294
+ type: chrf
1295
+ value: 0.53740
1296
+ - task:
1297
+ name: Translation tgk-fra
1298
+ type: translation
1299
+ args: tgk-fra
1300
+ dataset:
1301
+ name: flores200-devtest
1302
+ type: flores200-devtest
1303
+ args: tgk-fra
1304
+ metrics:
1305
+ - name: BLEU
1306
+ type: bleu
1307
+ value: 22.1
1308
+ - name: chr-F
1309
+ type: chrf
1310
+ value: 0.50153
1311
+ - task:
1312
+ name: Translation tgk-por
1313
+ type: translation
1314
+ args: tgk-por
1315
+ dataset:
1316
+ name: flores200-devtest
1317
+ type: flores200-devtest
1318
+ args: tgk-por
1319
+ metrics:
1320
+ - name: BLEU
1321
+ type: bleu
1322
+ value: 21.9
1323
+ - name: chr-F
1324
+ type: chrf
1325
+ value: 0.49378
1326
+ - task:
1327
+ name: Translation tgk-spa
1328
+ type: translation
1329
+ args: tgk-spa
1330
+ dataset:
1331
+ name: flores200-devtest
1332
+ type: flores200-devtest
1333
+ args: tgk-spa
1334
+ metrics:
1335
+ - name: BLEU
1336
+ type: bleu
1337
+ value: 15.9
1338
+ - name: chr-F
1339
+ type: chrf
1340
+ value: 0.44099
1341
+ - task:
1342
+ name: Translation urd-deu
1343
+ type: translation
1344
+ args: urd-deu
1345
+ dataset:
1346
+ name: flores200-devtest
1347
+ type: flores200-devtest
1348
+ args: urd-deu
1349
+ metrics:
1350
+ - name: BLEU
1351
+ type: bleu
1352
+ value: 17.2
1353
+ - name: chr-F
1354
+ type: chrf
1355
+ value: 0.46894
1356
+ - task:
1357
+ name: Translation urd-eng
1358
+ type: translation
1359
+ args: urd-eng
1360
+ dataset:
1361
+ name: flores200-devtest
1362
+ type: flores200-devtest
1363
+ args: urd-eng
1364
+ metrics:
1365
+ - name: BLEU
1366
+ type: bleu
1367
+ value: 29.3
1368
+ - name: chr-F
1369
+ type: chrf
1370
+ value: 0.56967
1371
+ - task:
1372
+ name: Translation urd-fra
1373
+ type: translation
1374
+ args: urd-fra
1375
+ dataset:
1376
+ name: flores200-devtest
1377
+ type: flores200-devtest
1378
+ args: urd-fra
1379
+ metrics:
1380
+ - name: BLEU
1381
+ type: bleu
1382
+ value: 22.6
1383
+ - name: chr-F
1384
+ type: chrf
1385
+ value: 0.50616
1386
+ - task:
1387
+ name: Translation urd-por
1388
+ type: translation
1389
+ args: urd-por
1390
+ dataset:
1391
+ name: flores200-devtest
1392
+ type: flores200-devtest
1393
+ args: urd-por
1394
+ metrics:
1395
+ - name: BLEU
1396
+ type: bleu
1397
+ value: 21.7
1398
+ - name: chr-F
1399
+ type: chrf
1400
+ value: 0.49398
1401
+ - task:
1402
+ name: Translation urd-spa
1403
+ type: translation
1404
+ args: urd-spa
1405
+ dataset:
1406
+ name: flores200-devtest
1407
+ type: flores200-devtest
1408
+ args: urd-spa
1409
+ metrics:
1410
+ - name: BLEU
1411
+ type: bleu
1412
+ value: 15.4
1413
+ - name: chr-F
1414
+ type: chrf
1415
+ value: 0.43800
1416
+ - task:
+ name: Translation asm-fra
+ type: translation
+ args: asm-fra
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: asm fra devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 11.6
+ - name: chr-F
+ type: chrf
+ value: 0.35655
+ - task:
+ name: Translation asm-por
+ type: translation
+ args: asm-por
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: asm por devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 12.5
+ - name: chr-F
+ type: chrf
+ value: 0.38149
+ - task:
+ name: Translation ben-deu
+ type: translation
+ args: ben-deu
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: ben deu devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 16.4
+ - name: chr-F
+ type: chrf
+ value: 0.46873
+ - task:
+ name: Translation ben-eng
+ type: translation
+ args: ben-eng
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: ben eng devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 30.0
+ - name: chr-F
+ type: chrf
+ value: 0.57508
+ - task:
+ name: Translation ben-spa
+ type: translation
+ args: ben-spa
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: ben spa devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 15.1
+ - name: chr-F
+ type: chrf
+ value: 0.44010
+ - task:
+ name: Translation ckb-deu
+ type: translation
+ args: ckb-deu
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: ckb deu devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 13.0
+ - name: chr-F
+ type: chrf
+ value: 0.41546
+ - task:
+ name: Translation ckb-por
+ type: translation
+ args: ckb-por
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: ckb por devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 17.6
+ - name: chr-F
+ type: chrf
+ value: 0.44178
+ - task:
+ name: Translation ckb-spa
+ type: translation
+ args: ckb-spa
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: ckb spa devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 12.7
+ - name: chr-F
+ type: chrf
+ value: 0.39703
+ - task:
+ name: Translation fas-por
+ type: translation
+ args: fas-por
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: fas por devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 26.1
+ - name: chr-F
+ type: chrf
+ value: 0.54077
+ - task:
+ name: Translation guj-deu
+ type: translation
+ args: guj-deu
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: guj deu devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 16.5
+ - name: chr-F
+ type: chrf
+ value: 0.45906
+ - task:
+ name: Translation guj-spa
+ type: translation
+ args: guj-spa
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: guj spa devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 15.2
+ - name: chr-F
+ type: chrf
+ value: 0.43928
+ - task:
+ name: Translation hin-eng
+ type: translation
+ args: hin-eng
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: hin eng devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 36.6
+ - name: chr-F
+ type: chrf
+ value: 0.62807
+ - task:
+ name: Translation hin-por
+ type: translation
+ args: hin-por
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: hin por devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 25.1
+ - name: chr-F
+ type: chrf
+ value: 0.52825
+ - task:
+ name: Translation mar-deu
+ type: translation
+ args: mar-deu
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: mar deu devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 14.8
+ - name: chr-F
+ type: chrf
+ value: 0.44767
+ - task:
+ name: Translation npi-deu
+ type: translation
+ args: npi-deu
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: npi deu devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 15.9
+ - name: chr-F
+ type: chrf
+ value: 0.46178
+ - task:
+ name: Translation pan-fra
+ type: translation
+ args: pan-fra
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: pan fra devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 23.4
+ - name: chr-F
+ type: chrf
+ value: 0.50909
+ - task:
+ name: Translation pan-por
+ type: translation
+ args: pan-por
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: pan por devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 23.0
+ - name: chr-F
+ type: chrf
+ value: 0.50634
+ - task:
+ name: Translation pus-deu
+ type: translation
+ args: pus-deu
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: pus deu devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 13.5
+ - name: chr-F
+ type: chrf
+ value: 0.42645
+ - task:
+ name: Translation pus-fra
+ type: translation
+ args: pus-fra
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: pus fra devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 18.0
+ - name: chr-F
+ type: chrf
+ value: 0.45719
+ - task:
+ name: Translation urd-deu
+ type: translation
+ args: urd-deu
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: urd deu devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 16.5
+ - name: chr-F
+ type: chrf
+ value: 0.46102
+ - task:
+ name: Translation urd-eng
+ type: translation
+ args: urd-eng
+ dataset:
+ name: flores101-devtest
+ type: flores_101
+ args: urd eng devtest
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 28.4
+ - name: chr-F
+ type: chrf
+ value: 0.56356
+ - task:
+ name: Translation ben-deu
+ type: translation
+ args: ben-deu
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: ben-deu
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 15.0
+ - name: chr-F
+ type: chrf
+ value: 0.45551
+ - task:
+ name: Translation ben-eng
+ type: translation
+ args: ben-eng
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: ben-eng
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 29.0
+ - name: chr-F
+ type: chrf
+ value: 0.56878
+ - task:
+ name: Translation ben-fra
+ type: translation
+ args: ben-fra
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: ben-fra
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 18.6
+ - name: chr-F
+ type: chrf
+ value: 0.47077
+ - task:
+ name: Translation ben-por
+ type: translation
+ args: ben-por
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: ben-por
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 17.1
+ - name: chr-F
+ type: chrf
+ value: 0.46049
+ - task:
+ name: Translation ben-spa
+ type: translation
+ args: ben-spa
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: ben-spa
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 21.3
+ - name: chr-F
+ type: chrf
+ value: 0.48833
+ - task:
+ name: Translation fas-deu
+ type: translation
+ args: fas-deu
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: fas-deu
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 16.1
+ - name: chr-F
+ type: chrf
+ value: 0.46991
+ - task:
+ name: Translation fas-eng
+ type: translation
+ args: fas-eng
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: fas-eng
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 25.9
+ - name: chr-F
+ type: chrf
+ value: 0.55119
+ - task:
+ name: Translation fas-fra
+ type: translation
+ args: fas-fra
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: fas-fra
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 21.2
+ - name: chr-F
+ type: chrf
+ value: 0.49626
+ - task:
+ name: Translation fas-por
+ type: translation
+ args: fas-por
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: fas-por
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 18.6
+ - name: chr-F
+ type: chrf
+ value: 0.47499
+ - task:
+ name: Translation fas-spa
+ type: translation
+ args: fas-spa
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: fas-spa
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 22.8
+ - name: chr-F
+ type: chrf
+ value: 0.50178
+ - task:
+ name: Translation guj-deu
+ type: translation
+ args: guj-deu
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: guj-deu
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 14.3
+ - name: chr-F
+ type: chrf
+ value: 0.43998
+ - task:
+ name: Translation guj-eng
+ type: translation
+ args: guj-eng
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: guj-eng
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 31.0
+ - name: chr-F
+ type: chrf
+ value: 0.58481
+ - task:
+ name: Translation guj-fra
+ type: translation
+ args: guj-fra
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: guj-fra
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 17.3
+ - name: chr-F
+ type: chrf
+ value: 0.45468
+ - task:
+ name: Translation guj-por
+ type: translation
+ args: guj-por
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: guj-por
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 15.8
+ - name: chr-F
+ type: chrf
+ value: 0.44223
+ - task:
+ name: Translation guj-spa
+ type: translation
+ args: guj-spa
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: guj-spa
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 20.7
+ - name: chr-F
+ type: chrf
+ value: 0.47798
+ - task:
+ name: Translation hin-deu
+ type: translation
+ args: hin-deu
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: hin-deu
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 15.0
+ - name: chr-F
+ type: chrf
+ value: 0.46580
+ - task:
+ name: Translation hin-eng
+ type: translation
+ args: hin-eng
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: hin-eng
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 31.6
+ - name: chr-F
+ type: chrf
+ value: 0.59832
+ - task:
+ name: Translation hin-fra
+ type: translation
+ args: hin-fra
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: hin-fra
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 19.5
+ - name: chr-F
+ type: chrf
+ value: 0.48328
+ - task:
+ name: Translation hin-por
+ type: translation
+ args: hin-por
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: hin-por
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 17.8
+ - name: chr-F
+ type: chrf
+ value: 0.46833
+ - task:
+ name: Translation hin-spa
+ type: translation
+ args: hin-spa
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: hin-spa
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 21.9
+ - name: chr-F
+ type: chrf
+ value: 0.49517
+ - task:
+ name: Translation kmr-eng
+ type: translation
+ args: kmr-eng
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: kmr-eng
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 12.9
+ - name: chr-F
+ type: chrf
+ value: 0.37956
+ - task:
+ name: Translation kmr-spa
+ type: translation
+ args: kmr-spa
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: kmr-spa
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 10.8
+ - name: chr-F
+ type: chrf
+ value: 0.35021
+ - task:
+ name: Translation mar-deu
+ type: translation
+ args: mar-deu
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: mar-deu
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 13.5
+ - name: chr-F
+ type: chrf
+ value: 0.43713
+ - task:
+ name: Translation mar-eng
+ type: translation
+ args: mar-eng
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: mar-eng
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 27.4
+ - name: chr-F
+ type: chrf
+ value: 0.55132
+ - task:
+ name: Translation mar-fra
+ type: translation
+ args: mar-fra
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: mar-fra
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 16.9
+ - name: chr-F
+ type: chrf
+ value: 0.44797
+ - task:
+ name: Translation mar-por
+ type: translation
+ args: mar-por
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: mar-por
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 16.1
+ - name: chr-F
+ type: chrf
+ value: 0.44342
+ - task:
+ name: Translation mar-spa
+ type: translation
+ args: mar-spa
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: mar-spa
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 19.7
+ - name: chr-F
+ type: chrf
+ value: 0.46950
+ - task:
+ name: Translation nep-deu
+ type: translation
+ args: nep-deu
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: nep-deu
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 13.5
+ - name: chr-F
+ type: chrf
+ value: 0.43568
+ - task:
+ name: Translation nep-eng
+ type: translation
+ args: nep-eng
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: nep-eng
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 28.8
+ - name: chr-F
+ type: chrf
+ value: 0.55954
+ - task:
+ name: Translation nep-fra
+ type: translation
+ args: nep-fra
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: nep-fra
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 16.9
+ - name: chr-F
+ type: chrf
+ value: 0.45083
+ - task:
+ name: Translation nep-por
+ type: translation
+ args: nep-por
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: nep-por
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 16.0
+ - name: chr-F
+ type: chrf
+ value: 0.44458
+ - task:
+ name: Translation nep-spa
+ type: translation
+ args: nep-spa
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: nep-spa
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 19.4
+ - name: chr-F
+ type: chrf
+ value: 0.46832
+ - task:
+ name: Translation pan-deu
+ type: translation
+ args: pan-deu
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: pan-deu
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 14.6
+ - name: chr-F
+ type: chrf
+ value: 0.44327
+ - task:
+ name: Translation pan-eng
+ type: translation
+ args: pan-eng
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: pan-eng
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 30.5
+ - name: chr-F
+ type: chrf
+ value: 0.57665
+ - task:
+ name: Translation pan-fra
+ type: translation
+ args: pan-fra
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: pan-fra
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 17.7
+ - name: chr-F
+ type: chrf
+ value: 0.45815
+ - task:
+ name: Translation pan-por
+ type: translation
+ args: pan-por
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: pan-por
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 16.3
+ - name: chr-F
+ type: chrf
+ value: 0.44608
+ - task:
+ name: Translation pan-spa
+ type: translation
+ args: pan-spa
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: pan-spa
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 20.0
+ - name: chr-F
+ type: chrf
+ value: 0.47289
+ - task:
+ name: Translation prs-deu
+ type: translation
+ args: prs-deu
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: prs-deu
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 14.6
+ - name: chr-F
+ type: chrf
+ value: 0.45067
+ - task:
+ name: Translation prs-eng
+ type: translation
+ args: prs-eng
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: prs-eng
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 26.6
+ - name: chr-F
+ type: chrf
+ value: 0.54767
+ - task:
+ name: Translation prs-fra
+ type: translation
+ args: prs-fra
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: prs-fra
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 19.3
+ - name: chr-F
+ type: chrf
+ value: 0.47453
+ - task:
+ name: Translation prs-por
+ type: translation
+ args: prs-por
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: prs-por
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 17.1
+ - name: chr-F
+ type: chrf
+ value: 0.45843
+ - task:
+ name: Translation prs-spa
+ type: translation
+ args: prs-spa
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: prs-spa
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 20.9
+ - name: chr-F
+ type: chrf
+ value: 0.48317
+ - task:
+ name: Translation pus-deu
+ type: translation
+ args: pus-deu
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: pus-deu
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 10.3
+ - name: chr-F
+ type: chrf
+ value: 0.38989
+ - task:
+ name: Translation pus-eng
+ type: translation
+ args: pus-eng
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: pus-eng
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 17.6
+ - name: chr-F
+ type: chrf
+ value: 0.44698
+ - task:
+ name: Translation pus-fra
+ type: translation
+ args: pus-fra
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: pus-fra
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 12.6
+ - name: chr-F
+ type: chrf
+ value: 0.39872
+ - task:
+ name: Translation pus-por
+ type: translation
+ args: pus-por
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: pus-por
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 11.9
+ - name: chr-F
+ type: chrf
+ value: 0.38923
+ - task:
+ name: Translation pus-spa
+ type: translation
+ args: pus-spa
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: pus-spa
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 14.6
+ - name: chr-F
+ type: chrf
+ value: 0.41132
+ - task:
+ name: Translation sin-deu
+ type: translation
+ args: sin-deu
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: sin-deu
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 12.5
+ - name: chr-F
+ type: chrf
+ value: 0.42541
+ - task:
+ name: Translation sin-eng
+ type: translation
+ args: sin-eng
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: sin-eng
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 23.5
+ - name: chr-F
+ type: chrf
+ value: 0.51853
+ - task:
+ name: Translation sin-fra
+ type: translation
+ args: sin-fra
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: sin-fra
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 15.9
+ - name: chr-F
+ type: chrf
+ value: 0.44099
+ - task:
+ name: Translation sin-por
+ type: translation
+ args: sin-por
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: sin-por
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 14.4
+ - name: chr-F
+ type: chrf
+ value: 0.43010
+ - task:
+ name: Translation sin-spa
+ type: translation
+ args: sin-spa
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: sin-spa
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 18.4
+ - name: chr-F
+ type: chrf
+ value: 0.46225
+ - task:
+ name: Translation tgk_Cyrl-deu
+ type: translation
+ args: tgk_Cyrl-deu
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: tgk_Cyrl-deu
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 11.4
+ - name: chr-F
+ type: chrf
+ value: 0.40368
+ - task:
+ name: Translation tgk_Cyrl-eng
+ type: translation
+ args: tgk_Cyrl-eng
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: tgk_Cyrl-eng
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 18.2
+ - name: chr-F
+ type: chrf
+ value: 0.47132
+ - task:
+ name: Translation tgk_Cyrl-fra
+ type: translation
+ args: tgk_Cyrl-fra
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: tgk_Cyrl-fra
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 15.8
+ - name: chr-F
+ type: chrf
+ value: 0.43311
+ - task:
+ name: Translation tgk_Cyrl-por
+ type: translation
+ args: tgk_Cyrl-por
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: tgk_Cyrl-por
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 13.8
+ - name: chr-F
+ type: chrf
+ value: 0.42095
+ - task:
+ name: Translation tgk_Cyrl-spa
+ type: translation
+ args: tgk_Cyrl-spa
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: tgk_Cyrl-spa
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 17.3
+ - name: chr-F
+ type: chrf
+ value: 0.44279
+ - task:
+ name: Translation urd-deu
+ type: translation
+ args: urd-deu
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: urd-deu
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 15.5
+ - name: chr-F
+ type: chrf
+ value: 0.45708
+ - task:
+ name: Translation urd-eng
+ type: translation
+ args: urd-eng
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: urd-eng
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 28.5
+ - name: chr-F
+ type: chrf
+ value: 0.56560
+ - task:
+ name: Translation urd-fra
+ type: translation
+ args: urd-fra
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: urd-fra
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 19.0
+ - name: chr-F
+ type: chrf
+ value: 0.47536
+ - task:
+ name: Translation urd-por
+ type: translation
+ args: urd-por
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: urd-por
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 16.7
+ - name: chr-F
+ type: chrf
+ value: 0.45911
+ - task:
+ name: Translation urd-spa
+ type: translation
+ args: urd-spa
+ dataset:
+ name: ntrex128
+ type: ntrex128
+ args: urd-spa
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 21.6
+ - name: chr-F
+ type: chrf
+ value: 0.48986
+ - task:
+ name: Translation awa-eng
+ type: translation
+ args: awa-eng
+ dataset:
+ name: tatoeba-test-v2021-08-07
+ type: tatoeba_mt
+ args: awa-eng
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 40.8
+ - name: chr-F
+ type: chrf
+ value: 0.60240
+ - task:
+ name: Translation ben-eng
+ type: translation
+ args: ben-eng
+ dataset:
+ name: tatoeba-test-v2021-08-07
+ type: tatoeba_mt
+ args: ben-eng
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 49.3
+ - name: chr-F
+ type: chrf
+ value: 0.64471
+ - task:
+ name: Translation fas-deu
+ type: translation
+ args: fas-deu
+ dataset:
+ name: tatoeba-test-v2021-08-07
+ type: tatoeba_mt
+ args: fas-deu
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 34.7
+ - name: chr-F
+ type: chrf
+ value: 0.58631
+ - task:
+ name: Translation fas-eng
+ type: translation
+ args: fas-eng
+ dataset:
+ name: tatoeba-test-v2021-08-07
+ type: tatoeba_mt
+ args: fas-eng
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 41.8
+ - name: chr-F
+ type: chrf
+ value: 0.59868
+ - task:
+ name: Translation fas-fra
+ type: translation
+ args: fas-fra
+ dataset:
+ name: tatoeba-test-v2021-08-07
+ type: tatoeba_mt
+ args: fas-fra
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 35.8
+ - name: chr-F
+ type: chrf
+ value: 0.57181
+ - task:
+ name: Translation hin-eng
+ type: translation
+ args: hin-eng
+ dataset:
+ name: tatoeba-test-v2021-08-07
+ type: tatoeba_mt
+ args: hin-eng
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 49.5
+ - name: chr-F
+ type: chrf
+ value: 0.65417
+ - task:
+ name: Translation kur_Latn-deu
+ type: translation
+ args: kur_Latn-deu
+ dataset:
+ name: tatoeba-test-v2021-08-07
+ type: tatoeba_mt
+ args: kur_Latn-deu
+ metrics:
+ - name: BLEU
+ type: bleu
+ value: 27.0
+ - name: chr-F
+ type: chrf
+ value: 0.42694
+ - task:
+ name: Translation kur_Latn-eng
+ type: translation
+ args: kur_Latn-eng
+ dataset:
+ name: tatoeba-test-v2021-08-07
+ type: tatoeba_mt
+ args: kur_Latn-eng
2774
+ metrics:
2775
+ - name: BLEU
2776
+ type: bleu
2777
+ value: 25.6
2778
+ - name: chr-F
2779
+ type: chrf
2780
+ value: 0.42721
2781
+ - task:
2782
+ name: Translation mar-eng
2783
+ type: translation
2784
+ args: mar-eng
2785
+ dataset:
2786
+ name: tatoeba-test-v2021-08-07
2787
+ type: tatoeba_mt
2788
+ args: mar-eng
2789
+ metrics:
2790
+ - name: BLEU
2791
+ type: bleu
2792
+ value: 48.3
2793
+ - name: chr-F
2794
+ type: chrf
2795
+ value: 0.64493
2796
+ - task:
2797
+ name: Translation multi-multi
2798
+ type: translation
2799
+ args: multi-multi
2800
+ dataset:
2801
+ name: tatoeba-test-v2020-07-28-v2023-09-26
2802
+ type: tatoeba_mt
2803
+ args: multi-multi
2804
+ metrics:
2805
+ - name: BLEU
2806
+ type: bleu
2807
+ value: 42.8
2808
+ - name: chr-F
2809
+ type: chrf
2810
+ value: 0.60665
2811
+ - task:
2812
+ name: Translation pes-eng
2813
+ type: translation
2814
+ args: pes-eng
2815
+ dataset:
2816
+ name: tatoeba-test-v2021-08-07
2817
+ type: tatoeba_mt
2818
+ args: pes-eng
2819
+ metrics:
2820
+ - name: BLEU
2821
+ type: bleu
2822
+ value: 41.9
2823
+ - name: chr-F
2824
+ type: chrf
2825
+ value: 0.59959
2826
+ - task:
2827
+ name: Translation urd-eng
2828
+ type: translation
2829
+ args: urd-eng
2830
+ dataset:
2831
+ name: tatoeba-test-v2021-08-07
2832
+ type: tatoeba_mt
2833
+ args: urd-eng
2834
+ metrics:
2835
+ - name: BLEU
2836
+ type: bleu
2837
+ value: 35.4
2838
+ - name: chr-F
2839
+ type: chrf
2840
+ value: 0.53679
2841
+ - task:
2842
+ name: Translation ben-eng
2843
+ type: translation
2844
+ args: ben-eng
2845
+ dataset:
2846
+ name: tico19-test
2847
+ type: tico19-test
2848
+ args: ben-eng
2849
+ metrics:
2850
+ - name: BLEU
2851
+ type: bleu
2852
+ value: 38.7
2853
+ - name: chr-F
2854
+ type: chrf
2855
+ value: 0.64578
2856
+ - task:
2857
+ name: Translation ben-fra
2858
+ type: translation
2859
+ args: ben-fra
2860
+ dataset:
2861
+ name: tico19-test
2862
+ type: tico19-test
2863
+ args: ben-fra
2864
+ metrics:
2865
+ - name: BLEU
2866
+ type: bleu
2867
+ value: 22.8
2868
+ - name: chr-F
2869
+ type: chrf
2870
+ value: 0.50165
2871
+ - task:
2872
+ name: Translation ben-por
2873
+ type: translation
2874
+ args: ben-por
2875
+ dataset:
2876
+ name: tico19-test
2877
+ type: tico19-test
2878
+ args: ben-por
2879
+ metrics:
2880
+ - name: BLEU
2881
+ type: bleu
2882
+ value: 27.7
2883
+ - name: chr-F
2884
+ type: chrf
2885
+ value: 0.55662
2886
+ - task:
2887
+ name: Translation ben-spa
2888
+ type: translation
2889
+ args: ben-spa
2890
+ dataset:
2891
+ name: tico19-test
2892
+ type: tico19-test
2893
+ args: ben-spa
2894
+ metrics:
2895
+ - name: BLEU
2896
+ type: bleu
2897
+ value: 29.6
2898
+ - name: chr-F
2899
+ type: chrf
2900
+ value: 0.56795
2901
+ - task:
2902
+ name: Translation ckb-eng
2903
+ type: translation
2904
+ args: ckb-eng
2905
+ dataset:
2906
+ name: tico19-test
2907
+ type: tico19-test
2908
+ args: ckb-eng
2909
+ metrics:
2910
+ - name: BLEU
2911
+ type: bleu
2912
+ value: 27.4
2913
+ - name: chr-F
2914
+ type: chrf
2915
+ value: 0.51623
2916
+ - task:
2917
+ name: Translation ckb-fra
2918
+ type: translation
2919
+ args: ckb-fra
2920
+ dataset:
2921
+ name: tico19-test
2922
+ type: tico19-test
2923
+ args: ckb-fra
2924
+ metrics:
2925
+ - name: BLEU
2926
+ type: bleu
2927
+ value: 17.1
2928
+ - name: chr-F
2929
+ type: chrf
2930
+ value: 0.42405
2931
+ - task:
2932
+ name: Translation ckb-por
2933
+ type: translation
2934
+ args: ckb-por
2935
+ dataset:
2936
+ name: tico19-test
2937
+ type: tico19-test
2938
+ args: ckb-por
2939
+ metrics:
2940
+ - name: BLEU
2941
+ type: bleu
2942
+ value: 19.0
2943
+ - name: chr-F
2944
+ type: chrf
2945
+ value: 0.45405
2946
+ - task:
2947
+ name: Translation ckb-spa
2948
+ type: translation
2949
+ args: ckb-spa
2950
+ dataset:
2951
+ name: tico19-test
2952
+ type: tico19-test
2953
+ args: ckb-spa
2954
+ metrics:
2955
+ - name: BLEU
2956
+ type: bleu
2957
+ value: 21.7
2958
+ - name: chr-F
2959
+ type: chrf
2960
+ value: 0.46976
2961
+ - task:
2962
+ name: Translation fas-eng
2963
+ type: translation
2964
+ args: fas-eng
2965
+ dataset:
2966
+ name: tico19-test
2967
+ type: tico19-test
2968
+ args: fas-eng
2969
+ metrics:
2970
+ - name: BLEU
2971
+ type: bleu
2972
+ value: 34.2
2973
+ - name: chr-F
2974
+ type: chrf
2975
+ value: 0.62079
2976
+ - task:
2977
+ name: Translation fas-fra
2978
+ type: translation
2979
+ args: fas-fra
2980
+ dataset:
2981
+ name: tico19-test
2982
+ type: tico19-test
2983
+ args: fas-fra
2984
+ metrics:
2985
+ - name: BLEU
2986
+ type: bleu
2987
+ value: 24.4
2988
+ - name: chr-F
2989
+ type: chrf
2990
+ value: 0.52041
2991
+ - task:
2992
+ name: Translation fas-por
2993
+ type: translation
2994
+ args: fas-por
2995
+ dataset:
2996
+ name: tico19-test
2997
+ type: tico19-test
2998
+ args: fas-por
2999
+ metrics:
3000
+ - name: BLEU
3001
+ type: bleu
3002
+ value: 29.2
3003
+ - name: chr-F
3004
+ type: chrf
3005
+ value: 0.56780
3006
+ - task:
3007
+ name: Translation fas-spa
3008
+ type: translation
3009
+ args: fas-spa
3010
+ dataset:
3011
+ name: tico19-test
3012
+ type: tico19-test
3013
+ args: fas-spa
3014
+ metrics:
3015
+ - name: BLEU
3016
+ type: bleu
3017
+ value: 32.3
3018
+ - name: chr-F
3019
+ type: chrf
3020
+ value: 0.58248
3021
+ - task:
3022
+ name: Translation hin-eng
3023
+ type: translation
3024
+ args: hin-eng
3025
+ dataset:
3026
+ name: tico19-test
3027
+ type: tico19-test
3028
+ args: hin-eng
3029
+ metrics:
3030
+ - name: BLEU
3031
+ type: bleu
3032
+ value: 46.8
3033
+ - name: chr-F
3034
+ type: chrf
3035
+ value: 0.70535
3036
+ - task:
3037
+ name: Translation hin-fra
3038
+ type: translation
3039
+ args: hin-fra
3040
+ dataset:
3041
+ name: tico19-test
3042
+ type: tico19-test
3043
+ args: hin-fra
3044
+ metrics:
3045
+ - name: BLEU
3046
+ type: bleu
3047
+ value: 26.6
3048
+ - name: chr-F
3049
+ type: chrf
3050
+ value: 0.53833
3051
+ - task:
3052
+ name: Translation hin-por
3053
+ type: translation
3054
+ args: hin-por
3055
+ dataset:
3056
+ name: tico19-test
3057
+ type: tico19-test
3058
+ args: hin-por
3059
+ metrics:
3060
+ - name: BLEU
3061
+ type: bleu
3062
+ value: 33.2
3063
+ - name: chr-F
3064
+ type: chrf
3065
+ value: 0.60246
3066
+ - task:
3067
+ name: Translation hin-spa
3068
+ type: translation
3069
+ args: hin-spa
3070
+ dataset:
3071
+ name: tico19-test
3072
+ type: tico19-test
3073
+ args: hin-spa
3074
+ metrics:
3075
+ - name: BLEU
3076
+ type: bleu
3077
+ value: 35.7
3078
+ - name: chr-F
3079
+ type: chrf
3080
+ value: 0.61504
3081
+ - task:
3082
+ name: Translation mar-eng
3083
+ type: translation
3084
+ args: mar-eng
3085
+ dataset:
3086
+ name: tico19-test
3087
+ type: tico19-test
3088
+ args: mar-eng
3089
+ metrics:
3090
+ - name: BLEU
3091
+ type: bleu
3092
+ value: 31.4
3093
+ - name: chr-F
3094
+ type: chrf
3095
+ value: 0.59247
3096
+ - task:
3097
+ name: Translation mar-fra
3098
+ type: translation
3099
+ args: mar-fra
3100
+ dataset:
3101
+ name: tico19-test
3102
+ type: tico19-test
3103
+ args: mar-fra
3104
+ metrics:
3105
+ - name: BLEU
3106
+ type: bleu
3107
+ value: 19.3
3108
+ - name: chr-F
3109
+ type: chrf
3110
+ value: 0.46895
3111
+ - task:
3112
+ name: Translation mar-por
3113
+ type: translation
3114
+ args: mar-por
3115
+ dataset:
3116
+ name: tico19-test
3117
+ type: tico19-test
3118
+ args: mar-por
3119
+ metrics:
3120
+ - name: BLEU
3121
+ type: bleu
3122
+ value: 23.8
3123
+ - name: chr-F
3124
+ type: chrf
3125
+ value: 0.51945
3126
+ - task:
3127
+ name: Translation mar-spa
3128
+ type: translation
3129
+ args: mar-spa
3130
+ dataset:
3131
+ name: tico19-test
3132
+ type: tico19-test
3133
+ args: mar-spa
3134
+ metrics:
3135
+ - name: BLEU
3136
+ type: bleu
3137
+ value: 26.2
3138
+ - name: chr-F
3139
+ type: chrf
3140
+ value: 0.52914
3141
+ - task:
3142
+ name: Translation nep-eng
3143
+ type: translation
3144
+ args: nep-eng
3145
+ dataset:
3146
+ name: tico19-test
3147
+ type: tico19-test
3148
+ args: nep-eng
3149
+ metrics:
3150
+ - name: BLEU
3151
+ type: bleu
3152
+ value: 40.1
3153
+ - name: chr-F
3154
+ type: chrf
3155
+ value: 0.65865
3156
+ - task:
3157
+ name: Translation nep-fra
3158
+ type: translation
3159
+ args: nep-fra
3160
+ dataset:
3161
+ name: tico19-test
3162
+ type: tico19-test
3163
+ args: nep-fra
3164
+ metrics:
3165
+ - name: BLEU
3166
+ type: bleu
3167
+ value: 23.2
3168
+ - name: chr-F
3169
+ type: chrf
3170
+ value: 0.50473
3171
+ - task:
3172
+ name: Translation nep-por
3173
+ type: translation
3174
+ args: nep-por
3175
+ dataset:
3176
+ name: tico19-test
3177
+ type: tico19-test
3178
+ args: nep-por
3179
+ metrics:
3180
+ - name: BLEU
3181
+ type: bleu
3182
+ value: 28.0
3183
+ - name: chr-F
3184
+ type: chrf
3185
+ value: 0.56185
3186
+ - task:
3187
+ name: Translation nep-spa
3188
+ type: translation
3189
+ args: nep-spa
3190
+ dataset:
3191
+ name: tico19-test
3192
+ type: tico19-test
3193
+ args: nep-spa
3194
+ metrics:
3195
+ - name: BLEU
3196
+ type: bleu
3197
+ value: 30.2
3198
+ - name: chr-F
3199
+ type: chrf
3200
+ value: 0.57270
3201
+ - task:
3202
+ name: Translation prs-eng
3203
+ type: translation
3204
+ args: prs-eng
3205
+ dataset:
3206
+ name: tico19-test
3207
+ type: tico19-test
3208
+ args: prs-eng
3209
+ metrics:
3210
+ - name: BLEU
3211
+ type: bleu
3212
+ value: 32.1
3213
+ - name: chr-F
3214
+ type: chrf
3215
+ value: 0.59536
3216
+ - task:
3217
+ name: Translation prs-fra
3218
+ type: translation
3219
+ args: prs-fra
3220
+ dataset:
3221
+ name: tico19-test
3222
+ type: tico19-test
3223
+ args: prs-fra
3224
+ metrics:
3225
+ - name: BLEU
3226
+ type: bleu
3227
+ value: 23.1
3228
+ - name: chr-F
3229
+ type: chrf
3230
+ value: 0.50044
3231
+ - task:
3232
+ name: Translation prs-por
3233
+ type: translation
3234
+ args: prs-por
3235
+ dataset:
3236
+ name: tico19-test
3237
+ type: tico19-test
3238
+ args: prs-por
3239
+ metrics:
3240
+ - name: BLEU
3241
+ type: bleu
3242
+ value: 27.3
3243
+ - name: chr-F
3244
+ type: chrf
3245
+ value: 0.54448
3246
+ - task:
3247
+ name: Translation prs-spa
3248
+ type: translation
3249
+ args: prs-spa
3250
+ dataset:
3251
+ name: tico19-test
3252
+ type: tico19-test
3253
+ args: prs-spa
3254
+ metrics:
3255
+ - name: BLEU
3256
+ type: bleu
3257
+ value: 30.2
3258
+ - name: chr-F
3259
+ type: chrf
3260
+ value: 0.56311
3261
+ - task:
3262
+ name: Translation pus-eng
3263
+ type: translation
3264
+ args: pus-eng
3265
+ dataset:
3266
+ name: tico19-test
3267
+ type: tico19-test
3268
+ args: pus-eng
3269
+ metrics:
3270
+ - name: BLEU
3271
+ type: bleu
3272
+ value: 31.4
3273
+ - name: chr-F
3274
+ type: chrf
3275
+ value: 0.56711
3276
+ - task:
3277
+ name: Translation pus-fra
3278
+ type: translation
3279
+ args: pus-fra
3280
+ dataset:
3281
+ name: tico19-test
3282
+ type: tico19-test
3283
+ args: pus-fra
3284
+ metrics:
3285
+ - name: BLEU
3286
+ type: bleu
3287
+ value: 19.4
3288
+ - name: chr-F
3289
+ type: chrf
3290
+ value: 0.45951
3291
+ - task:
3292
+ name: Translation pus-por
3293
+ type: translation
3294
+ args: pus-por
3295
+ dataset:
3296
+ name: tico19-test
3297
+ type: tico19-test
3298
+ args: pus-por
3299
+ metrics:
3300
+ - name: BLEU
3301
+ type: bleu
3302
+ value: 23.7
3303
+ - name: chr-F
3304
+ type: chrf
3305
+ value: 0.50225
3306
+ - task:
3307
+ name: Translation pus-spa
3308
+ type: translation
3309
+ args: pus-spa
3310
+ dataset:
3311
+ name: tico19-test
3312
+ type: tico19-test
3313
+ args: pus-spa
3314
+ metrics:
3315
+ - name: BLEU
3316
+ type: bleu
3317
+ value: 25.4
3318
+ - name: chr-F
3319
+ type: chrf
3320
+ value: 0.51246
3321
+ - task:
3322
+ name: Translation urd-eng
3323
+ type: translation
3324
+ args: urd-eng
3325
+ dataset:
3326
+ name: tico19-test
3327
+ type: tico19-test
3328
+ args: urd-eng
3329
+ metrics:
3330
+ - name: BLEU
3331
+ type: bleu
3332
+ value: 30.8
3333
+ - name: chr-F
3334
+ type: chrf
3335
+ value: 0.57786
3336
+ - task:
3337
+ name: Translation urd-fra
3338
+ type: translation
3339
+ args: urd-fra
3340
+ dataset:
3341
+ name: tico19-test
3342
+ type: tico19-test
3343
+ args: urd-fra
3344
+ metrics:
3345
+ - name: BLEU
3346
+ type: bleu
3347
+ value: 20.1
3348
+ - name: chr-F
3349
+ type: chrf
3350
+ value: 0.46807
3351
+ - task:
3352
+ name: Translation urd-por
3353
+ type: translation
3354
+ args: urd-por
3355
+ dataset:
3356
+ name: tico19-test
3357
+ type: tico19-test
3358
+ args: urd-por
3359
+ metrics:
3360
+ - name: BLEU
3361
+ type: bleu
3362
+ value: 24.1
3363
+ - name: chr-F
3364
+ type: chrf
3365
+ value: 0.51567
3366
+ - task:
3367
+ name: Translation urd-spa
3368
+ type: translation
3369
+ args: urd-spa
3370
+ dataset:
3371
+ name: tico19-test
3372
+ type: tico19-test
3373
+ args: urd-spa
3374
+ metrics:
3375
+ - name: BLEU
3376
+ type: bleu
3377
+ value: 26.4
3378
+ - name: chr-F
3379
+ type: chrf
3380
+ value: 0.52820
3381
+ - task:
3382
+ name: Translation hin-eng
3383
+ type: translation
3384
+ args: hin-eng
3385
+ dataset:
3386
+ name: newstest2014
3387
+ type: wmt-2014-news
3388
+ args: hin-eng
3389
+ metrics:
3390
+ - name: BLEU
3391
+ type: bleu
3392
+ value: 30.3
3393
+ - name: chr-F
3394
+ type: chrf
3395
+ value: 0.59024
3396
+ - task:
3397
+ name: Translation guj-eng
3398
+ type: translation
3399
+ args: guj-eng
3400
+ dataset:
3401
+ name: newstest2019
3402
+ type: wmt-2019-news
3403
+ args: guj-eng
3404
+ metrics:
3405
+ - name: BLEU
3406
+ type: bleu
3407
+ value: 27.2
3408
+ - name: chr-F
3409
+ type: chrf
3410
+ value: 0.53977
3411
+ - task:
3412
+ name: Translation pus-eng
3413
+ type: translation
3414
+ args: pus-eng
3415
+ dataset:
3416
+ name: newstest2020
3417
+ type: wmt-2020-news
3418
+ args: pus-eng
3419
+ metrics:
3420
+ - name: BLEU
3421
+ type: bleu
3422
+ value: 14.3
3423
+ - name: chr-F
3424
+ type: chrf
3425
+ value: 0.39447
3426
+ ---
+ # opus-mt-tc-bible-big-iir-deu_eng_fra_por_spa
+
+ ## Table of Contents
+ - [Model Details](#model-details)
+ - [Uses](#uses)
+ - [Risks, Limitations and Biases](#risks-limitations-and-biases)
+ - [How to Get Started With the Model](#how-to-get-started-with-the-model)
+ - [Training](#training)
+ - [Evaluation](#evaluation)
+ - [Citation Information](#citation-information)
+ - [Acknowledgements](#acknowledgements)
+
+ ## Model Details
+
+ Neural machine translation model for translating from Indo-Iranian languages (iir) to German, English, French, Portuguese and Spanish (deu+eng+fra+por+spa).
+
+ This model is part of the [OPUS-MT project](https://github.com/Helsinki-NLP/Opus-MT), an effort to make neural machine translation models widely available and accessible for many languages in the world. All models were originally trained with [Marian NMT](https://marian-nmt.github.io/), an efficient NMT implementation written in pure C++, and converted to PyTorch using the transformers library by Hugging Face. Training data is taken from [OPUS](https://opus.nlpl.eu/) and training pipelines use the procedures of [OPUS-MT-train](https://github.com/Helsinki-NLP/Opus-MT-train).
+
+ **Model Description:**
+ - **Developed by:** Language Technology Research Group at the University of Helsinki
+ - **Model Type:** Translation (transformer-big)
+ - **Release**: 2024-05-30
+ - **License:** Apache-2.0
+ - **Language(s):**
+   - Source Language(s): anp asm awa bal ben bho bpy ckb diq div dty fas gbm glk guj hif hin hne hns jdt kas kmr kok kur lah lrc mag mai mar mzn nep npi ori oss pal pan pes pli prs pus rhg rmy rom san sdh sin skr snd syl tgk tly urd zza
+   - Target Language(s): deu eng fra por spa
+   - Valid Target Language Labels: >>deu<< >>eng<< >>fra<< >>por<< >>spa<< >>xxx<<
+ - **Original Model**: [opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/iir-deu+eng+fra+por+spa/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30.zip)
+ - **Resources for more information:**
+   - [OPUS-MT dashboard](https://opus.nlpl.eu/dashboard/index.php?pkg=opusmt&test=all&scoreslang=all&chart=standard&model=Tatoeba-MT-models/iir-deu%2Beng%2Bfra%2Bpor%2Bspa/opusTCv20230926max50%2Bbt%2Bjhubc_transformer-big_2024-05-30)
+   - [OPUS-MT-train GitHub Repo](https://github.com/Helsinki-NLP/OPUS-MT-train)
+   - [More information about MarianNMT models in the transformers library](https://huggingface.co/docs/transformers/model_doc/marian)
+   - [Tatoeba Translation Challenge](https://github.com/Helsinki-NLP/Tatoeba-Challenge/)
+   - [HPLT bilingual data v1 (as part of the Tatoeba Translation Challenge dataset)](https://hplt-project.org/datasets/v1)
+   - [A massively parallel Bible corpus](https://aclanthology.org/L14-1215/)
+
+ This is a multilingual translation model with multiple target languages. A sentence-initial language token is required in the form of `>>id<<` (id = a valid target language ID), e.g. `>>deu<<`.
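Prepending the target-language token can be done programmatically. A minimal sketch, assuming the target labels listed above; the `tag_for_target` helper is hypothetical and not part of this repository:

```python
# Hypothetical helper (illustration only): prepend the required >>id<<
# target-language token to each input sentence before tokenization.
VALID_TARGETS = {"deu", "eng", "fra", "por", "spa"}

def tag_for_target(sentences, target):
    """Return sentences prefixed with the >>target<< label expected by the model."""
    if target not in VALID_TARGETS:
        raise ValueError(f"unsupported target language: {target}")
    return [f">>{target}<< {s}" for s in sentences]

print(tag_for_target(["How are you?"], "deu")[0])  # >>deu<< How are you?
```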
+
+ ## Uses
+
+ This model can be used for translation and text-to-text generation.
+
+ ## Risks, Limitations and Biases
+
+ **CONTENT WARNING: Readers should be aware that the model is trained on various public data sets that may contain content that is disturbing, offensive, and can propagate historical and current stereotypes.**
+
+ Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).
+
+ ## How to Get Started With the Model
+
+ A short code example:
+
+ ```python
+ from transformers import MarianMTModel, MarianTokenizer
+
+ src_text = [
+     ">>deu<< Replace this with text in an accepted source language.",
+     ">>spa<< This is the second sentence."
+ ]
+
+ model_name = "pytorch-models/opus-mt-tc-bible-big-iir-deu_eng_fra_por_spa"
+ tokenizer = MarianTokenizer.from_pretrained(model_name)
+ model = MarianMTModel.from_pretrained(model_name)
+ translated = model.generate(**tokenizer(src_text, return_tensors="pt", padding=True))
+
+ for t in translated:
+     print(tokenizer.decode(t, skip_special_tokens=True))
+ ```
+
+ You can also use OPUS-MT models with the transformers pipelines, for example:
+
+ ```python
+ from transformers import pipeline
+ pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-bible-big-iir-deu_eng_fra_por_spa")
+ print(pipe(">>deu<< Replace this with text in an accepted source language."))
+ ```
+
+ ## Training
+
+ - **Data**: opusTCv20230926max50+bt+jhubc ([source](https://github.com/Helsinki-NLP/Tatoeba-Challenge))
+ - **Pre-processing**: SentencePiece (spm32k,spm32k)
+ - **Model Type:** transformer-big
+ - **Original MarianNMT Model**: [opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/iir-deu+eng+fra+por+spa/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30.zip)
+ - **Training Scripts**: [GitHub Repo](https://github.com/Helsinki-NLP/OPUS-MT-train)
+
+ ## Evaluation
+
+ * [Model scores at the OPUS-MT dashboard](https://opus.nlpl.eu/dashboard/index.php?pkg=opusmt&test=all&scoreslang=all&chart=standard&model=Tatoeba-MT-models/iir-deu%2Beng%2Bfra%2Bpor%2Bspa/opusTCv20230926max50%2Bbt%2Bjhubc_transformer-big_2024-05-30)
+ * test set translations: [opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-29.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/iir-deu+eng+fra+por+spa/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-29.test.txt)
+ * test set scores: [opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-29.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/iir-deu+eng+fra+por+spa/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-29.eval.txt)
+ * benchmark results: [benchmark_results.txt](benchmark_results.txt)
+ * benchmark output: [benchmark_translations.zip](benchmark_translations.zip)
+
+ | langpair | testset | chr-F | BLEU | #sent | #words |
+ |----------|---------|-------|-------|-------|--------|
+ | awa-eng | tatoeba-test-v2021-08-07 | 0.60240 | 40.8 | 279 | 1335 |
+ | ben-eng | tatoeba-test-v2021-08-07 | 0.64471 | 49.3 | 2500 | 13978 |
+ | fas-deu | tatoeba-test-v2021-08-07 | 0.58631 | 34.7 | 3185 | 25590 |
+ | fas-eng | tatoeba-test-v2021-08-07 | 0.59868 | 41.8 | 3762 | 31480 |
+ | fas-fra | tatoeba-test-v2021-08-07 | 0.57181 | 35.8 | 376 | 3377 |
+ | hin-eng | tatoeba-test-v2021-08-07 | 0.65417 | 49.5 | 5000 | 33943 |
+ | kur_Latn-deu | tatoeba-test-v2021-08-07 | 0.42694 | 27.0 | 223 | 1323 |
+ | kur_Latn-eng | tatoeba-test-v2021-08-07 | 0.42721 | 25.6 | 290 | 1708 |
+ | mar-eng | tatoeba-test-v2021-08-07 | 0.64493 | 48.3 | 10396 | 67527 |
+ | pes-eng | tatoeba-test-v2021-08-07 | 0.59959 | 41.9 | 3757 | 31411 |
+ | urd-eng | tatoeba-test-v2021-08-07 | 0.53679 | 35.4 | 1663 | 12029 |
+ | ben-deu | flores101-devtest | 0.46873 | 16.4 | 1012 | 25094 |
+ | ben-eng | flores101-devtest | 0.57508 | 30.0 | 1012 | 24721 |
+ | ben-spa | flores101-devtest | 0.44010 | 15.1 | 1012 | 29199 |
+ | ckb-deu | flores101-devtest | 0.41546 | 13.0 | 1012 | 25094 |
+ | ckb-por | flores101-devtest | 0.44178 | 17.6 | 1012 | 26519 |
+ | fas-por | flores101-devtest | 0.54077 | 26.1 | 1012 | 26519 |
+ | guj-deu | flores101-devtest | 0.45906 | 16.5 | 1012 | 25094 |
+ | guj-spa | flores101-devtest | 0.43928 | 15.2 | 1012 | 29199 |
+ | hin-eng | flores101-devtest | 0.62807 | 36.6 | 1012 | 24721 |
+ | hin-por | flores101-devtest | 0.52825 | 25.1 | 1012 | 26519 |
+ | mar-deu | flores101-devtest | 0.44767 | 14.8 | 1012 | 25094 |
+ | npi-deu | flores101-devtest | 0.46178 | 15.9 | 1012 | 25094 |
+ | pan-fra | flores101-devtest | 0.50909 | 23.4 | 1012 | 28343 |
+ | pan-por | flores101-devtest | 0.50634 | 23.0 | 1012 | 26519 |
+ | pus-deu | flores101-devtest | 0.42645 | 13.5 | 1012 | 25094 |
+ | pus-fra | flores101-devtest | 0.45719 | 18.0 | 1012 | 28343 |
+ | urd-deu | flores101-devtest | 0.46102 | 16.5 | 1012 | 25094 |
+ | urd-eng | flores101-devtest | 0.56356 | 28.4 | 1012 | 24721 |
+ | asm-eng | flores200-devtest | 0.48589 | 21.5 | 1012 | 24721 |
+ | awa-deu | flores200-devtest | 0.47071 | 16.0 | 1012 | 25094 |
+ | awa-eng | flores200-devtest | 0.53069 | 26.6 | 1012 | 24721 |
+ | awa-fra | flores200-devtest | 0.49700 | 21.1 | 1012 | 28343 |
+ | awa-por | flores200-devtest | 0.49950 | 21.8 | 1012 | 26519 |
+ | awa-spa | flores200-devtest | 0.43831 | 15.4 | 1012 | 29199 |
+ | ben-deu | flores200-devtest | 0.47434 | 17.0 | 1012 | 25094 |
+ | ben-eng | flores200-devtest | 0.58408 | 31.4 | 1012 | 24721 |
+ | ben-fra | flores200-devtest | 0.50930 | 23.2 | 1012 | 28343 |
+ | ben-por | flores200-devtest | 0.50661 | 22.4 | 1012 | 26519 |
+ | ben-spa | flores200-devtest | 0.44485 | 15.7 | 1012 | 29199 |
+ | bho-deu | flores200-devtest | 0.42463 | 12.8 | 1012 | 25094 |
+ | bho-eng | flores200-devtest | 0.50545 | 22.6 | 1012 | 24721 |
+ | bho-fra | flores200-devtest | 0.45264 | 17.4 | 1012 | 28343 |
+ | bho-por | flores200-devtest | 0.44737 | 17.0 | 1012 | 26519 |
+ | bho-spa | flores200-devtest | 0.40585 | 13.0 | 1012 | 29199 |
+ | ckb-deu | flores200-devtest | 0.42110 | 13.6 | 1012 | 25094 |
+ | ckb-eng | flores200-devtest | 0.50543 | 24.7 | 1012 | 24721 |
+ | ckb-fra | flores200-devtest | 0.45847 | 19.1 | 1012 | 28343 |
+ | ckb-por | flores200-devtest | 0.44567 | 17.8 | 1012 | 26519 |
+ | guj-deu | flores200-devtest | 0.46758 | 17.3 | 1012 | 25094 |
+ | guj-eng | flores200-devtest | 0.61139 | 34.4 | 1012 | 24721 |
+ | guj-fra | flores200-devtest | 0.50349 | 22.5 | 1012 | 28343 |
+ | guj-por | flores200-devtest | 0.49828 | 22.4 | 1012 | 26519 |
+ | guj-spa | flores200-devtest | 0.44472 | 15.5 | 1012 | 29199 |
+ | hin-deu | flores200-devtest | 0.50772 | 20.8 | 1012 | 25094 |
+ | hin-eng | flores200-devtest | 0.63234 | 37.3 | 1012 | 24721 |
+ | hin-fra | flores200-devtest | 0.53933 | 26.5 | 1012 | 28343 |
+ | hin-por | flores200-devtest | 0.53523 | 26.1 | 1012 | 26519 |
+ | hin-spa | flores200-devtest | 0.46183 | 17.4 | 1012 | 29199 |
+ | hne-deu | flores200-devtest | 0.49946 | 19.0 | 1012 | 25094 |
+ | hne-eng | flores200-devtest | 0.63640 | 38.1 | 1012 | 24721 |
+ | hne-fra | flores200-devtest | 0.53419 | 25.7 | 1012 | 28343 |
+ | hne-por | flores200-devtest | 0.53735 | 25.9 | 1012 | 26519 |
+ | hne-spa | flores200-devtest | 0.45610 | 16.9 | 1012 | 29199 |
+ | mag-deu | flores200-devtest | 0.50681 | 20.0 | 1012 | 25094 |
+ | mag-eng | flores200-devtest | 0.63966 | 38.0 | 1012 | 24721 |
+ | mag-fra | flores200-devtest | 0.53810 | 25.9 | 1012 | 28343 |
+ | mag-por | flores200-devtest | 0.54065 | 26.6 | 1012 | 26519 |
+ | mag-spa | flores200-devtest | 0.46131 | 17.1 | 1012 | 29199 |
+ | mai-deu | flores200-devtest | 0.47686 | 16.8 | 1012 | 25094 |
+ | mai-eng | flores200-devtest | 0.57552 | 30.2 | 1012 | 24721 |
+ | mai-fra | flores200-devtest | 0.50909 | 22.4 | 1012 | 28343 |
+ | mai-por | flores200-devtest | 0.51249 | 22.9 | 1012 | 26519 |
+ | mai-spa | flores200-devtest | 0.44694 | 15.9 | 1012 | 29199 |
+ | mar-deu | flores200-devtest | 0.45295 | 14.8 | 1012 | 25094 |
+ | mar-eng | flores200-devtest | 0.58203 | 31.0 | 1012 | 24721 |
+ | mar-fra | flores200-devtest | 0.48254 | 20.4 | 1012 | 28343 |
+ | mar-por | flores200-devtest | 0.48368 | 20.4 | 1012 | 26519 |
+ | mar-spa | flores200-devtest | 0.42799 | 14.7 | 1012 | 29199 |
+ | npi-deu | flores200-devtest | 0.47267 | 17.2 | 1012 | 25094 |
+ | npi-eng | flores200-devtest | 0.59559 | 32.5 | 1012 | 24721 |
+ | npi-fra | flores200-devtest | 0.50869 | 22.5 | 1012 | 28343 |
+ | npi-por | flores200-devtest | 0.50900 | 22.5 | 1012 | 26519 |
+ | npi-spa | flores200-devtest | 0.44304 | 15.6 | 1012 | 29199 |
+ | pan-deu | flores200-devtest | 0.48342 | 18.6 | 1012 | 25094 |
+ | pan-eng | flores200-devtest | 0.60328 | 33.4 | 1012 | 24721 |
+ | pan-fra | flores200-devtest | 0.51953 | 24.4 | 1012 | 28343 |
+ | pan-por | flores200-devtest | 0.51428 | 23.9 | 1012 | 26519 |
+ | pan-spa | flores200-devtest | 0.44615 | 16.3 | 1012 | 29199 |
+ | pes-deu | flores200-devtest | 0.51124 | 21.0 | 1012 | 25094 |
+ | pes-eng | flores200-devtest | 0.60538 | 33.7 | 1012 | 24721 |
+ | pes-fra | flores200-devtest | 0.55157 | 27.8 | 1012 | 28343 |
+ | pes-por | flores200-devtest | 0.54372 | 26.6 | 1012 | 26519 |
+ | pes-spa | flores200-devtest | 0.47561 | 18.8 | 1012 | 29199 |
+ | prs-deu | flores200-devtest | 0.50273 | 20.7 | 1012 | 25094 |
+ | prs-eng | flores200-devtest | 0.60144 | 34.5 | 1012 | 24721 |
+ | prs-fra | flores200-devtest | 0.54241 | 27.0 | 1012 | 28343 |
+ | prs-por | flores200-devtest | 0.53562 | 26.6 | 1012 | 26519 |
+ | prs-spa | flores200-devtest | 0.46497 | 18.1 | 1012 | 29199 |
+ | sin-deu | flores200-devtest | 0.45041 | 14.7 | 1012 | 25094 |
+ | sin-eng | flores200-devtest | 0.54060 | 26.3 | 1012 | 24721 |
+ | sin-fra | flores200-devtest | 0.48163 | 19.9 | 1012 | 28343 |
+ | sin-por | flores200-devtest | 0.47780 | 19.6 | 1012 | 26519 |
+ | sin-spa | flores200-devtest | 0.42546 | 14.2 | 1012 | 29199 |
+ | tgk-deu | flores200-devtest | 0.45203 | 15.6 | 1012 | 25094 |
+ | tgk-eng | flores200-devtest | 0.53740 | 25.3 | 1012 | 24721 |
+ | tgk-fra | flores200-devtest | 0.50153 | 22.1 | 1012 | 28343 |
+ | tgk-por | flores200-devtest | 0.49378 | 21.9 | 1012 | 26519 |
+ | tgk-spa | flores200-devtest | 0.44099 | 15.9 | 1012 | 29199 |
+ | urd-deu | flores200-devtest | 0.46894 | 17.2 | 1012 | 25094 |
+ | urd-eng | flores200-devtest | 0.56967 | 29.3 | 1012 | 24721 |
+ | urd-fra | flores200-devtest | 0.50616 | 22.6 | 1012 | 28343 |
+ | urd-por | flores200-devtest | 0.49398 | 21.7 | 1012 | 26519 |
+ | urd-spa | flores200-devtest | 0.43800 | 15.4 | 1012 | 29199 |
+ | hin-eng | newstest2014 | 0.59024 | 30.3 | 2507 | 55571 |
+ | guj-eng | newstest2019 | 0.53977 | 27.2 | 1016 | 17757 |
+ | ben-deu | ntrex128 | 0.45551 | 15.0 | 1997 | 48761 |
+ | ben-eng | ntrex128 | 0.56878 | 29.0 | 1997 | 47673 |
+ | ben-fra | ntrex128 | 0.47077 | 18.6 | 1997 | 53481 |
+ | ben-por | ntrex128 | 0.46049 | 17.1 | 1997 | 51631 |
+ | ben-spa | ntrex128 | 0.48833 | 21.3 | 1997 | 54107 |
+ | fas-deu | ntrex128 | 0.46991 | 16.1 | 1997 | 48761 |
+ | fas-eng | ntrex128 | 0.55119 | 25.9 | 1997 | 47673 |
+ | fas-fra | ntrex128 | 0.49626 | 21.2 | 1997 | 53481 |
+ | fas-por | ntrex128 | 0.47499 | 18.6 | 1997 | 51631 |
+ | fas-spa | ntrex128 | 0.50178 | 22.8 | 1997 | 54107 |
+ | guj-deu | ntrex128 | 0.43998 | 14.3 | 1997 | 48761 |
+ | guj-eng | ntrex128 | 0.58481 | 31.0 | 1997 | 47673 |
+ | guj-fra | ntrex128 | 0.45468 | 17.3 | 1997 | 53481 |
+ | guj-por | ntrex128 | 0.44223 | 15.8 | 1997 | 51631 |
+ | guj-spa | ntrex128 | 0.47798 | 20.7 | 1997 | 54107 |
+ | hin-deu | ntrex128 | 0.46580 | 15.0 | 1997 | 48761 |
+ | hin-eng | ntrex128 | 0.59832 | 31.6 | 1997 | 47673 |
+ | hin-fra | ntrex128 | 0.48328 | 19.5 | 1997 | 53481 |
+ | hin-por | ntrex128 | 0.46833 | 17.8 | 1997 | 51631 |
+ | hin-spa | ntrex128 | 0.49517 | 21.9 | 1997 | 54107 |
+ | mar-deu | ntrex128 | 0.43713 | 13.5 | 1997 | 48761 |
+ | mar-eng | ntrex128 | 0.55132 | 27.4 | 1997 | 47673 |
+ | mar-fra | ntrex128 | 0.44797 | 16.9 | 1997 | 53481 |
+ | mar-por | ntrex128 | 0.44342 | 16.1 | 1997 | 51631 |
+ | mar-spa | ntrex128 | 0.46950 | 19.7 | 1997 | 54107 |
+ | nep-deu | ntrex128 | 0.43568 | 13.5 | 1997 | 48761 |
+ | nep-eng | ntrex128 | 0.55954 | 28.8 | 1997 | 47673 |
+ | nep-fra | ntrex128 | 0.45083 | 16.9 | 1997 | 53481 |
+ | nep-por | ntrex128 | 0.44458 | 16.0 | 1997 | 51631 |
+ | nep-spa | ntrex128 | 0.46832 | 19.4 | 1997 | 54107 |
+ | pan-deu | ntrex128 | 0.44327 | 14.6 | 1997 | 48761 |
+ | pan-eng | ntrex128 | 0.57665 | 30.5 | 1997 | 47673 |
+ | pan-fra | ntrex128 | 0.45815 | 17.7 | 1997 | 53481 |
+ | pan-por | ntrex128 | 0.44608 | 16.3 | 1997 | 51631 |
+ | pan-spa | ntrex128 | 0.47289 | 20.0 | 1997 | 54107 |
+ | prs-deu | ntrex128 | 0.45067 | 14.6 | 1997 | 48761 |
+ | prs-eng | ntrex128 | 0.54767 | 26.6 | 1997 | 47673 |
+ | prs-fra | ntrex128 | 0.47453 | 19.3 | 1997 | 53481 |
+ | prs-por | ntrex128 | 0.45843 | 17.1 | 1997 | 51631 |
+ | prs-spa | ntrex128 | 0.48317 | 20.9 | 1997 | 54107 |
+ | pus-eng | ntrex128 | 0.44698 | 17.6 | 1997 | 47673 |
+ | pus-spa | ntrex128 | 0.41132 | 14.6 | 1997 | 54107 |
+ | sin-deu | ntrex128 | 0.42541 | 12.5 | 1997 | 48761 |
+ | sin-eng | ntrex128 | 0.51853 | 23.5 | 1997 | 47673 |
+ | sin-fra | ntrex128 | 0.44099 | 15.9 | 1997 | 53481 |
+ | sin-por | ntrex128 | 0.43010 | 14.4 | 1997 | 51631 |
+ | sin-spa | ntrex128 | 0.46225 | 18.4 | 1997 | 54107 |
+ | tgk_Cyrl-deu | ntrex128 | 0.40368 | 11.4 | 1997 | 48761 |
+ | tgk_Cyrl-eng | ntrex128 | 0.47132 | 18.2 | 1997 | 47673 |
+ | tgk_Cyrl-fra | ntrex128 | 0.43311 | 15.8 | 1997 | 53481 |
+ | tgk_Cyrl-por | ntrex128 | 0.42095 | 13.8 | 1997 | 51631 |
+ | tgk_Cyrl-spa | ntrex128 | 0.44279 | 17.3 | 1997 | 54107 |
+ | urd-deu | ntrex128 | 0.45708 | 15.5 | 1997 | 48761 |
+ | urd-eng | ntrex128 | 0.56560 | 28.5 | 1997 | 47673 |
+ | urd-fra | ntrex128 | 0.47536 | 19.0 | 1997 | 53481 |
3692
+ | urd-por | ntrex128 | 0.45911 | 16.7 | 1997 | 51631 |
3693
+ | urd-spa | ntrex128 | 0.48986 | 21.6 | 1997 | 54107 |
3694
+ | ben-eng | tico19-test | 0.64578 | 38.7 | 2100 | 56824 |
3695
+ | ben-fra | tico19-test | 0.50165 | 22.8 | 2100 | 64661 |
3696
+ | ben-por | tico19-test | 0.55662 | 27.7 | 2100 | 62729 |
3697
+ | ben-spa | tico19-test | 0.56795 | 29.6 | 2100 | 66563 |
3698
+ | ckb-eng | tico19-test | 0.51623 | 27.4 | 2100 | 56315 |
3699
+ | ckb-fra | tico19-test | 0.42405 | 17.1 | 2100 | 64661 |
3700
+ | ckb-por | tico19-test | 0.45405 | 19.0 | 2100 | 62729 |
3701
+ | ckb-spa | tico19-test | 0.46976 | 21.7 | 2100 | 66563 |
3702
+ | fas-eng | tico19-test | 0.62079 | 34.2 | 2100 | 56315 |
3703
+ | fas-fra | tico19-test | 0.52041 | 24.4 | 2100 | 64661 |
3704
+ | fas-por | tico19-test | 0.56780 | 29.2 | 2100 | 62729 |
3705
+ | fas-spa | tico19-test | 0.58248 | 32.3 | 2100 | 66563 |
3706
+ | hin-eng | tico19-test | 0.70535 | 46.8 | 2100 | 56323 |
3707
+ | hin-fra | tico19-test | 0.53833 | 26.6 | 2100 | 64661 |
3708
+ | hin-por | tico19-test | 0.60246 | 33.2 | 2100 | 62729 |
3709
+ | hin-spa | tico19-test | 0.61504 | 35.7 | 2100 | 66563 |
3710
+ | mar-eng | tico19-test | 0.59247 | 31.4 | 2100 | 56315 |
3711
+ | mar-fra | tico19-test | 0.46895 | 19.3 | 2100 | 64661 |
3712
+ | mar-por | tico19-test | 0.51945 | 23.8 | 2100 | 62729 |
3713
+ | mar-spa | tico19-test | 0.52914 | 26.2 | 2100 | 66563 |
3714
+ | nep-eng | tico19-test | 0.65865 | 40.1 | 2100 | 56824 |
3715
+ | nep-fra | tico19-test | 0.50473 | 23.2 | 2100 | 64661 |
3716
+ | nep-por | tico19-test | 0.56185 | 28.0 | 2100 | 62729 |
3717
+ | nep-spa | tico19-test | 0.57270 | 30.2 | 2100 | 66563 |
3718
+ | prs-eng | tico19-test | 0.59536 | 32.1 | 2100 | 56824 |
3719
+ | prs-fra | tico19-test | 0.50044 | 23.1 | 2100 | 64661 |
3720
+ | prs-por | tico19-test | 0.54448 | 27.3 | 2100 | 62729 |
3721
+ | prs-spa | tico19-test | 0.56311 | 30.2 | 2100 | 66563 |
3722
+ | pus-eng | tico19-test | 0.56711 | 31.4 | 2100 | 56315 |
3723
+ | pus-fra | tico19-test | 0.45951 | 19.4 | 2100 | 64661 |
3724
+ | pus-por | tico19-test | 0.50225 | 23.7 | 2100 | 62729 |
3725
+ | pus-spa | tico19-test | 0.51246 | 25.4 | 2100 | 66563 |
3726
+ | urd-eng | tico19-test | 0.57786 | 30.8 | 2100 | 56315 |
3727
+ | urd-fra | tico19-test | 0.46807 | 20.1 | 2100 | 64661 |
3728
+ | urd-por | tico19-test | 0.51567 | 24.1 | 2100 | 62729 |
3729
+ | urd-spa | tico19-test | 0.52820 | 26.4 | 2100 | 66563 |
3730
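+
+ Each row of the table above lists a language pair, a test set, and the chrF score, BLEU score, sentence count, and reference word count. As a quick illustration (not part of the released files), the markdown rows can be parsed to answer questions such as which pair scores best on each benchmark; the sample rows below are copied from the table:

```python
# Parse markdown benchmark rows of the form
# | urd-eng | flores200-devtest | 0.56967 | 29.3 | 1012 | 24721 |
# and report the highest-BLEU language pair per test set.

rows = """
| urd-eng | flores200-devtest | 0.56967 | 29.3 | 1012 | 24721 |
| urd-spa | flores200-devtest | 0.43800 | 15.4 | 1012 | 29199 |
| hin-eng | newstest2014 | 0.59024 | 30.3 | 2507 | 55571 |
""".strip().splitlines()

best = {}  # testset -> (bleu, pair)
for line in rows:
    # split the pipe-delimited cells and drop surrounding whitespace
    cells = [c.strip() for c in line.strip().strip("|").split("|")]
    pair, testset, chrf, bleu = cells[0], cells[1], float(cells[2]), float(cells[3])
    if testset not in best or bleu > best[testset][0]:
        best[testset] = (bleu, pair)

for testset, (bleu, pair) in sorted(best.items()):
    print(f"{testset}: {pair} ({bleu} BLEU)")
# → flores200-devtest: urd-eng (29.3 BLEU)
# → newstest2014: hin-eng (30.3 BLEU)
```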
+
+ ## Citation Information
+
+ * Publications: [Democratizing neural machine translation with OPUS-MT](https://doi.org/10.1007/s10579-023-09704-w) and [OPUS-MT – Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/) and [The Tatoeba Translation Challenge – Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (Please cite if you use this model.)
+
+ ```bibtex
+ @article{tiedemann2023democratizing,
+ title={Democratizing neural machine translation with {OPUS-MT}},
+ author={Tiedemann, J{\"o}rg and Aulamo, Mikko and Bakshandaeva, Daria and Boggia, Michele and Gr{\"o}nroos, Stig-Arne and Nieminen, Tommi and Raganato, Alessandro and Scherrer, Yves and Vazquez, Raul and Virpioja, Sami},
+ journal={Language Resources and Evaluation},
+ volume={58},
+ pages={713--755},
+ year={2023},
+ publisher={Springer Nature},
+ issn={1574-0218},
+ doi={10.1007/s10579-023-09704-w}
+ }
+
+ @inproceedings{tiedemann-thottingal-2020-opus,
+ title = "{OPUS}-{MT} {--} Building open translation services for the World",
+ author = {Tiedemann, J{\"o}rg and Thottingal, Santhosh},
+ booktitle = "Proceedings of the 22nd Annual Conference of the European Association for Machine Translation",
+ month = nov,
+ year = "2020",
+ address = "Lisboa, Portugal",
+ publisher = "European Association for Machine Translation",
+ url = "https://aclanthology.org/2020.eamt-1.61",
+ pages = "479--480",
+ }
+
+ @inproceedings{tiedemann-2020-tatoeba,
+ title = "The Tatoeba Translation Challenge {--} Realistic Data Sets for Low Resource and Multilingual {MT}",
+ author = {Tiedemann, J{\"o}rg},
+ booktitle = "Proceedings of the Fifth Conference on Machine Translation",
+ month = nov,
+ year = "2020",
+ address = "Online",
+ publisher = "Association for Computational Linguistics",
+ url = "https://aclanthology.org/2020.wmt-1.139",
+ pages = "1174--1182",
+ }
+ ```
+
+ ## Acknowledgements
+
+ The work is supported by the [HPLT project](https://hplt-project.org/), funded by the European Union’s Horizon Europe research and innovation programme under grant agreement No 101070350. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland, and the [EuroHPC supercomputer LUMI](https://www.lumi-supercomputer.eu/).
+
+ ## Model conversion info
+
+ * transformers version: 4.45.1
+ * OPUS-MT git hash: 0882077
+ * port time: Tue Oct 8 11:32:44 EEST 2024
+ * port machine: LM0-400-22516.local
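+
+ As a minimal usage sketch (an assumption on top of this card, not part of the release): the config files in this commit name `MarianMTModel` and `MarianTokenizer`, and the tokenizer config declares the multilingual target `deu+eng+fra+por+spa`, so multi-target OPUS-MT convention requires a sentence-initial target-language token such as `>>eng<<`. The repository id below is hypothetical; substitute the actual Hub id where the checkpoint is published.

```python
# Hypothetical Hub repository id -- replace with the real one for this checkpoint.
MODEL_ID = "Helsinki-NLP/opus-mt-tc-bible-big-iir-deu_eng_fra_por_spa"


def with_target_token(text: str, lang: str) -> str:
    """Prepend the target-language token (>>deu<<, >>eng<<, >>fra<<,
    >>por<<, >>spa<<) that multi-target OPUS-MT models use to select
    the output language."""
    return f">>{lang}<< {text}"


def translate(texts, lang="eng"):
    """Sketch of end-to-end translation; requires transformers + torch
    and a network connection to download the checkpoint."""
    from transformers import MarianMTModel, MarianTokenizer  # lazy import

    tokenizer = MarianTokenizer.from_pretrained(MODEL_ID)
    model = MarianMTModel.from_pretrained(MODEL_ID)
    batch = tokenizer([with_target_token(t, lang) for t in texts],
                      return_tensors="pt", padding=True)
    # beam size, max length etc. come from generation_config.json
    out = model.generate(**batch)
    return [tokenizer.decode(ids, skip_special_tokens=True) for ids in out]
```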
benchmark_results.txt ADDED
@@ -0,0 +1,267 @@
+ multi-multi	tatoeba-test-v2020-07-28-v2023-09-26	0.60665	42.8	10000	69192
+ asm-fra	flores101-devtest	0.35655	11.6	1012	28343
+ asm-por	flores101-devtest	0.38149	12.5	1012	26519
+ asm-spa	flores101-devtest	0.34288	9.0	1012	29199
+ ben-deu	flores101-devtest	0.46873	16.4	1012	25094
+ ben-eng	flores101-devtest	0.57508	30.0	1012	24721
+ ben-spa	flores101-devtest	0.44010	15.1	1012	29199
+ ckb-deu	flores101-devtest	0.41546	13.0	1012	25094
+ ckb-por	flores101-devtest	0.44178	17.6	1012	26519
+ ckb-spa	flores101-devtest	0.39703	12.7	1012	29199
+ fas-por	flores101-devtest	0.54077	26.1	1012	26519
+ guj-deu	flores101-devtest	0.45906	16.5	1012	25094
+ guj-spa	flores101-devtest	0.43928	15.2	1012	29199
+ hin-eng	flores101-devtest	0.62807	36.6	1012	24721
+ hin-por	flores101-devtest	0.52825	25.1	1012	26519
+ mar-deu	flores101-devtest	0.44767	14.8	1012	25094
+ npi-deu	flores101-devtest	0.46178	15.9	1012	25094
+ pan-fra	flores101-devtest	0.50909	23.4	1012	28343
+ pan-por	flores101-devtest	0.50634	23.0	1012	26519
+ pus-deu	flores101-devtest	0.42645	13.5	1012	25094
+ pus-fra	flores101-devtest	0.45719	18.0	1012	28343
+ urd-deu	flores101-devtest	0.46102	16.5	1012	25094
+ urd-eng	flores101-devtest	0.56356	28.4	1012	24721
+ asm-deu	flores200-devtest	0.32486	5.8	1012	25094
+ asm-eng	flores200-devtest	0.48589	21.5	1012	24721
+ asm-fra	flores200-devtest	0.35394	11.7	1012	28343
+ asm-por	flores200-devtest	0.37939	12.3	1012	26519
+ asm-spa	flores200-devtest	0.33980	8.9	1012	29199
+ awa-deu	flores200-devtest	0.47071	16.0	1012	25094
+ awa-eng	flores200-devtest	0.53069	26.6	1012	24721
+ awa-fra	flores200-devtest	0.49700	21.1	1012	28343
+ awa-por	flores200-devtest	0.49950	21.8	1012	26519
+ awa-spa	flores200-devtest	0.43831	15.4	1012	29199
+ ben-deu	flores200-devtest	0.47434	17.0	1012	25094
+ ben-eng	flores200-devtest	0.58408	31.4	1012	24721
+ ben-fra	flores200-devtest	0.50930	23.2	1012	28343
+ ben-por	flores200-devtest	0.50661	22.4	1012	26519
+ ben-spa	flores200-devtest	0.44485	15.7	1012	29199
+ bho-deu	flores200-devtest	0.42463	12.8	1012	25094
+ bho-eng	flores200-devtest	0.50545	22.6	1012	24721
+ bho-fra	flores200-devtest	0.45264	17.4	1012	28343
+ bho-por	flores200-devtest	0.44737	17.0	1012	26519
+ bho-spa	flores200-devtest	0.40585	13.0	1012	29199
+ ckb-deu	flores200-devtest	0.42110	13.6	1012	25094
+ ckb-eng	flores200-devtest	0.50543	24.7	1012	24721
+ ckb-fra	flores200-devtest	0.45847	19.1	1012	28343
+ ckb-por	flores200-devtest	0.44567	17.8	1012	26519
+ ckb-spa	flores200-devtest	0.39955	13.0	1012	29199
+ guj-deu	flores200-devtest	0.46758	17.3	1012	25094
+ guj-eng	flores200-devtest	0.61139	34.4	1012	24721
+ guj-fra	flores200-devtest	0.50349	22.5	1012	28343
+ guj-por	flores200-devtest	0.49828	22.4	1012	26519
+ guj-spa	flores200-devtest	0.44472	15.5	1012	29199
+ hin-deu	flores200-devtest	0.50772	20.8	1012	25094
+ hin-eng	flores200-devtest	0.63234	37.3	1012	24721
+ hin-fra	flores200-devtest	0.53933	26.5	1012	28343
+ hin-por	flores200-devtest	0.53523	26.1	1012	26519
+ hin-spa	flores200-devtest	0.46183	17.4	1012	29199
+ hne-deu	flores200-devtest	0.49946	19.0	1012	25094
+ hne-eng	flores200-devtest	0.63640	38.1	1012	24721
+ hne-fra	flores200-devtest	0.53419	25.7	1012	28343
+ hne-por	flores200-devtest	0.53735	25.9	1012	26519
+ hne-spa	flores200-devtest	0.45610	16.9	1012	29199
+ kas_Arab-deu	flores200-devtest	0.30071	2.9	1012	25094
+ kas_Arab-eng	flores200-devtest	0.34015	4.5	1012	24721
+ kas_Arab-fra	flores200-devtest	0.32331	5.9	1012	28343
+ kas_Arab-por	flores200-devtest	0.31531	4.9	1012	26519
+ kas_Arab-spa	flores200-devtest	0.31002	5.2	1012	29199
+ kas_Deva-deu	flores200-devtest	0.21698	1.9	1012	25094
+ kas_Deva-eng	flores200-devtest	0.26948	4.6	1012	24721
+ kas_Deva-fra	flores200-devtest	0.21885	3.0	1012	28343
+ kas_Deva-por	flores200-devtest	0.21989	2.6	1012	26519
+ kas_Deva-spa	flores200-devtest	0.21652	2.7	1012	29199
+ kmr-deu	flores200-devtest	0.32781	7.7	1012	25094
+ kmr-eng	flores200-devtest	0.37717	12.8	1012	24721
+ kmr-fra	flores200-devtest	0.33854	9.6	1012	28343
+ kmr-por	flores200-devtest	0.33724	9.9	1012	26519
+ kmr-spa	flores200-devtest	0.31940	8.2	1012	29199
+ mag-deu	flores200-devtest	0.50681	20.0	1012	25094
+ mag-eng	flores200-devtest	0.63966	38.0	1012	24721
+ mag-fra	flores200-devtest	0.53810	25.9	1012	28343
+ mag-por	flores200-devtest	0.54065	26.6	1012	26519
+ mag-spa	flores200-devtest	0.46131	17.1	1012	29199
+ mai-deu	flores200-devtest	0.47686	16.8	1012	25094
+ mai-eng	flores200-devtest	0.57552	30.2	1012	24721
+ mai-fra	flores200-devtest	0.50909	22.4	1012	28343
+ mai-por	flores200-devtest	0.51249	22.9	1012	26519
+ mai-spa	flores200-devtest	0.44694	15.9	1012	29199
+ mar-deu	flores200-devtest	0.45295	14.8	1012	25094
+ mar-eng	flores200-devtest	0.58203	31.0	1012	24721
+ mar-fra	flores200-devtest	0.48254	20.4	1012	28343
+ mar-por	flores200-devtest	0.48368	20.4	1012	26519
+ mar-spa	flores200-devtest	0.42799	14.7	1012	29199
+ npi-deu	flores200-devtest	0.47267	17.2	1012	25094
+ npi-eng	flores200-devtest	0.59559	32.5	1012	24721
+ npi-fra	flores200-devtest	0.50869	22.5	1012	28343
+ npi-por	flores200-devtest	0.50900	22.5	1012	26519
+ npi-spa	flores200-devtest	0.44304	15.6	1012	29199
+ pan-deu	flores200-devtest	0.48342	18.6	1012	25094
+ pan-eng	flores200-devtest	0.60328	33.4	1012	24721
+ pan-fra	flores200-devtest	0.51953	24.4	1012	28343
+ pan-por	flores200-devtest	0.51428	23.9	1012	26519
+ pan-spa	flores200-devtest	0.44615	16.3	1012	29199
+ pes-deu	flores200-devtest	0.51124	21.0	1012	25094
+ pes-eng	flores200-devtest	0.60538	33.7	1012	24721
+ pes-fra	flores200-devtest	0.55157	27.8	1012	28343
+ pes-por	flores200-devtest	0.54372	26.6	1012	26519
+ pes-spa	flores200-devtest	0.47561	18.8	1012	29199
+ prs-deu	flores200-devtest	0.50273	20.7	1012	25094
+ prs-eng	flores200-devtest	0.60144	34.5	1012	24721
+ prs-fra	flores200-devtest	0.54241	27.0	1012	28343
+ prs-por	flores200-devtest	0.53562	26.6	1012	26519
+ prs-spa	flores200-devtest	0.46497	18.1	1012	29199
+ san-deu	flores200-devtest	0.31265	5.0	1012	25094
+ san-eng	flores200-devtest	0.36840	11.5	1012	24721
+ san-fra	flores200-devtest	0.33014	8.4	1012	28343
+ san-por	flores200-devtest	0.34199	8.6	1012	26519
+ san-spa	flores200-devtest	0.32207	7.5	1012	29199
+ sin-deu	flores200-devtest	0.45041	14.7	1012	25094
+ sin-eng	flores200-devtest	0.54060	26.3	1012	24721
+ sin-fra	flores200-devtest	0.48163	19.9	1012	28343
+ sin-por	flores200-devtest	0.47780	19.6	1012	26519
+ sin-spa	flores200-devtest	0.42546	14.2	1012	29199
+ tgk-deu	flores200-devtest	0.45203	15.6	1012	25094
+ tgk-eng	flores200-devtest	0.53740	25.3	1012	24721
+ tgk-fra	flores200-devtest	0.50153	22.1	1012	28343
+ tgk-por	flores200-devtest	0.49378	21.9	1012	26519
+ tgk-spa	flores200-devtest	0.44099	15.9	1012	29199
+ urd-deu	flores200-devtest	0.46894	17.2	1012	25094
+ urd-eng	flores200-devtest	0.56967	29.3	1012	24721
+ urd-fra	flores200-devtest	0.50616	22.6	1012	28343
+ urd-por	flores200-devtest	0.49398	21.7	1012	26519
+ urd-spa	flores200-devtest	0.43800	15.4	1012	29199
+ hin-eng	newstest2014	0.59024	30.3	2507	55571
+ guj-eng	newstest2019	0.53977	27.2	1016	17757
+ pus-eng	newstest2020	0.39447	14.3	2719	53382
+ ben-deu	ntrex128	0.45551	15.0	1997	48761
+ ben-eng	ntrex128	0.56878	29.0	1997	47673
+ ben-fra	ntrex128	0.47077	18.6	1997	53481
+ ben-por	ntrex128	0.46049	17.1	1997	51631
+ ben-spa	ntrex128	0.48833	21.3	1997	54107
+ div-deu	ntrex128	0.19845	1.0	1997	48761
+ div-eng	ntrex128	0.19666	1.4	1997	47673
+ div-fra	ntrex128	0.20013	1.7	1997	53481
+ div-por	ntrex128	0.18779	1.2	1997	51631
+ div-spa	ntrex128	0.20413	1.8	1997	54107
+ fas-deu	ntrex128	0.46991	16.1	1997	48761
+ fas-eng	ntrex128	0.55119	25.9	1997	47673
+ fas-fra	ntrex128	0.49626	21.2	1997	53481
+ fas-por	ntrex128	0.47499	18.6	1997	51631
+ fas-spa	ntrex128	0.50178	22.8	1997	54107
+ guj-deu	ntrex128	0.43998	14.3	1997	48761
+ guj-eng	ntrex128	0.58481	31.0	1997	47673
+ guj-fra	ntrex128	0.45468	17.3	1997	53481
+ guj-por	ntrex128	0.44223	15.8	1997	51631
+ guj-spa	ntrex128	0.47798	20.7	1997	54107
+ hin-deu	ntrex128	0.46580	15.0	1997	48761
+ hin-eng	ntrex128	0.59832	31.6	1997	47673
+ hin-fra	ntrex128	0.48328	19.5	1997	53481
+ hin-por	ntrex128	0.46833	17.8	1997	51631
+ hin-spa	ntrex128	0.49517	21.9	1997	54107
+ kmr-deu	ntrex128	0.32197	7.3	1997	48761
+ kmr-eng	ntrex128	0.37956	12.9	1997	47673
+ kmr-fra	ntrex128	0.33652	9.7	1997	53481
+ kmr-por	ntrex128	0.32539	8.6	1997	51631
+ kmr-spa	ntrex128	0.35021	10.8	1997	54107
+ mar-deu	ntrex128	0.43713	13.5	1997	48761
+ mar-eng	ntrex128	0.55132	27.4	1997	47673
+ mar-fra	ntrex128	0.44797	16.9	1997	53481
+ mar-por	ntrex128	0.44342	16.1	1997	51631
+ mar-spa	ntrex128	0.46950	19.7	1997	54107
+ nep-deu	ntrex128	0.43568	13.5	1997	48761
+ nep-eng	ntrex128	0.55954	28.8	1997	47673
+ nep-fra	ntrex128	0.45083	16.9	1997	53481
+ nep-por	ntrex128	0.44458	16.0	1997	51631
+ nep-spa	ntrex128	0.46832	19.4	1997	54107
+ pan-deu	ntrex128	0.44327	14.6	1997	48761
+ pan-eng	ntrex128	0.57665	30.5	1997	47673
+ pan-fra	ntrex128	0.45815	17.7	1997	53481
+ pan-por	ntrex128	0.44608	16.3	1997	51631
+ pan-spa	ntrex128	0.47289	20.0	1997	54107
+ prs-deu	ntrex128	0.45067	14.6	1997	48761
+ prs-eng	ntrex128	0.54767	26.6	1997	47673
+ prs-fra	ntrex128	0.47453	19.3	1997	53481
+ prs-por	ntrex128	0.45843	17.1	1997	51631
+ prs-spa	ntrex128	0.48317	20.9	1997	54107
+ pus-deu	ntrex128	0.38989	10.3	1997	48761
+ pus-eng	ntrex128	0.44698	17.6	1997	47673
+ pus-fra	ntrex128	0.39872	12.6	1997	53481
+ pus-por	ntrex128	0.38923	11.9	1997	51631
+ pus-spa	ntrex128	0.41132	14.6	1997	54107
+ sin-deu	ntrex128	0.42541	12.5	1997	48761
+ sin-eng	ntrex128	0.51853	23.5	1997	47673
+ sin-fra	ntrex128	0.44099	15.9	1997	53481
+ sin-por	ntrex128	0.43010	14.4	1997	51631
+ sin-spa	ntrex128	0.46225	18.4	1997	54107
+ snd_Arab-deu	ntrex128	0.12141	0.2	1997	48761
+ snd_Arab-eng	ntrex128	0.11393	0.4	1997	47673
+ snd_Arab-fra	ntrex128	0.14917	0.6	1997	53481
+ snd_Arab-por	ntrex128	0.11125	0.4	1997	51631
+ snd_Arab-spa	ntrex128	0.10892	0.4	1997	54107
+ tgk_Cyrl-deu	ntrex128	0.40368	11.4	1997	48761
+ tgk_Cyrl-eng	ntrex128	0.47132	18.2	1997	47673
+ tgk_Cyrl-fra	ntrex128	0.43311	15.8	1997	53481
+ tgk_Cyrl-por	ntrex128	0.42095	13.8	1997	51631
+ tgk_Cyrl-spa	ntrex128	0.44279	17.3	1997	54107
+ urd-deu	ntrex128	0.45708	15.5	1997	48761
+ urd-eng	ntrex128	0.56560	28.5	1997	47673
+ urd-fra	ntrex128	0.47536	19.0	1997	53481
+ urd-por	ntrex128	0.45911	16.7	1997	51631
+ urd-spa	ntrex128	0.48986	21.6	1997	54107
+ mar-eng	tatoeba-test-v2020-07-28	0.64723	48.7	10000	64831
+ fas-fra	tatoeba-test-v2021-03-30	0.57331	35.7	383	3442
+ kur_Latn-deu	tatoeba-test-v2021-03-30	0.39239	24.5	238	1408
+ mar-eng	tatoeba-test-v2021-03-30	0.64611	48.6	10250	66546
+ pes-eng	tatoeba-test-v2021-03-30	0.59633	41.5	3763	31439
+ rom-eng	tatoeba-test-v2021-03-30	0.28331	10.3	672	4459
+ zza-eng	tatoeba-test-v2021-03-30	0.12982	1.4	533	3182
+ awa-eng	tatoeba-test-v2021-08-07	0.60240	40.8	279	1335
+ ben-eng	tatoeba-test-v2021-08-07	0.64471	49.3	2500	13978
+ fas-deu	tatoeba-test-v2021-08-07	0.58631	34.7	3185	25590
+ fas-eng	tatoeba-test-v2021-08-07	0.59868	41.8	3762	31480
+ fas-fra	tatoeba-test-v2021-08-07	0.57181	35.8	376	3377
+ hin-eng	tatoeba-test-v2021-08-07	0.65417	49.5	5000	33943
+ kur_Latn-deu	tatoeba-test-v2021-08-07	0.42694	27.0	223	1323
+ kur_Latn-eng	tatoeba-test-v2021-08-07	0.42721	25.6	290	1708
+ mar-eng	tatoeba-test-v2021-08-07	0.64493	48.3	10396	67527
+ pes-eng	tatoeba-test-v2021-08-07	0.59959	41.9	3757	31411
+ rom-eng	tatoeba-test-v2021-08-07	0.28016	10.2	706	4690
+ urd-eng	tatoeba-test-v2021-08-07	0.53679	35.4	1663	12029
+ zza-eng	tatoeba-test-v2021-08-07	0.12371	0.8	529	3162
+ ben-eng	tico19-test	0.64578	38.7	2100	56824
+ ben-fra	tico19-test	0.50165	22.8	2100	64661
+ ben-por	tico19-test	0.55662	27.7	2100	62729
+ ben-spa	tico19-test	0.56795	29.6	2100	66563
+ ckb-eng	tico19-test	0.51623	27.4	2100	56315
+ ckb-fra	tico19-test	0.42405	17.1	2100	64661
+ ckb-por	tico19-test	0.45405	19.0	2100	62729
+ ckb-spa	tico19-test	0.46976	21.7	2100	66563
+ fas-eng	tico19-test	0.62079	34.2	2100	56315
+ fas-fra	tico19-test	0.52041	24.4	2100	64661
+ fas-por	tico19-test	0.56780	29.2	2100	62729
+ fas-spa	tico19-test	0.58248	32.3	2100	66563
+ hin-eng	tico19-test	0.70535	46.8	2100	56323
+ hin-fra	tico19-test	0.53833	26.6	2100	64661
+ hin-por	tico19-test	0.60246	33.2	2100	62729
+ hin-spa	tico19-test	0.61504	35.7	2100	66563
+ mar-eng	tico19-test	0.59247	31.4	2100	56315
+ mar-fra	tico19-test	0.46895	19.3	2100	64661
+ mar-por	tico19-test	0.51945	23.8	2100	62729
+ mar-spa	tico19-test	0.52914	26.2	2100	66563
+ nep-eng	tico19-test	0.65865	40.1	2100	56824
+ nep-fra	tico19-test	0.50473	23.2	2100	64661
+ nep-por	tico19-test	0.56185	28.0	2100	62729
+ nep-spa	tico19-test	0.57270	30.2	2100	66563
+ prs-eng	tico19-test	0.59536	32.1	2100	56824
+ prs-fra	tico19-test	0.50044	23.1	2100	64661
+ prs-por	tico19-test	0.54448	27.3	2100	62729
+ prs-spa	tico19-test	0.56311	30.2	2100	66563
+ pus-eng	tico19-test	0.56711	31.4	2100	56315
+ pus-fra	tico19-test	0.45951	19.4	2100	64661
+ pus-por	tico19-test	0.50225	23.7	2100	62729
+ pus-spa	tico19-test	0.51246	25.4	2100	66563
+ urd-eng	tico19-test	0.57786	30.8	2100	56315
+ urd-fra	tico19-test	0.46807	20.1	2100	64661
+ urd-por	tico19-test	0.51567	24.1	2100	62729
+ urd-spa	tico19-test	0.52820	26.4	2100	66563
benchmark_translations.zip ADDED
File without changes
config.json ADDED
@@ -0,0 +1,41 @@
+ {
+ "_name_or_path": "pytorch-models/opus-mt-tc-bible-big-iir-deu_eng_fra_por_spa",
+ "activation_dropout": 0.0,
+ "activation_function": "relu",
+ "architectures": [
+ "MarianMTModel"
+ ],
+ "attention_dropout": 0.0,
+ "bos_token_id": 0,
+ "classifier_dropout": 0.0,
+ "d_model": 1024,
+ "decoder_attention_heads": 16,
+ "decoder_ffn_dim": 4096,
+ "decoder_layerdrop": 0.0,
+ "decoder_layers": 6,
+ "decoder_start_token_id": 61806,
+ "decoder_vocab_size": 61807,
+ "dropout": 0.1,
+ "encoder_attention_heads": 16,
+ "encoder_ffn_dim": 4096,
+ "encoder_layerdrop": 0.0,
+ "encoder_layers": 6,
+ "eos_token_id": 469,
+ "forced_eos_token_id": null,
+ "init_std": 0.02,
+ "is_encoder_decoder": true,
+ "max_length": null,
+ "max_position_embeddings": 1024,
+ "model_type": "marian",
+ "normalize_embedding": false,
+ "num_beams": null,
+ "num_hidden_layers": 6,
+ "pad_token_id": 61806,
+ "scale_embedding": true,
+ "share_encoder_decoder_embeddings": true,
+ "static_position_embeddings": true,
+ "torch_dtype": "float32",
+ "transformers_version": "4.45.1",
+ "use_cache": true,
+ "vocab_size": 61807
+ }
generation_config.json ADDED
@@ -0,0 +1,16 @@
+ {
+ "_from_model_config": true,
+ "bad_words_ids": [
+ [
+ 61806
+ ]
+ ],
+ "bos_token_id": 0,
+ "decoder_start_token_id": 61806,
+ "eos_token_id": 469,
+ "forced_eos_token_id": 469,
+ "max_length": 512,
+ "num_beams": 4,
+ "pad_token_id": 61806,
+ "transformers_version": "4.45.1"
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:91346013942899a2744542e75085a35635d2a7c58d8e4c3c6ec3b4fa4e06a06d
+ size 958867820
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1830b9ed3c27d11ba5f9cf4e83f96d7e8561678b94d39f7f5797e4d5f1741b29
+ size 958919045
source.spm ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b040afaa230b22df81c657722ab3f7e42401ceb612ba18b93353a8ad64dff18d
+ size 926222
special_tokens_map.json ADDED
@@ -0,0 +1 @@
+ {"eos_token": "</s>", "unk_token": "<unk>", "pad_token": "<pad>"}
target.spm ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4d45eaf1ffc10f843a3bd84a346e1e86eb69f780daaaf435efae27626bcfb755
+ size 802795
tokenizer_config.json ADDED
@@ -0,0 +1 @@
+ {"source_lang": "iir", "target_lang": "deu+eng+fra+por+spa", "unk_token": "<unk>", "eos_token": "</s>", "pad_token": "<pad>", "model_max_length": 512, "sp_model_kwargs": {}, "separate_vocabs": false, "special_tokens_map_file": null, "name_or_path": "marian-models/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30/iir-deu+eng+fra+por+spa", "tokenizer_class": "MarianTokenizer"}
vocab.json ADDED
The diff for this file is too large to render.