tiedeman committed
Commit d81e05d
1 Parent(s): 818ca99

Initial commit

.gitattributes CHANGED
@@ -33,3 +33,4 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+*.spm filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,2549 @@
---
library_name: transformers
language:
- bas
- bem
- bnt
- bss
- cce
- cjk
- cwe
- de
- dig
- dug
- en
- es
- fr
- gog
- gwr
- hay
- heh
- hz
- jmc
- kam
- kdc
- kdn
- kg
- ki
- kj
- kki
- kkj
- kmb
- ksb
- lem
- lg
- ln
- lon
- lsm
- lua
- luy
- mcp
- myx
- nd
- ng
- nim
- nnb
- nr
- nso
- nuj
- ny
- nyf
- nyn
- nyo
- nyy
- old
- ozm
- pkb
- pt
- rim
- rn
- rw
- seh
- sn
- ss
- st
- suk
- sw
- sxb
- thk
- tlj
- tn
- toh
- toi
- ts
- tum
- umb
- ve
- vmw
- vun
- wmw
- xh
- xog
- zu

tags:
- translation
- opus-mt-tc-bible

license: apache-2.0
model-index:
- name: opus-mt-tc-bible-big-bnt-deu_eng_fra_por_spa
  results:
  - task:
      name: Translation bem-eng
      type: translation
      args: bem-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: bem-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 18.1
    - name: chr-F
      type: chrf
      value: 0.42350
  - task:
      name: Translation bem-fra
      type: translation
      args: bem-fra
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: bem-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 12.4
    - name: chr-F
      type: chrf
      value: 0.36976
  - task:
      name: Translation bem-por
      type: translation
      args: bem-por
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: bem-por
    metrics:
    - name: BLEU
      type: bleu
      value: 11.3
    - name: chr-F
      type: chrf
      value: 0.36443
  - task:
      name: Translation kik-eng
      type: translation
      args: kik-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: kik-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 14.2
    - name: chr-F
      type: chrf
      value: 0.38501
  - task:
      name: Translation kik-fra
      type: translation
      args: kik-fra
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: kik-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 10.3
    - name: chr-F
      type: chrf
      value: 0.34427
  - task:
      name: Translation kin-eng
      type: translation
      args: kin-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: kin-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 21.9
    - name: chr-F
      type: chrf
      value: 0.46183
  - task:
      name: Translation kin-fra
      type: translation
      args: kin-fra
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: kin-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 14.7
    - name: chr-F
      type: chrf
      value: 0.40139
  - task:
      name: Translation kin-por
      type: translation
      args: kin-por
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: kin-por
    metrics:
    - name: BLEU
      type: bleu
      value: 13.7
    - name: chr-F
      type: chrf
      value: 0.38408
  - task:
      name: Translation kin-spa
      type: translation
      args: kin-spa
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: kin-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 10.5
    - name: chr-F
      type: chrf
      value: 0.35592
  - task:
      name: Translation kon-eng
      type: translation
      args: kon-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: kon-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 14.0
    - name: chr-F
      type: chrf
      value: 0.37260
  - task:
      name: Translation kon-fra
      type: translation
      args: kon-fra
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: kon-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 11.1
    - name: chr-F
      type: chrf
      value: 0.35258
  - task:
      name: Translation kon-por
      type: translation
      args: kon-por
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: kon-por
    metrics:
    - name: BLEU
      type: bleu
      value: 10.7
    - name: chr-F
      type: chrf
      value: 0.34380
  - task:
      name: Translation lin-eng
      type: translation
      args: lin-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: lin-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 18.1
    - name: chr-F
      type: chrf
      value: 0.42073
  - task:
      name: Translation lin-fra
      type: translation
      args: lin-fra
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: lin-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 14.8
    - name: chr-F
      type: chrf
      value: 0.39759
  - task:
      name: Translation lin-por
      type: translation
      args: lin-por
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: lin-por
    metrics:
    - name: BLEU
      type: bleu
      value: 12.9
    - name: chr-F
      type: chrf
      value: 0.37600
  - task:
      name: Translation lug-eng
      type: translation
      args: lug-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: lug-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 13.5
    - name: chr-F
      type: chrf
      value: 0.35746
  - task:
      name: Translation nso-deu
      type: translation
      args: nso-deu
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: nso-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 10.7
    - name: chr-F
      type: chrf
      value: 0.38059
  - task:
      name: Translation nso-eng
      type: translation
      args: nso-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: nso-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 28.4
    - name: chr-F
      type: chrf
      value: 0.51453
  - task:
      name: Translation nso-fra
      type: translation
      args: nso-fra
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: nso-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 16.1
    - name: chr-F
      type: chrf
      value: 0.41065
  - task:
      name: Translation nso-por
      type: translation
      args: nso-por
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: nso-por
    metrics:
    - name: BLEU
      type: bleu
      value: 14.1
    - name: chr-F
      type: chrf
      value: 0.38374
  - task:
      name: Translation nso-spa
      type: translation
      args: nso-spa
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: nso-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 10.3
    - name: chr-F
      type: chrf
      value: 0.35022
  - task:
      name: Translation nya-eng
      type: translation
      args: nya-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: nya-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 20.2
    - name: chr-F
      type: chrf
      value: 0.44398
  - task:
      name: Translation nya-fra
      type: translation
      args: nya-fra
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: nya-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 14.0
    - name: chr-F
      type: chrf
      value: 0.39327
  - task:
      name: Translation nya-por
      type: translation
      args: nya-por
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: nya-por
    metrics:
    - name: BLEU
      type: bleu
      value: 12.6
    - name: chr-F
      type: chrf
      value: 0.37373
  - task:
      name: Translation run-eng
      type: translation
      args: run-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: run-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 18.9
    - name: chr-F
      type: chrf
      value: 0.42987
  - task:
      name: Translation run-fra
      type: translation
      args: run-fra
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: run-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 14.6
    - name: chr-F
      type: chrf
      value: 0.39369
  - task:
      name: Translation run-por
      type: translation
      args: run-por
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: run-por
    metrics:
    - name: BLEU
      type: bleu
      value: 13.4
    - name: chr-F
      type: chrf
      value: 0.38111
  - task:
      name: Translation run-spa
      type: translation
      args: run-spa
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: run-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 10.3
    - name: chr-F
      type: chrf
      value: 0.35229
  - task:
      name: Translation sna-eng
      type: translation
      args: sna-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: sna-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 21.1
    - name: chr-F
      type: chrf
      value: 0.45917
  - task:
      name: Translation sna-fra
      type: translation
      args: sna-fra
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: sna-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 15.2
    - name: chr-F
      type: chrf
      value: 0.41153
  - task:
      name: Translation sna-por
      type: translation
      args: sna-por
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: sna-por
    metrics:
    - name: BLEU
      type: bleu
      value: 13.5
    - name: chr-F
      type: chrf
      value: 0.38950
  - task:
      name: Translation sna-spa
      type: translation
      args: sna-spa
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: sna-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 10.3
    - name: chr-F
      type: chrf
      value: 0.35823
  - task:
      name: Translation sot-deu
      type: translation
      args: sot-deu
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: sot-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 10.7
    - name: chr-F
      type: chrf
      value: 0.38311
  - task:
      name: Translation sot-eng
      type: translation
      args: sot-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: sot-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 26.9
    - name: chr-F
      type: chrf
      value: 0.51854
  - task:
      name: Translation sot-fra
      type: translation
      args: sot-fra
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: sot-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 15.8
    - name: chr-F
      type: chrf
      value: 0.41340
  - task:
      name: Translation sot-por
      type: translation
      args: sot-por
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: sot-por
    metrics:
    - name: BLEU
      type: bleu
      value: 14.6
    - name: chr-F
      type: chrf
      value: 0.39058
  - task:
      name: Translation sot-spa
      type: translation
      args: sot-spa
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: sot-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 10.4
    - name: chr-F
      type: chrf
      value: 0.35245
  - task:
      name: Translation ssw-eng
      type: translation
      args: ssw-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: ssw-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 20.7
    - name: chr-F
      type: chrf
      value: 0.44925
  - task:
      name: Translation ssw-fra
      type: translation
      args: ssw-fra
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: ssw-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 14.0
    - name: chr-F
      type: chrf
      value: 0.39386
  - task:
      name: Translation ssw-por
      type: translation
      args: ssw-por
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: ssw-por
    metrics:
    - name: BLEU
      type: bleu
      value: 13.4
    - name: chr-F
      type: chrf
      value: 0.38240
  - task:
      name: Translation swh-deu
      type: translation
      args: swh-deu
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: swh-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 15.6
    - name: chr-F
      type: chrf
      value: 0.44937
  - task:
      name: Translation swh-eng
      type: translation
      args: swh-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: swh-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 37.0
    - name: chr-F
      type: chrf
      value: 0.60107
  - task:
      name: Translation swh-fra
      type: translation
      args: swh-fra
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: swh-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 23.5
    - name: chr-F
      type: chrf
      value: 0.50257
  - task:
      name: Translation swh-por
      type: translation
      args: swh-por
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: swh-por
    metrics:
    - name: BLEU
      type: bleu
      value: 22.8
    - name: chr-F
      type: chrf
      value: 0.49475
  - task:
      name: Translation swh-spa
      type: translation
      args: swh-spa
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: swh-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 15.3
    - name: chr-F
      type: chrf
      value: 0.42866
  - task:
      name: Translation tsn-eng
      type: translation
      args: tsn-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: tsn-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 19.9
    - name: chr-F
      type: chrf
      value: 0.45365
  - task:
      name: Translation tsn-fra
      type: translation
      args: tsn-fra
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: tsn-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 14.1
    - name: chr-F
      type: chrf
      value: 0.39321
  - task:
      name: Translation tsn-por
      type: translation
      args: tsn-por
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: tsn-por
    metrics:
    - name: BLEU
      type: bleu
      value: 13.1
    - name: chr-F
      type: chrf
      value: 0.38354
  - task:
      name: Translation tsn-spa
      type: translation
      args: tsn-spa
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: tsn-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 10.4
    - name: chr-F
      type: chrf
      value: 0.35183
  - task:
      name: Translation tso-eng
      type: translation
      args: tso-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: tso-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 22.8
    - name: chr-F
      type: chrf
      value: 0.46882
  - task:
      name: Translation tso-fra
      type: translation
      args: tso-fra
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: tso-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 14.2
    - name: chr-F
      type: chrf
      value: 0.39409
  - task:
      name: Translation tso-por
      type: translation
      args: tso-por
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: tso-por
    metrics:
    - name: BLEU
      type: bleu
      value: 13.2
    - name: chr-F
      type: chrf
      value: 0.37640
  - task:
      name: Translation xho-deu
      type: translation
      args: xho-deu
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: xho-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 11.9
    - name: chr-F
      type: chrf
      value: 0.39620
  - task:
      name: Translation xho-eng
      type: translation
      args: xho-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: xho-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 28.8
    - name: chr-F
      type: chrf
      value: 0.52500
  - task:
      name: Translation xho-fra
      type: translation
      args: xho-fra
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: xho-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 18.7
    - name: chr-F
      type: chrf
      value: 0.44642
  - task:
      name: Translation xho-por
      type: translation
      args: xho-por
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: xho-por
    metrics:
    - name: BLEU
      type: bleu
      value: 16.8
    - name: chr-F
      type: chrf
      value: 0.42517
  - task:
      name: Translation xho-spa
      type: translation
      args: xho-spa
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: xho-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 12.0
    - name: chr-F
      type: chrf
      value: 0.38096
  - task:
      name: Translation zul-deu
      type: translation
      args: zul-deu
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: zul-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 11.8
    - name: chr-F
      type: chrf
      value: 0.39854
  - task:
      name: Translation zul-eng
      type: translation
      args: zul-eng
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: zul-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 29.5
    - name: chr-F
      type: chrf
      value: 0.53428
  - task:
      name: Translation zul-fra
      type: translation
      args: zul-fra
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: zul-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 19.0
    - name: chr-F
      type: chrf
      value: 0.45383
  - task:
      name: Translation zul-por
      type: translation
      args: zul-por
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: zul-por
    metrics:
    - name: BLEU
      type: bleu
      value: 17.4
    - name: chr-F
      type: chrf
      value: 0.43537
  - task:
      name: Translation zul-spa
      type: translation
      args: zul-spa
    dataset:
      name: flores200-devtest
      type: flores200-devtest
      args: zul-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 11.7
    - name: chr-F
      type: chrf
      value: 0.37807
  - task:
      name: Translation lin-eng
      type: translation
      args: lin-eng
    dataset:
      name: flores101-devtest
      type: flores_101
      args: lin eng devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 16.9
    - name: chr-F
      type: chrf
      value: 0.40858
  - task:
      name: Translation lug-eng
      type: translation
      args: lug-eng
    dataset:
      name: flores101-devtest
      type: flores_101
      args: lug eng devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 12.5
    - name: chr-F
      type: chrf
      value: 0.34463
  - task:
      name: Translation nso-eng
      type: translation
      args: nso-eng
    dataset:
      name: flores101-devtest
      type: flores_101
      args: nso eng devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 26.5
    - name: chr-F
      type: chrf
      value: 0.49866
  - task:
      name: Translation nso-fra
      type: translation
      args: nso-fra
    dataset:
      name: flores101-devtest
      type: flores_101
      args: nso fra devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 14.5
    - name: chr-F
      type: chrf
      value: 0.39204
  - task:
      name: Translation nya-por
      type: translation
      args: nya-por
    dataset:
      name: flores101-devtest
      type: flores_101
      args: nya por devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 11.4
    - name: chr-F
      type: chrf
      value: 0.35924
  - task:
      name: Translation sna-fra
      type: translation
      args: sna-fra
    dataset:
      name: flores101-devtest
      type: flores_101
      args: sna fra devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 14.3
    - name: chr-F
      type: chrf
      value: 0.40134
  - task:
      name: Translation sna-por
      type: translation
      args: sna-por
    dataset:
      name: flores101-devtest
      type: flores_101
      args: sna por devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 12.5
    - name: chr-F
      type: chrf
      value: 0.37537
  - task:
      name: Translation swh-deu
      type: translation
      args: swh-deu
    dataset:
      name: flores101-devtest
      type: flores_101
      args: swh deu devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 14.2
    - name: chr-F
      type: chrf
      value: 0.43073
  - task:
      name: Translation xho-spa
      type: translation
      args: xho-spa
    dataset:
      name: flores101-devtest
      type: flores_101
      args: xho spa devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 10.8
    - name: chr-F
      type: chrf
      value: 0.36714
  - task:
      name: Translation zul-fra
      type: translation
      args: zul-fra
    dataset:
      name: flores101-devtest
      type: flores_101
      args: zul fra devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 17.4
    - name: chr-F
      type: chrf
      value: 0.43723
  - task:
      name: Translation zul-por
      type: translation
      args: zul-por
    dataset:
      name: flores101-devtest
      type: flores_101
      args: zul por devtest
    metrics:
    - name: BLEU
      type: bleu
      value: 15.9
    - name: chr-F
      type: chrf
      value: 0.41886
  - task:
      name: Translation bem-eng
      type: translation
      args: bem-eng
    dataset:
      name: ntrex128
      type: ntrex128
      args: bem-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 19.1
    - name: chr-F
      type: chrf
      value: 0.43168
  - task:
      name: Translation bem-fra
      type: translation
      args: bem-fra
    dataset:
      name: ntrex128
      type: ntrex128
      args: bem-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 11.4
    - name: chr-F
      type: chrf
      value: 0.36401
  - task:
      name: Translation bem-por
      type: translation
      args: bem-por
    dataset:
      name: ntrex128
      type: ntrex128
      args: bem-por
    metrics:
    - name: BLEU
      type: bleu
      value: 11.6
    - name: chr-F
      type: chrf
      value: 0.36288
  - task:
      name: Translation bem-spa
      type: translation
      args: bem-spa
    dataset:
      name: ntrex128
      type: ntrex128
      args: bem-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 13.7
    - name: chr-F
      type: chrf
      value: 0.38219
  - task:
      name: Translation kin-eng
      type: translation
      args: kin-eng
    dataset:
      name: ntrex128
      type: ntrex128
      args: kin-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 20.8
    - name: chr-F
      type: chrf
      value: 0.46996
  - task:
      name: Translation kin-fra
      type: translation
      args: kin-fra
    dataset:
      name: ntrex128
      type: ntrex128
      args: kin-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 14.7
    - name: chr-F
      type: chrf
      value: 0.40765
  - task:
      name: Translation kin-por
      type: translation
      args: kin-por
    dataset:
      name: ntrex128
      type: ntrex128
      args: kin-por
    metrics:
    - name: BLEU
      type: bleu
      value: 13.1
    - name: chr-F
      type: chrf
      value: 0.38834
  - task:
      name: Translation kin-spa
      type: translation
      args: kin-spa
    dataset:
      name: ntrex128
      type: ntrex128
      args: kin-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 15.9
    - name: chr-F
      type: chrf
      value: 0.41552
  - task:
      name: Translation nde-eng
      type: translation
      args: nde-eng
    dataset:
      name: ntrex128
      type: ntrex128
      args: nde-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 17.1
    - name: chr-F
      type: chrf
      value: 0.42744
  - task:
      name: Translation nde-fra
      type: translation
      args: nde-fra
    dataset:
      name: ntrex128
      type: ntrex128
      args: nde-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 11.1
    - name: chr-F
      type: chrf
      value: 0.36837
  - task:
      name: Translation nde-por
      type: translation
      args: nde-por
    dataset:
      name: ntrex128
      type: ntrex128
      args: nde-por
    metrics:
    - name: BLEU
      type: bleu
      value: 11.0
    - name: chr-F
      type: chrf
      value: 0.36769
  - task:
      name: Translation nde-spa
      type: translation
      args: nde-spa
    dataset:
      name: ntrex128
      type: ntrex128
      args: nde-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 13.0
    - name: chr-F
      type: chrf
      value: 0.37924
  - task:
      name: Translation nso-eng
      type: translation
      args: nso-eng
    dataset:
      name: ntrex128
      type: ntrex128
      args: nso-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 21.5
    - name: chr-F
      type: chrf
      value: 0.47231
  - task:
      name: Translation nso-fra
      type: translation
      args: nso-fra
    dataset:
      name: ntrex128
      type: ntrex128
      args: nso-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 13.6
    - name: chr-F
      type: chrf
      value: 0.39206
  - task:
      name: Translation nso-por
      type: translation
      args: nso-por
    dataset:
      name: ntrex128
      type: ntrex128
      args: nso-por
    metrics:
    - name: BLEU
      type: bleu
      value: 12.3
    - name: chr-F
      type: chrf
      value: 0.37912
  - task:
      name: Translation nso-spa
      type: translation
      args: nso-spa
    dataset:
      name: ntrex128
      type: ntrex128
      args: nso-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 15.2
    - name: chr-F
      type: chrf
      value: 0.40135
  - task:
      name: Translation nya-deu
      type: translation
      args: nya-deu
    dataset:
      name: ntrex128
      type: ntrex128
      args: nya-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 10.0
    - name: chr-F
      type: chrf
      value: 0.37303
  - task:
      name: Translation nya-eng
      type: translation
      args: nya-eng
    dataset:
      name: ntrex128
      type: ntrex128
      args: nya-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 23.3
    - name: chr-F
      type: chrf
      value: 0.47072
  - task:
      name: Translation nya-fra
      type: translation
      args: nya-fra
    dataset:
      name: ntrex128
      type: ntrex128
      args: nya-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 14.1
    - name: chr-F
      type: chrf
      value: 0.39866
  - task:
      name: Translation nya-por
      type: translation
      args: nya-por
    dataset:
      name: ntrex128
      type: ntrex128
      args: nya-por
    metrics:
    - name: BLEU
      type: bleu
      value: 13.7
    - name: chr-F
      type: chrf
      value: 0.38960
  - task:
      name: Translation nya-spa
      type: translation
      args: nya-spa
    dataset:
      name: ntrex128
      type: ntrex128
      args: nya-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 16.2
    - name: chr-F
      type: chrf
      value: 0.41006
  - task:
      name: Translation ssw-eng
      type: translation
      args: ssw-eng
    dataset:
      name: ntrex128
      type: ntrex128
      args: ssw-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 23.5
    - name: chr-F
      type: chrf
      value: 0.48682
  - task:
      name: Translation ssw-fra
      type: translation
      args: ssw-fra
    dataset:
      name: ntrex128
      type: ntrex128
      args: ssw-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 13.7
    - name: chr-F
      type: chrf
      value: 0.39351
  - task:
      name: Translation ssw-por
      type: translation
      args: ssw-por
    dataset:
      name: ntrex128
      type: ntrex128
      args: ssw-por
    metrics:
    - name: BLEU
      type: bleu
      value: 13.4
    - name: chr-F
      type: chrf
      value: 0.39220
  - task:
      name: Translation ssw-spa
      type: translation
      args: ssw-spa
    dataset:
      name: ntrex128
      type: ntrex128
      args: ssw-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 15.9
    - name: chr-F
      type: chrf
      value: 0.40839
  - task:
      name: Translation swa-deu
      type: translation
      args: swa-deu
    dataset:
      name: ntrex128
      type: ntrex128
      args: swa-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 14.1
    - name: chr-F
      type: chrf
      value: 0.43880
  - task:
      name: Translation swa-eng
      type: translation
      args: swa-eng
    dataset:
      name: ntrex128
      type: ntrex128
      args: swa-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 35.4
    - name: chr-F
      type: chrf
      value: 0.58527
  - task:
      name: Translation swa-fra
      type: translation
      args: swa-fra
    dataset:
      name: ntrex128
      type: ntrex128
      args: swa-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 19.7
    - name: chr-F
      type: chrf
      value: 0.47344
  - task:
      name: Translation swa-por
      type: translation
      args: swa-por
    dataset:
      name: ntrex128
      type: ntrex128
      args: swa-por
    metrics:
    - name: BLEU
      type: bleu
      value: 19.1
    - name: chr-F
      type: chrf
      value: 0.46292
  - task:
      name: Translation swa-spa
      type: translation
      args: swa-spa
    dataset:
      name: ntrex128
      type: ntrex128
      args: swa-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 22.9
    - name: chr-F
      type: chrf
      value: 0.48780
  - task:
      name: Translation tsn-deu
      type: translation
      args: tsn-deu
    dataset:
      name: ntrex128
      type: ntrex128
      args: tsn-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 10.4
    - name: chr-F
      type: chrf
      value: 0.38791
  - task:
      name: Translation tsn-eng
      type: translation
      args: tsn-eng
    dataset:
      name: ntrex128
      type: ntrex128
      args: tsn-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 25.3
    - name: chr-F
      type: chrf
      value: 0.50413
  - task:
      name: Translation tsn-fra
      type: translation
      args: tsn-fra
    dataset:
      name: ntrex128
      type: ntrex128
      args: tsn-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 15.8
    - name: chr-F
      type: chrf
      value: 0.41912
  - task:
      name: Translation tsn-por
      type: translation
      args: tsn-por
    dataset:
      name: ntrex128
      type: ntrex128
      args: tsn-por
    metrics:
    - name: BLEU
      type: bleu
      value: 15.3
    - name: chr-F
      type: chrf
      value: 0.41090
  - task:
      name: Translation tsn-spa
      type: translation
      args: tsn-spa
    dataset:
      name: ntrex128
      type: ntrex128
      args: tsn-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 17.7
    - name: chr-F
      type: chrf
      value: 0.42979
  - task:
      name: Translation ven-eng
      type: translation
      args: ven-eng
    dataset:
      name: ntrex128
      type: ntrex128
      args: ven-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 18.4
    - name: chr-F
      type: chrf
      value: 0.43364
  - task:
      name: Translation ven-fra
      type: translation
      args: ven-fra
    dataset:
      name: ntrex128
      type: ntrex128
      args: ven-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 12.0
    - name: chr-F
      type: chrf
      value: 0.37138
  - task:
      name: Translation ven-por
      type: translation
      args: ven-por
    dataset:
      name: ntrex128
      type: ntrex128
      args: ven-por
    metrics:
    - name: BLEU
      type: bleu
      value: 11.2
    - name: chr-F
      type: chrf
      value: 0.36506
  - task:
      name: Translation ven-spa
      type: translation
      args: ven-spa
    dataset:
      name: ntrex128
      type: ntrex128
      args: ven-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 13.1
    - name: chr-F
      type: chrf
      value: 0.38029
  - task:
      name: Translation xho-deu
      type: translation
      args: xho-deu
    dataset:
      name: ntrex128
      type: ntrex128
      args: xho-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 10.5
    - name: chr-F
      type: chrf
      value: 0.38470
  - task:
      name: Translation xho-eng
      type: translation
      args: xho-eng
    dataset:
      name: ntrex128
      type: ntrex128
      args: xho-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 26.5
    - name: chr-F
      type: chrf
      value: 0.50778
  - task:
      name: Translation xho-fra
      type: translation
      args: xho-fra
    dataset:
      name: ntrex128
      type: ntrex128
      args: xho-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 15.1
    - name: chr-F
      type: chrf
      value: 0.41066
  - task:
      name: Translation xho-por
      type: translation
      args: xho-por
    dataset:
      name: ntrex128
      type: ntrex128
      args: xho-por
    metrics:
    - name: BLEU
      type: bleu
      value: 14.0
    - name: chr-F
      type: chrf
      value: 0.39822
  - task:
      name: Translation xho-spa
      type: translation
      args: xho-spa
    dataset:
      name: ntrex128
      type: ntrex128
      args: xho-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 16.7
    - name: chr-F
      type: chrf
      value: 0.42129
  - task:
      name: Translation zul-deu
      type: translation
      args: zul-deu
    dataset:
      name: ntrex128
      type: ntrex128
      args: zul-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 10.9
    - name: chr-F
      type: chrf
      value: 0.38155
  - task:
      name: Translation zul-eng
      type: translation
      args: zul-eng
    dataset:
      name: ntrex128
      type: ntrex128
      args: zul-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 26.9
    - name: chr-F
      type: chrf
      value: 0.50361
  - task:
      name: Translation zul-fra
      type: translation
      args: zul-fra
    dataset:
      name: ntrex128
      type: ntrex128
      args: zul-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 15.0
    - name: chr-F
      type: chrf
      value: 0.40779
  - task:
      name: Translation zul-por
      type: translation
      args: zul-por
    dataset:
      name: ntrex128
      type: ntrex128
      args: zul-por
    metrics:
    - name: BLEU
      type: bleu
      value: 14.5
    - name: chr-F
      type: chrf
      value: 0.39932
  - task:
      name: Translation zul-spa
      type: translation
      args: zul-spa
    dataset:
      name: ntrex128
      type: ntrex128
      args: zul-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 16.9
    - name: chr-F
      type: chrf
      value: 0.41836
  - task:
      name: Translation multi-multi
      type: translation
      args: multi-multi
    dataset:
      name: tatoeba-test-v2020-07-28-v2023-09-26
      type: tatoeba_mt
      args: multi-multi
    metrics:
    - name: BLEU
      type: bleu
      value: 30.9
    - name: chr-F
      type: chrf
      value: 0.47850
  - task:
      name: Translation run-deu
      type: translation
      args: run-deu
    dataset:
      name: tatoeba-test-v2021-08-07
      type: tatoeba_mt
      args: run-deu
    metrics:
    - name: BLEU
      type: bleu
      value: 26.1
    - name: chr-F
      type: chrf
      value: 0.43836
  - task:
      name: Translation run-eng
      type: translation
      args: run-eng
    dataset:
      name: tatoeba-test-v2021-08-07
      type: tatoeba_mt
      args: run-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 39.4
    - name: chr-F
      type: chrf
      value: 0.54089
  - task:
      name: Translation run-fra
      type: translation
      args: run-fra
    dataset:
      name: tatoeba-test-v2021-08-07
      type: tatoeba_mt
      args: run-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 26.1
    - name: chr-F
      type: chrf
      value: 0.46240
  - task:
      name: Translation run-spa
      type: translation
      args: run-spa
    dataset:
      name: tatoeba-test-v2021-08-07
      type: tatoeba_mt
      args: run-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 25.8
    - name: chr-F
      type: chrf
      value: 0.46496
  - task:
      name: Translation swa-eng
      type: translation
      args: swa-eng
    dataset:
      name: tatoeba-test-v2021-08-07
      type: tatoeba_mt
      args: swa-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 45.9
    - name: chr-F
      type: chrf
      value: 0.59947
  - task:
      name: Translation xho-eng
      type: translation
      args: xho-eng
    dataset:
      name: tatoeba-test-v2021-03-30
      type: tatoeba_mt
      args: xho-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 41.7
    - name: chr-F
      type: chrf
      value: 0.56987
  - task:
      name: Translation kin-eng
      type: translation
      args: kin-eng
    dataset:
      name: tico19-test
      type: tico19-test
      args: kin-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 18.8
    - name: chr-F
      type: chrf
      value: 0.42280
  - task:
      name: Translation kin-fra
      type: translation
      args: kin-fra
    dataset:
      name: tico19-test
      type: tico19-test
      args: kin-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 13.8
    - name: chr-F
      type: chrf
      value: 0.36808
  - task:
      name: Translation kin-por
      type: translation
      args: kin-por
    dataset:
      name: tico19-test
      type: tico19-test
      args: kin-por
    metrics:
    - name: BLEU
      type: bleu
      value: 14.6
    - name: chr-F
      type: chrf
      value: 0.37819
  - task:
      name: Translation kin-spa
      type: translation
      args: kin-spa
    dataset:
      name: tico19-test
      type: tico19-test
      args: kin-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 15.9
    - name: chr-F
      type: chrf
      value: 0.39391
  - task:
      name: Translation lin-eng
      type: translation
      args: lin-eng
    dataset:
      name: tico19-test
      type: tico19-test
      args: lin-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 18.4
    - name: chr-F
      type: chrf
      value: 0.41495
  - task:
      name: Translation lin-fra
      type: translation
      args: lin-fra
    dataset:
      name: tico19-test
      type: tico19-test
      args: lin-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 13.5
    - name: chr-F
      type: chrf
      value: 0.36297
  - task:
      name: Translation lin-por
      type: translation
      args: lin-por
    dataset:
      name: tico19-test
      type: tico19-test
      args: lin-por
    metrics:
    - name: BLEU
      type: bleu
      value: 13.3
    - name: chr-F
      type: chrf
      value: 0.36515
  - task:
      name: Translation lin-spa
      type: translation
      args: lin-spa
    dataset:
      name: tico19-test
      type: tico19-test
      args: lin-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 14.4
    - name: chr-F
      type: chrf
      value: 0.37607
  - task:
      name: Translation lug-eng
      type: translation
      args: lug-eng
    dataset:
      name: tico19-test
      type: tico19-test
      args: lug-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 22.2
    - name: chr-F
      type: chrf
      value: 0.43948
  - task:
      name: Translation lug-fra
      type: translation
      args: lug-fra
    dataset:
      name: tico19-test
      type: tico19-test
      args: lug-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 13.7
    - name: chr-F
      type: chrf
      value: 0.36537
  - task:
      name: Translation lug-por
      type: translation
      args: lug-por
    dataset:
      name: tico19-test
      type: tico19-test
      args: lug-por
    metrics:
    - name: BLEU
      type: bleu
      value: 14.6
    - name: chr-F
      type: chrf
      value: 0.38018
  - task:
      name: Translation lug-spa
      type: translation
      args: lug-spa
    dataset:
      name: tico19-test
      type: tico19-test
      args: lug-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 15.5
    - name: chr-F
      type: chrf
      value: 0.38861
  - task:
      name: Translation swa-eng
      type: translation
      args: swa-eng
    dataset:
      name: tico19-test
      type: tico19-test
      args: swa-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 34.5
    - name: chr-F
      type: chrf
      value: 0.58126
  - task:
      name: Translation swa-fra
      type: translation
      args: swa-fra
    dataset:
      name: tico19-test
      type: tico19-test
      args: swa-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 20.5
    - name: chr-F
      type: chrf
      value: 0.46470
  - task:
      name: Translation swa-por
      type: translation
      args: swa-por
    dataset:
      name: tico19-test
      type: tico19-test
      args: swa-por
    metrics:
    - name: BLEU
      type: bleu
      value: 22.8
    - name: chr-F
      type: chrf
      value: 0.49374
  - task:
      name: Translation swa-spa
      type: translation
      args: swa-spa
    dataset:
      name: tico19-test
      type: tico19-test
      args: swa-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 24.5
    - name: chr-F
      type: chrf
      value: 0.50214
  - task:
      name: Translation zul-eng
      type: translation
      args: zul-eng
    dataset:
      name: tico19-test
      type: tico19-test
      args: zul-eng
    metrics:
    - name: BLEU
      type: bleu
      value: 32.2
    - name: chr-F
      type: chrf
      value: 0.55678
  - task:
      name: Translation zul-fra
      type: translation
      args: zul-fra
    dataset:
      name: tico19-test
      type: tico19-test
      args: zul-fra
    metrics:
    - name: BLEU
      type: bleu
      value: 18.6
    - name: chr-F
      type: chrf
      value: 0.43797
  - task:
      name: Translation zul-por
      type: translation
      args: zul-por
    dataset:
      name: tico19-test
      type: tico19-test
      args: zul-por
    metrics:
    - name: BLEU
      type: bleu
      value: 19.9
    - name: chr-F
      type: chrf
      value: 0.45560
  - task:
      name: Translation zul-spa
      type: translation
      args: zul-spa
    dataset:
      name: tico19-test
      type: tico19-test
      args: zul-spa
    metrics:
    - name: BLEU
      type: bleu
      value: 21.4
    - name: chr-F
      type: chrf
      value: 0.46505
---
# opus-mt-tc-bible-big-bnt-deu_eng_fra_por_spa

## Table of Contents
- [Model Details](#model-details)
- [Uses](#uses)
- [Risks, Limitations and Biases](#risks-limitations-and-biases)
- [How to Get Started With the Model](#how-to-get-started-with-the-model)
- [Training](#training)
- [Evaluation](#evaluation)
- [Citation Information](#citation-information)
- [Acknowledgements](#acknowledgements)

## Model Details

Neural machine translation model for translating from Bantu languages (bnt) to German, English, French, Portuguese and Spanish (deu+eng+fra+por+spa).

This model is part of the [OPUS-MT project](https://github.com/Helsinki-NLP/Opus-MT), an effort to make neural machine translation models widely available and accessible for many languages in the world. All models were originally trained with the [Marian NMT](https://marian-nmt.github.io/) framework, an efficient NMT implementation written in pure C++, and have been converted to PyTorch using the Hugging Face transformers library. Training data is taken from [OPUS](https://opus.nlpl.eu/) and training pipelines follow the procedures of [OPUS-MT-train](https://github.com/Helsinki-NLP/Opus-MT-train).

**Model Description:**
- **Developed by:** Language Technology Research Group at the University of Helsinki
- **Model Type:** Translation (transformer-big)
- **Release**: 2024-05-30
- **License:** Apache-2.0
- **Language(s):**
  - Source Language(s): bas bem bnt bss cce cjk cwe dig dug gog gwr hay heh her jmc kam kdc kdn kik kin kki kkj kmb kng kon ksb kua ldi lem lin lon lsm lua lug luy mcp myx nbl nde ndo nim nnb nso nuj nya nyf nyn nyo nyy old ozm pkb rim run seh sna sot ssw suk swa swc swh sxb thk tlj toh toi tsn tso tum umb ven vmw vun wmw xho xog zul
  - Target Language(s): deu eng fra por spa
  - Valid Target Language Labels: >>deu<< >>eng<< >>fra<< >>por<< >>spa<< >>xxx<<
- **Original Model**: [opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/bnt-deu+eng+fra+por+spa/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30.zip)
- **Resources for more information:**
  - [OPUS-MT dashboard](https://opus.nlpl.eu/dashboard/index.php?pkg=opusmt&test=all&scoreslang=all&chart=standard&model=Tatoeba-MT-models/bnt-deu%2Beng%2Bfra%2Bpor%2Bspa/opusTCv20230926max50%2Bbt%2Bjhubc_transformer-big_2024-05-30)
  - [OPUS-MT-train GitHub Repo](https://github.com/Helsinki-NLP/OPUS-MT-train)
  - [More information about MarianNMT models in the transformers library](https://huggingface.co/docs/transformers/model_doc/marian)
  - [Tatoeba Translation Challenge](https://github.com/Helsinki-NLP/Tatoeba-Challenge/)
  - [HPLT bilingual data v1 (as part of the Tatoeba Translation Challenge dataset)](https://hplt-project.org/datasets/v1)
  - [A massively parallel Bible corpus](https://aclanthology.org/L14-1215/)

This is a multilingual translation model with multiple target languages. A sentence-initial language token is required in the form of `>>id<<` (id = valid target language ID), e.g. `>>deu<<`.

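For instance, the same source sentence can be routed to any of the five targets simply by prepending the corresponding label. A minimal illustration (the sentence is a placeholder, not real Bantu-language input):

```python
# Build one model input per target language by prepending its >>id<< label.
targets = ["deu", "eng", "fra", "por", "spa"]
sentence = "Replace this with text in an accepted source language."
src_text = [f">>{lang}<< {sentence}" for lang in targets]
# e.g. '>>deu<< Replace this with text in an accepted source language.'
```
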
2365
+ ## Uses
2366
+
2367
+ This model can be used for translation and text-to-text generation.
2368
+
2369
+ ## Risks, Limitations and Biases
2370
+
2371
+ **CONTENT WARNING: Readers should be aware that the model is trained on various public data sets that may contain content that is disturbing, offensive, and can propagate historical and current stereotypes.**
2372
+
2373
+ Significant research has explored bias and fairness issues with language models (see, e.g., [Sheng et al. (2021)](https://aclanthology.org/2021.acl-long.330.pdf) and [Bender et al. (2021)](https://dl.acm.org/doi/pdf/10.1145/3442188.3445922)).
2374
+
2375
+ ## How to Get Started With the Model
2376
+
2377
+ A short code example:
2378
+
2379
+ ```python
2380
+ from transformers import MarianMTModel, MarianTokenizer
2381
+
2382
+ src_text = [
2383
+ ">>deu<< Replace this with text in an accepted source language.",
2384
+ ">>spa<< This is the second sentence."
2385
+ ]
2386
+
2387
+ model_name = "Helsinki-NLP/opus-mt-tc-bible-big-bnt-deu_eng_fra_por_spa"
2388
+ tokenizer = MarianTokenizer.from_pretrained(model_name)
2389
+ model = MarianMTModel.from_pretrained(model_name)
2390
+ translated = model.generate(**tokenizer(src_text, return_tensors="pt", padding=True))
2391
+
2392
+ for t in translated:
2393
+     print(tokenizer.decode(t, skip_special_tokens=True))
2394
+ ```
2395
+
2396
+ You can also use OPUS-MT models with the transformers pipelines, for example:
2397
+
2398
+ ```python
2399
+ from transformers import pipeline
2400
+ pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-bible-big-bnt-deu_eng_fra_por_spa")
2401
+ print(pipe(">>deu<< Replace this with text in an accepted source language."))
2402
+ ```
2403
+
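+ Because all five target languages share a single checkpoint, you can switch the output language by changing only the sentence-initial token. A minimal sketch (the placeholder sentence stands in for real Bantu-language input):
+
+ ```python
+ from transformers import pipeline
+
+ # Load once; the >>id<< prefix on each input selects the output language.
+ pipe = pipeline("translation", model="Helsinki-NLP/opus-mt-tc-bible-big-bnt-deu_eng_fra_por_spa")
+
+ src = "Replace this with text in an accepted source language."
+ for lang in ("deu", "eng", "fra", "por", "spa"):
+     print(lang, pipe(f">>{lang}<< {src}")[0]["translation_text"])
+ ```
+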
2404
+ ## Training
2405
+
2406
+ - **Data**: opusTCv20230926max50+bt+jhubc ([source](https://github.com/Helsinki-NLP/Tatoeba-Challenge))
2407
+ - **Pre-processing**: SentencePiece (spm32k,spm32k); see the tokenization sketch below
2408
+ - **Model Type:** transformer-big
2409
+ - **Original MarianNMT Model**: [opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30.zip](https://object.pouta.csc.fi/Tatoeba-MT-models/bnt-deu+eng+fra+por+spa/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30.zip)
2410
+ - **Training Scripts**: [GitHub Repo](https://github.com/Helsinki-NLP/OPUS-MT-train)
2411
+
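+ The spm32k,spm32k pre-processing above means that source and target text are segmented with separate 32k-piece SentencePiece models, shipped in this repository as `source.spm` and `target.spm`; `MarianTokenizer` loads them automatically. A small sketch for inspecting the segmentation (the Swahili sample sentence is illustrative only):
+
+ ```python
+ from transformers import MarianTokenizer
+
+ tokenizer = MarianTokenizer.from_pretrained("Helsinki-NLP/opus-mt-tc-bible-big-bnt-deu_eng_fra_por_spa")
+
+ # The target-language label should survive as a single token; the rest is
+ # split into subword pieces from the 32k source vocabulary.
+ pieces = tokenizer.tokenize(">>eng<< Habari ya asubuhi")
+ print(pieces)
+ print(tokenizer.convert_tokens_to_ids(pieces))
+ ```
+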
2412
+ ## Evaluation
2413
+
2414
+ * [Model scores at the OPUS-MT dashboard](https://opus.nlpl.eu/dashboard/index.php?pkg=opusmt&test=all&scoreslang=all&chart=standard&model=Tatoeba-MT-models/bnt-deu%2Beng%2Bfra%2Bpor%2Bspa/opusTCv20230926max50%2Bbt%2Bjhubc_transformer-big_2024-05-30)
2415
+ * test set translations: [opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-29.test.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/bnt-deu+eng+fra+por+spa/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-29.test.txt)
2416
+ * test set scores: [opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-29.eval.txt](https://object.pouta.csc.fi/Tatoeba-MT-models/bnt-deu+eng+fra+por+spa/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-29.eval.txt)
2417
+ * benchmark results: [benchmark_results.txt](benchmark_results.txt)
2418
+ * benchmark output: [benchmark_translations.zip](benchmark_translations.zip)
2419
+
2420
+ | langpair | testset | chr-F | BLEU | #sent | #words |
2421
+ |----------|---------|-------|-------|-------|--------|
2422
+ | run-deu | tatoeba-test-v2021-08-07 | 0.43836 | 26.1 | 1752 | 10562 |
2423
+ | run-eng | tatoeba-test-v2021-08-07 | 0.54089 | 39.4 | 1703 | 10041 |
2424
+ | run-fra | tatoeba-test-v2021-08-07 | 0.46240 | 26.1 | 1274 | 7479 |
2425
+ | run-spa | tatoeba-test-v2021-08-07 | 0.46496 | 25.8 | 963 | 5167 |
2426
+ | swa-eng | tatoeba-test-v2021-08-07 | 0.59947 | 45.9 | 387 | 2508 |
2427
+ | lin-eng | flores101-devtest | 0.40858 | 16.9 | 1012 | 24721 |
2428
+ | nso-eng | flores101-devtest | 0.49866 | 26.5 | 1012 | 24721 |
2429
+ | sna-fra | flores101-devtest | 0.40134 | 14.3 | 1012 | 28343 |
2430
+ | swh-deu | flores101-devtest | 0.43073 | 14.2 | 1012 | 25094 |
2431
+ | zul-fra | flores101-devtest | 0.43723 | 17.4 | 1012 | 28343 |
2432
+ | zul-por | flores101-devtest | 0.41886 | 15.9 | 1012 | 26519 |
2433
+ | bem-eng | flores200-devtest | 0.42350 | 18.1 | 1012 | 24721 |
2434
+ | kin-eng | flores200-devtest | 0.46183 | 21.9 | 1012 | 24721 |
2435
+ | kin-fra | flores200-devtest | 0.40139 | 14.7 | 1012 | 28343 |
2436
+ | lin-eng | flores200-devtest | 0.42073 | 18.1 | 1012 | 24721 |
2437
+ | nso-eng | flores200-devtest | 0.51453 | 28.4 | 1012 | 24721 |
2438
+ | nso-fra | flores200-devtest | 0.41065 | 16.1 | 1012 | 28343 |
2439
+ | nya-eng | flores200-devtest | 0.44398 | 20.2 | 1012 | 24721 |
2440
+ | run-eng | flores200-devtest | 0.42987 | 18.9 | 1012 | 24721 |
2441
+ | sna-eng | flores200-devtest | 0.45917 | 21.1 | 1012 | 24721 |
2442
+ | sna-fra | flores200-devtest | 0.41153 | 15.2 | 1012 | 28343 |
2443
+ | sot-eng | flores200-devtest | 0.51854 | 26.9 | 1012 | 24721 |
2444
+ | sot-fra | flores200-devtest | 0.41340 | 15.8 | 1012 | 28343 |
2445
+ | ssw-eng | flores200-devtest | 0.44925 | 20.7 | 1012 | 24721 |
2446
+ | swh-deu | flores200-devtest | 0.44937 | 15.6 | 1012 | 25094 |
2447
+ | swh-eng | flores200-devtest | 0.60107 | 37.0 | 1012 | 24721 |
2448
+ | swh-fra | flores200-devtest | 0.50257 | 23.5 | 1012 | 28343 |
2449
+ | swh-por | flores200-devtest | 0.49475 | 22.8 | 1012 | 26519 |
2450
+ | swh-spa | flores200-devtest | 0.42866 | 15.3 | 1012 | 29199 |
2451
+ | tsn-eng | flores200-devtest | 0.45365 | 19.9 | 1012 | 24721 |
2452
+ | tso-eng | flores200-devtest | 0.46882 | 22.8 | 1012 | 24721 |
2453
+ | xho-eng | flores200-devtest | 0.52500 | 28.8 | 1012 | 24721 |
2454
+ | xho-fra | flores200-devtest | 0.44642 | 18.7 | 1012 | 28343 |
2455
+ | xho-por | flores200-devtest | 0.42517 | 16.8 | 1012 | 26519 |
2456
+ | zul-eng | flores200-devtest | 0.53428 | 29.5 | 1012 | 24721 |
2457
+ | zul-fra | flores200-devtest | 0.45383 | 19.0 | 1012 | 28343 |
2458
+ | zul-por | flores200-devtest | 0.43537 | 17.4 | 1012 | 26519 |
2459
+ | bem-eng | ntrex128 | 0.43168 | 19.1 | 1997 | 47673 |
2460
+ | kin-eng | ntrex128 | 0.46996 | 20.8 | 1997 | 47673 |
2461
+ | kin-fra | ntrex128 | 0.40765 | 14.7 | 1997 | 53481 |
2462
+ | kin-spa | ntrex128 | 0.41552 | 15.9 | 1997 | 54107 |
2463
+ | nde-eng | ntrex128 | 0.42744 | 17.1 | 1997 | 47673 |
2464
+ | nso-eng | ntrex128 | 0.47231 | 21.5 | 1997 | 47673 |
2465
+ | nso-spa | ntrex128 | 0.40135 | 15.2 | 1997 | 54107 |
2466
+ | nya-eng | ntrex128 | 0.47072 | 23.3 | 1997 | 47673 |
2467
+ | nya-spa | ntrex128 | 0.41006 | 16.2 | 1997 | 54107 |
2468
+ | ssw-eng | ntrex128 | 0.48682 | 23.5 | 1997 | 47673 |
2469
+ | ssw-spa | ntrex128 | 0.40839 | 15.9 | 1997 | 54107 |
2470
+ | swa-deu | ntrex128 | 0.43880 | 14.1 | 1997 | 48761 |
2471
+ | swa-eng | ntrex128 | 0.58527 | 35.4 | 1997 | 47673 |
2472
+ | swa-fra | ntrex128 | 0.47344 | 19.7 | 1997 | 53481 |
2473
+ | swa-por | ntrex128 | 0.46292 | 19.1 | 1997 | 51631 |
2474
+ | swa-spa | ntrex128 | 0.48780 | 22.9 | 1997 | 54107 |
2475
+ | tsn-eng | ntrex128 | 0.50413 | 25.3 | 1997 | 47673 |
2476
+ | tsn-fra | ntrex128 | 0.41912 | 15.8 | 1997 | 53481 |
2477
+ | tsn-por | ntrex128 | 0.41090 | 15.3 | 1997 | 51631 |
2478
+ | tsn-spa | ntrex128 | 0.42979 | 17.7 | 1997 | 54107 |
2479
+ | ven-eng | ntrex128 | 0.43364 | 18.4 | 1997 | 47673 |
2480
+ | xho-eng | ntrex128 | 0.50778 | 26.5 | 1997 | 47673 |
2481
+ | xho-fra | ntrex128 | 0.41066 | 15.1 | 1997 | 53481 |
2482
+ | xho-spa | ntrex128 | 0.42129 | 16.7 | 1997 | 54107 |
2483
+ | zul-eng | ntrex128 | 0.50361 | 26.9 | 1997 | 47673 |
2484
+ | zul-fra | ntrex128 | 0.40779 | 15.0 | 1997 | 53481 |
2485
+ | zul-spa | ntrex128 | 0.41836 | 16.9 | 1997 | 54107 |
2486
+ | kin-eng | tico19-test | 0.42280 | 18.8 | 2100 | 56323 |
2487
+ | lin-eng | tico19-test | 0.41495 | 18.4 | 2100 | 56323 |
2488
+ | lug-eng | tico19-test | 0.43948 | 22.2 | 2100 | 56323 |
2489
+ | swa-eng | tico19-test | 0.58126 | 34.5 | 2100 | 56315 |
2490
+ | swa-fra | tico19-test | 0.46470 | 20.5 | 2100 | 64661 |
2491
+ | swa-por | tico19-test | 0.49374 | 22.8 | 2100 | 62729 |
2492
+ | swa-spa | tico19-test | 0.50214 | 24.5 | 2100 | 66563 |
2493
+ | zul-eng | tico19-test | 0.55678 | 32.2 | 2100 | 56804 |
2494
+ | zul-fra | tico19-test | 0.43797 | 18.6 | 2100 | 64661 |
2495
+ | zul-por | tico19-test | 0.45560 | 19.9 | 2100 | 62729 |
2496
+ | zul-spa | tico19-test | 0.46505 | 21.4 | 2100 | 66563 |
2497
+
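+ Scores like the ones above can be recomputed from the test-set translation files linked in this section using the `sacrebleu` package. A rough sketch, assuming you have already extracted one language pair into aligned hypothesis and reference files (the file names below are hypothetical):
+
+ ```python
+ import sacrebleu
+
+ # Hypothetical per-pair files: one hypothesis / reference per line, aligned.
+ with open("swh-eng.hyp", encoding="utf-8") as f:
+     hyps = [line.rstrip("\n") for line in f]
+ with open("swh-eng.ref", encoding="utf-8") as f:
+     refs = [line.rstrip("\n") for line in f]
+
+ print(sacrebleu.corpus_bleu(hyps, [refs]))  # BLEU column
+ print(sacrebleu.corpus_chrf(hyps, [refs]))  # chr-F column
+ ```
+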
2498
+ ## Citation Information
2499
+
2500
+ * Publications: [Democratizing neural machine translation with OPUS-MT](https://doi.org/10.1007/s10579-023-09704-w), [OPUS-MT – Building open translation services for the World](https://aclanthology.org/2020.eamt-1.61/) and [The Tatoeba Translation Challenge – Realistic Data Sets for Low Resource and Multilingual MT](https://aclanthology.org/2020.wmt-1.139/) (Please cite if you use this model.)
2501
+
2502
+ ```bibtex
2503
+ @article{tiedemann2023democratizing,
2504
+ title={Democratizing neural machine translation with {OPUS-MT}},
2505
+ author={Tiedemann, J{\"o}rg and Aulamo, Mikko and Bakshandaeva, Daria and Boggia, Michele and Gr{\"o}nroos, Stig-Arne and Nieminen, Tommi and Raganato, Alessandro and Scherrer, Yves and Vazquez, Raul and Virpioja, Sami},
2506
+ journal={Language Resources and Evaluation},
2507
+ number={58},
2508
+ pages={713--755},
2509
+ year={2023},
2510
+ publisher={Springer Nature},
2511
+ issn={1574-0218},
2512
+ doi={10.1007/s10579-023-09704-w}
2513
+ }
2514
+
2515
+ @inproceedings{tiedemann-thottingal-2020-opus,
2516
+ title = "{OPUS}-{MT} {--} Building open translation services for the World",
2517
+ author = {Tiedemann, J{\"o}rg and Thottingal, Santhosh},
2518
+ booktitle = "Proceedings of the 22nd Annual Conference of the European Association for Machine Translation",
2519
+ month = nov,
2520
+ year = "2020",
2521
+ address = "Lisboa, Portugal",
2522
+ publisher = "European Association for Machine Translation",
2523
+ url = "https://aclanthology.org/2020.eamt-1.61",
2524
+ pages = "479--480",
2525
+ }
2526
+
2527
+ @inproceedings{tiedemann-2020-tatoeba,
2528
+ title = "The Tatoeba Translation Challenge {--} Realistic Data Sets for Low Resource and Multilingual {MT}",
2529
+ author = {Tiedemann, J{\"o}rg},
2530
+ booktitle = "Proceedings of the Fifth Conference on Machine Translation",
2531
+ month = nov,
2532
+ year = "2020",
2533
+ address = "Online",
2534
+ publisher = "Association for Computational Linguistics",
2535
+ url = "https://aclanthology.org/2020.wmt-1.139",
2536
+ pages = "1174--1182",
2537
+ }
2538
+ ```
2539
+
2540
+ ## Acknowledgements
2541
+
2542
+ The work is supported by the [HPLT project](https://hplt-project.org/), funded by the European Union’s Horizon Europe research and innovation programme under grant agreement No 101070350. We are also grateful for the generous computational resources and IT infrastructure provided by [CSC -- IT Center for Science](https://www.csc.fi/), Finland, and the [EuroHPC supercomputer LUMI](https://www.lumi-supercomputer.eu/).
2543
+
2544
+ ## Model conversion info
2545
+
2546
+ * transformers version: 4.45.1
2547
+ * OPUS-MT git hash: ae3fcbd
2548
+ * port time: Mon Oct 7 15:10:35 EEST 2024
2549
+ * port machine: LM0-400-22516.local
benchmark_results.txt ADDED
@@ -0,0 +1,224 @@
1
+ multi-multi tatoeba-test-v2020-07-28-v2023-09-26 0.47850 30.9 6293 37606
2
+ kam-fra flores101-devtest 0.24283 4.7 1012 28343
3
+ lin-deu flores101-devtest 0.33421 7.7 1012 25094
4
+ lin-eng flores101-devtest 0.40858 16.9 1012 24721
5
+ lin-spa flores101-devtest 0.33503 9.0 1012 29199
6
+ lug-deu flores101-devtest 0.29673 6.0 1012 25094
7
+ lug-eng flores101-devtest 0.34463 12.5 1012 24721
8
+ lug-fra flores101-devtest 0.30565 8.1 1012 28343
9
+ lug-por flores101-devtest 0.30191 8.0 1012 26519
10
+ nso-deu flores101-devtest 0.36478 9.7 1012 25094
11
+ nso-eng flores101-devtest 0.49866 26.5 1012 24721
12
+ nso-fra flores101-devtest 0.39204 14.5 1012 28343
13
+ nya-por flores101-devtest 0.35924 11.4 1012 26519
14
+ nya-spa flores101-devtest 0.33426 8.7 1012 29199
15
+ sna-fra flores101-devtest 0.40134 14.3 1012 28343
16
+ sna-por flores101-devtest 0.37537 12.5 1012 26519
17
+ sna-spa flores101-devtest 0.34542 9.4 1012 29199
18
+ swh-deu flores101-devtest 0.43073 14.2 1012 25094
19
+ umb-fra flores101-devtest 0.24142 3.7 1012 28343
20
+ umb-por flores101-devtest 0.25350 4.4 1012 26519
21
+ umb-spa flores101-devtest 0.24756 3.6 1012 29199
22
+ xho-spa flores101-devtest 0.36714 10.8 1012 29199
23
+ zul-fra flores101-devtest 0.43723 17.4 1012 28343
24
+ zul-por flores101-devtest 0.41886 15.9 1012 26519
25
+ bem-deu flores200-devtest 0.34227 7.6 1012 25094
26
+ bem-eng flores200-devtest 0.42350 18.1 1012 24721
27
+ bem-fra flores200-devtest 0.36976 12.4 1012 28343
28
+ bem-por flores200-devtest 0.36443 11.3 1012 26519
29
+ bem-spa flores200-devtest 0.34016 9.3 1012 29199
30
+ cjk-deu flores200-devtest 0.23961 2.4 1012 25094
31
+ cjk-eng flores200-devtest 0.25448 4.6 1012 24721
32
+ cjk-fra flores200-devtest 0.24006 3.9 1012 28343
33
+ cjk-por flores200-devtest 0.25810 4.3 1012 26519
34
+ cjk-spa flores200-devtest 0.25387 3.8 1012 29199
35
+ kam-deu flores200-devtest 0.25599 3.1 1012 25094
36
+ kam-eng flores200-devtest 0.27510 7.0 1012 24721
37
+ kam-fra flores200-devtest 0.25343 5.2 1012 28343
38
+ kam-por flores200-devtest 0.25602 5.3 1012 26519
39
+ kam-spa flores200-devtest 0.25510 4.6 1012 29199
40
+ kik-deu flores200-devtest 0.32252 6.2 1012 25094
41
+ kik-eng flores200-devtest 0.38501 14.2 1012 24721
42
+ kik-fra flores200-devtest 0.34427 10.3 1012 28343
43
+ kik-por flores200-devtest 0.34146 9.4 1012 26519
44
+ kik-spa flores200-devtest 0.32062 7.6 1012 29199
45
+ kin-deu flores200-devtest 0.36157 9.7 1012 25094
46
+ kin-eng flores200-devtest 0.46183 21.9 1012 24721
47
+ kin-fra flores200-devtest 0.40139 14.7 1012 28343
48
+ kin-por flores200-devtest 0.38408 13.7 1012 26519
49
+ kin-spa flores200-devtest 0.35592 10.5 1012 29199
50
+ kmb-deu flores200-devtest 0.26310 3.2 1012 25094
51
+ kmb-eng flores200-devtest 0.27999 6.2 1012 24721
52
+ kmb-fra flores200-devtest 0.26298 5.6 1012 28343
53
+ kmb-por flores200-devtest 0.27417 5.8 1012 26519
54
+ kmb-spa flores200-devtest 0.26802 4.9 1012 29199
55
+ kon-deu flores200-devtest 0.32094 6.6 1012 25094
56
+ kon-eng flores200-devtest 0.37260 14.0 1012 24721
57
+ kon-fra flores200-devtest 0.35258 11.1 1012 28343
58
+ kon-por flores200-devtest 0.34380 10.7 1012 26519
59
+ kon-spa flores200-devtest 0.31985 8.2 1012 29199
60
+ lin-deu flores200-devtest 0.34496 8.4 1012 25094
61
+ lin-eng flores200-devtest 0.42073 18.1 1012 24721
62
+ lin-fra flores200-devtest 0.39759 14.8 1012 28343
63
+ lin-por flores200-devtest 0.37600 12.9 1012 26519
64
+ lin-spa flores200-devtest 0.34459 9.8 1012 29199
65
+ lua-deu flores200-devtest 0.26601 3.2 1012 25094
66
+ lua-eng flores200-devtest 0.29489 7.7 1012 24721
67
+ lua-fra flores200-devtest 0.27005 5.5 1012 28343
68
+ lua-por flores200-devtest 0.27062 5.6 1012 26519
69
+ lua-spa flores200-devtest 0.27050 5.0 1012 29199
70
+ lug-deu flores200-devtest 0.30557 6.3 1012 25094
71
+ lug-eng flores200-devtest 0.35746 13.5 1012 24721
72
+ lug-fra flores200-devtest 0.32168 9.3 1012 28343
73
+ lug-por flores200-devtest 0.31717 8.7 1012 26519
74
+ lug-spa flores200-devtest 0.29897 7.0 1012 29199
75
+ nso-deu flores200-devtest 0.38059 10.7 1012 25094
76
+ nso-eng flores200-devtest 0.51453 28.4 1012 24721
77
+ nso-fra flores200-devtest 0.41065 16.1 1012 28343
78
+ nso-por flores200-devtest 0.38374 14.1 1012 26519
79
+ nso-spa flores200-devtest 0.35022 10.3 1012 29199
80
+ nya-deu flores200-devtest 0.35468 8.6 1012 25094
81
+ nya-eng flores200-devtest 0.44398 20.2 1012 24721
82
+ nya-fra flores200-devtest 0.39327 14.0 1012 28343
83
+ nya-por flores200-devtest 0.37373 12.6 1012 26519
84
+ nya-spa flores200-devtest 0.34416 9.5 1012 29199
85
+ run-deu flores200-devtest 0.35728 9.3 1012 25094
86
+ run-eng flores200-devtest 0.42987 18.9 1012 24721
87
+ run-fra flores200-devtest 0.39369 14.6 1012 28343
88
+ run-por flores200-devtest 0.38111 13.4 1012 26519
89
+ run-spa flores200-devtest 0.35229 10.3 1012 29199
90
+ sna-deu flores200-devtest 0.36522 9.5 1012 25094
91
+ sna-eng flores200-devtest 0.45917 21.1 1012 24721
92
+ sna-fra flores200-devtest 0.41153 15.2 1012 28343
93
+ sna-por flores200-devtest 0.38950 13.5 1012 26519
94
+ sna-spa flores200-devtest 0.35823 10.3 1012 29199
95
+ sot-deu flores200-devtest 0.38311 10.7 1012 25094
96
+ sot-eng flores200-devtest 0.51854 26.9 1012 24721
97
+ sot-fra flores200-devtest 0.41340 15.8 1012 28343
98
+ sot-por flores200-devtest 0.39058 14.6 1012 26519
99
+ sot-spa flores200-devtest 0.35245 10.4 1012 29199
100
+ ssw-deu flores200-devtest 0.35599 8.8 1012 25094
101
+ ssw-eng flores200-devtest 0.44925 20.7 1012 24721
102
+ ssw-fra flores200-devtest 0.39386 14.0 1012 28343
103
+ ssw-por flores200-devtest 0.38240 13.4 1012 26519
104
+ ssw-spa flores200-devtest 0.33998 9.2 1012 29199
105
+ swh-deu flores200-devtest 0.44937 15.6 1012 25094
106
+ swh-eng flores200-devtest 0.60107 37.0 1012 24721
107
+ swh-fra flores200-devtest 0.50257 23.5 1012 28343
108
+ swh-por flores200-devtest 0.49475 22.8 1012 26519
109
+ swh-spa flores200-devtest 0.42866 15.3 1012 29199
110
+ tsn-deu flores200-devtest 0.36377 9.5 1012 25094
111
+ tsn-eng flores200-devtest 0.45365 19.9 1012 24721
112
+ tsn-fra flores200-devtest 0.39321 14.1 1012 28343
113
+ tsn-por flores200-devtest 0.38354 13.1 1012 26519
114
+ tsn-spa flores200-devtest 0.35183 10.4 1012 29199
115
+ tso-deu flores200-devtest 0.35502 9.1 1012 25094
116
+ tso-eng flores200-devtest 0.46882 22.8 1012 24721
117
+ tso-fra flores200-devtest 0.39409 14.2 1012 28343
118
+ tso-por flores200-devtest 0.37640 13.2 1012 26519
119
+ tso-spa flores200-devtest 0.34034 9.5 1012 29199
120
+ tum-deu flores200-devtest 0.29994 4.1 1012 25094
121
+ tum-eng flores200-devtest 0.34380 9.5 1012 24721
122
+ tum-fra flores200-devtest 0.31358 7.1 1012 28343
123
+ tum-por flores200-devtest 0.30909 6.9 1012 26519
124
+ tum-spa flores200-devtest 0.30108 6.2 1012 29199
125
+ umb-deu flores200-devtest 0.25337 2.9 1012 25094
126
+ umb-eng flores200-devtest 0.26682 5.2 1012 24721
127
+ umb-fra flores200-devtest 0.25858 4.6 1012 28343
128
+ umb-por flores200-devtest 0.27016 5.1 1012 26519
129
+ umb-spa flores200-devtest 0.25987 4.3 1012 29199
130
+ xho-deu flores200-devtest 0.39620 11.9 1012 25094
131
+ xho-eng flores200-devtest 0.52500 28.8 1012 24721
132
+ xho-fra flores200-devtest 0.44642 18.7 1012 28343
133
+ xho-por flores200-devtest 0.42517 16.8 1012 26519
134
+ xho-spa flores200-devtest 0.38096 12.0 1012 29199
135
+ zul-deu flores200-devtest 0.39854 11.8 1012 25094
136
+ zul-eng flores200-devtest 0.53428 29.5 1012 24721
137
+ zul-fra flores200-devtest 0.45383 19.0 1012 28343
138
+ zul-por flores200-devtest 0.43537 17.4 1012 26519
139
+ zul-spa flores200-devtest 0.37807 11.7 1012 29199
140
+ bem-deu ntrex128 0.34524 7.8 1997 48761
141
+ bem-eng ntrex128 0.43168 19.1 1997 47673
142
+ bem-fra ntrex128 0.36401 11.4 1997 53481
143
+ bem-por ntrex128 0.36288 11.6 1997 51631
144
+ bem-spa ntrex128 0.38219 13.7 1997 54107
145
+ kin-deu ntrex128 0.36926 9.2 1997 48761
146
+ kin-eng ntrex128 0.46996 20.8 1997 47673
147
+ kin-fra ntrex128 0.40765 14.7 1997 53481
148
+ kin-por ntrex128 0.38834 13.1 1997 51631
149
+ kin-spa ntrex128 0.41552 15.9 1997 54107
150
+ nde-deu ntrex128 0.35313 8.0 1997 48761
151
+ nde-eng ntrex128 0.42744 17.1 1997 47673
152
+ nde-fra ntrex128 0.36837 11.1 1997 53481
153
+ nde-por ntrex128 0.36769 11.0 1997 51631
154
+ nde-spa ntrex128 0.37924 13.0 1997 54107
155
+ nso-deu ntrex128 0.37072 9.4 1997 48761
156
+ nso-eng ntrex128 0.47231 21.5 1997 47673
157
+ nso-fra ntrex128 0.39206 13.6 1997 53481
158
+ nso-por ntrex128 0.37912 12.3 1997 51631
159
+ nso-spa ntrex128 0.40135 15.2 1997 54107
160
+ nya-deu ntrex128 0.37303 10.0 1997 48761
161
+ nya-eng ntrex128 0.47072 23.3 1997 47673
162
+ nya-fra ntrex128 0.39866 14.1 1997 53481
163
+ nya-por ntrex128 0.38960 13.7 1997 51631
164
+ nya-spa ntrex128 0.41006 16.2 1997 54107
165
+ ssw-deu ntrex128 0.37436 9.6 1997 48761
166
+ ssw-eng ntrex128 0.48682 23.5 1997 47673
167
+ ssw-fra ntrex128 0.39351 13.7 1997 53481
168
+ ssw-por ntrex128 0.39220 13.4 1997 51631
169
+ ssw-spa ntrex128 0.40839 15.9 1997 54107
170
+ swa-deu ntrex128 0.43880 14.1 1997 48761
171
+ swa-eng ntrex128 0.58527 35.4 1997 47673
172
+ swa-fra ntrex128 0.47344 19.7 1997 53481
173
+ swa-por ntrex128 0.46292 19.1 1997 51631
174
+ swa-spa ntrex128 0.48780 22.9 1997 54107
175
+ tsn-deu ntrex128 0.38791 10.4 1997 48761
176
+ tsn-eng ntrex128 0.50413 25.3 1997 47673
177
+ tsn-fra ntrex128 0.41912 15.8 1997 53481
178
+ tsn-por ntrex128 0.41090 15.3 1997 51631
179
+ tsn-spa ntrex128 0.42979 17.7 1997 54107
180
+ ven-deu ntrex128 0.34586 7.5 1997 48761
181
+ ven-eng ntrex128 0.43364 18.4 1997 47673
182
+ ven-fra ntrex128 0.37138 12.0 1997 53481
183
+ ven-por ntrex128 0.36506 11.2 1997 51631
184
+ ven-spa ntrex128 0.38029 13.1 1997 54107
185
+ xho-deu ntrex128 0.38470 10.5 1997 48761
186
+ xho-eng ntrex128 0.50778 26.5 1997 47673
187
+ xho-fra ntrex128 0.41066 15.1 1997 53481
188
+ xho-por ntrex128 0.39822 14.0 1997 51631
189
+ xho-spa ntrex128 0.42129 16.7 1997 54107
190
+ zul-deu ntrex128 0.38155 10.9 1997 48761
191
+ zul-eng ntrex128 0.50361 26.9 1997 47673
192
+ zul-fra ntrex128 0.40779 15.0 1997 53481
193
+ zul-por ntrex128 0.39932 14.5 1997 51631
194
+ zul-spa ntrex128 0.41836 16.9 1997 54107
195
+ run-fra tatoeba-test-v2020-07-28 0.45463 26.1 1278 7496
196
+ swa-eng tatoeba-test-v2020-07-28 0.60548 46.5 386 2499
197
+ run-fra tatoeba-test-v2021-03-30 0.45463 26.1 1278 7496
198
+ run-spa tatoeba-test-v2021-03-30 0.45472 25.1 968 5198
199
+ xho-eng tatoeba-test-v2021-03-30 0.56987 41.7 222 1467
200
+ run-deu tatoeba-test-v2021-08-07 0.43836 26.1 1752 10562
201
+ run-eng tatoeba-test-v2021-08-07 0.54089 39.4 1703 10041
202
+ run-fra tatoeba-test-v2021-08-07 0.46240 26.1 1274 7479
203
+ run-spa tatoeba-test-v2021-08-07 0.46496 25.8 963 5167
204
+ swa-eng tatoeba-test-v2021-08-07 0.59947 45.9 387 2508
205
+ kin-eng tico19-test 0.42280 18.8 2100 56323
206
+ kin-fra tico19-test 0.36808 13.8 2100 64661
207
+ kin-por tico19-test 0.37819 14.6 2100 62729
208
+ kin-spa tico19-test 0.39391 15.9 2100 66563
209
+ lin-eng tico19-test 0.41495 18.4 2100 56323
210
+ lin-fra tico19-test 0.36297 13.5 2100 64661
211
+ lin-por tico19-test 0.36515 13.3 2100 62729
212
+ lin-spa tico19-test 0.37607 14.4 2100 66563
213
+ lug-eng tico19-test 0.43948 22.2 2100 56323
214
+ lug-fra tico19-test 0.36537 13.7 2100 64661
215
+ lug-por tico19-test 0.38018 14.6 2100 62729
216
+ lug-spa tico19-test 0.38861 15.5 2100 66563
217
+ swa-eng tico19-test 0.58126 34.5 2100 56315
218
+ swa-fra tico19-test 0.46470 20.5 2100 64661
219
+ swa-por tico19-test 0.49374 22.8 2100 62729
220
+ swa-spa tico19-test 0.50214 24.5 2100 66563
221
+ zul-eng tico19-test 0.55678 32.2 2100 56804
222
+ zul-fra tico19-test 0.43797 18.6 2100 64661
223
+ zul-por tico19-test 0.45560 19.9 2100 62729
224
+ zul-spa tico19-test 0.46505 21.4 2100 66563
benchmark_translations.zip ADDED
File without changes
config.json ADDED
@@ -0,0 +1,41 @@
1
+ {
2
+ "_name_or_path": "pytorch-models/opus-mt-tc-bible-big-bnt-deu_eng_fra_por_spa",
3
+ "activation_dropout": 0.0,
4
+ "activation_function": "relu",
5
+ "architectures": [
6
+ "MarianMTModel"
7
+ ],
8
+ "attention_dropout": 0.0,
9
+ "bos_token_id": 0,
10
+ "classifier_dropout": 0.0,
11
+ "d_model": 1024,
12
+ "decoder_attention_heads": 16,
13
+ "decoder_ffn_dim": 4096,
14
+ "decoder_layerdrop": 0.0,
15
+ "decoder_layers": 6,
16
+ "decoder_start_token_id": 61611,
17
+ "decoder_vocab_size": 61612,
18
+ "dropout": 0.1,
19
+ "encoder_attention_heads": 16,
20
+ "encoder_ffn_dim": 4096,
21
+ "encoder_layerdrop": 0.0,
22
+ "encoder_layers": 6,
23
+ "eos_token_id": 332,
24
+ "forced_eos_token_id": null,
25
+ "init_std": 0.02,
26
+ "is_encoder_decoder": true,
27
+ "max_length": null,
28
+ "max_position_embeddings": 1024,
29
+ "model_type": "marian",
30
+ "normalize_embedding": false,
31
+ "num_beams": null,
32
+ "num_hidden_layers": 6,
33
+ "pad_token_id": 61611,
34
+ "scale_embedding": true,
35
+ "share_encoder_decoder_embeddings": true,
36
+ "static_position_embeddings": true,
37
+ "torch_dtype": "float32",
38
+ "transformers_version": "4.45.1",
39
+ "use_cache": true,
40
+ "vocab_size": 61612
41
+ }
generation_config.json ADDED
@@ -0,0 +1,16 @@
1
+ {
2
+ "_from_model_config": true,
3
+ "bad_words_ids": [
4
+ [
5
+ 61611
6
+ ]
7
+ ],
8
+ "bos_token_id": 0,
9
+ "decoder_start_token_id": 61611,
10
+ "eos_token_id": 332,
11
+ "forced_eos_token_id": 332,
12
+ "max_length": 512,
13
+ "num_beams": 4,
14
+ "pad_token_id": 61611,
15
+ "transformers_version": "4.45.1"
16
+ }
model.safetensors ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:d7b2a5cf0bb495004726f9a521043c4f0215c523a1044e5f051428980518f786
3
+ size 958068320
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:884c221077d2a6b709e421cb3db27788e18528815b94dbfe1858badd90f98f15
3
+ size 958119557
source.spm ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:525c410bf925e5f32c6f7782c8a78005fa4bf151230d7d7044dde8e9d52c7aef
3
+ size 767476
special_tokens_map.json ADDED
@@ -0,0 +1 @@
1
+ {"eos_token": "</s>", "unk_token": "<unk>", "pad_token": "<pad>"}
target.spm ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:3967c580a86b33624698c54b4cba559acb207da8d9623613f61429af6b394bf9
3
+ size 814366
tokenizer_config.json ADDED
@@ -0,0 +1 @@
1
+ {"source_lang": "bnt", "target_lang": "deu+eng+fra+por+spa", "unk_token": "<unk>", "eos_token": "</s>", "pad_token": "<pad>", "model_max_length": 512, "sp_model_kwargs": {}, "separate_vocabs": false, "special_tokens_map_file": null, "name_or_path": "marian-models/opusTCv20230926max50+bt+jhubc_transformer-big_2024-05-30/bnt-deu+eng+fra+por+spa", "tokenizer_class": "MarianTokenizer"}
vocab.json ADDED
The diff for this file is too large to render. See raw diff