---
task_categories:
- question-answering
language:
- ms
tags:
- knowledge
pretty_name: MalayMMLU
size_categories:
- 10K<n<100K
---
# MalayMMLU

Released on 27 September 2024

<h4 align="center">
<p>
<a href="https://huggingface.co/datasets/UMxYTLAILabs/MalayMMLU">English</a> |
<b><a href="https://huggingface.co/datasets/UMxYTLAILabs/MalayMMLU/blob/main/README_ms.md">Bahasa Melayu</a></b>
</p>
<p align="center" style="display: flex; flex-direction: row; justify-content: center; align-items: center">
📄 <a href="https://openreview.net/pdf?id=VAXwQqkp5e" target="_blank" style="margin-right: 15px; margin-left: 10px">Paper</a> •
🤗 <a href="https://huggingface.co/datasets/UMxYTLAILabs/MalayMMLU" target="_blank" style="margin-left: 10px">Dataset</a>
</p>
</h4>


## Introduction

MalayMMLU is the first Massive Multitask Language Understanding (MMLU) benchmark for the Malay language. The benchmark comprises 24,213 questions spanning the primary (Year 1–6) and secondary (Form 1–5) education levels in Malaysia, organized into 5 main topics divided into 22 subjects.

<p align="center">
<img src="imgs/MalayMMLU.png" width="250" >
</p>

| **Topic** | **Subjects** |
|----------------|----------------|
| **STEM** | Computer Science (Secondary), Biology (Secondary), Chemistry (Secondary), Computer Literacy (Secondary), Mathematics (Primary, Secondary), Additional Mathematics (Secondary), Design and Technology (Primary, Secondary), Core Science (Primary, Secondary), Information and Communication Technology (Primary), Automotive Technology (Secondary) |
| **Language** | Malay Language (Primary, Secondary) |
| **Social Science** | Geography (Secondary), Local Studies (Primary), History (Primary, Secondary) |
| **Others** | Life Skills (Primary, Secondary), Principles of Accounting (Secondary), Economics (Secondary), Business (Secondary), Agriculture (Secondary) |
| **Humanities** | Quran and Sunnah Education (Secondary), Islamic Education (Primary, Secondary), Sports Science Knowledge (Secondary) |

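Each benchmark item is a multiple-choice question with an answer key. As a rough illustration of how such a record could be rendered into a zero-shot prompt, here is a minimal sketch; the field names (`question`, `options`, `answer`) and the prompt template are assumptions for illustration, not the dataset's documented schema or the paper's official prompt.

```python
# Sketch: format one MalayMMLU-style multiple-choice record as a zero-shot
# prompt. Field names and template are illustrative assumptions.

def format_prompt(record):
    """Render a multiple-choice record as a zero-shot prompt string."""
    letters = "ABCDEFGH"
    lines = [record["question"]]
    for letter, option in zip(letters, record["options"]):
        lines.append(f"{letter}. {option}")
    lines.append("Jawapan:")  # "Answer:" in Malay
    return "\n".join(lines)

# Hypothetical example record in the assumed format.
sample = {
    "question": "Apakah ibu negara Malaysia?",
    "options": ["Kuala Lumpur", "Putrajaya", "Johor Bahru", "Ipoh"],
    "answer": "A",
}

print(format_prompt(sample))
```

A model is then asked to complete the prompt, and only its first generated token is compared against the gold option letter (see the first-token accuracy note under the results table).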
## Results

#### Zero-shot evaluation results on MalayMMLU (first-token accuracy)

<table>
<thead>
<tr>
<th rowspan="2">Organization</th>
<th rowspan="2">Model</th>
<th rowspan="2">Visual</th>
<th colspan="6">Accuracy</th>
</tr>
<tr>
<th>Language</th><th>Humanities</th><th>STEM</th><th>Social Science</th><th>Others</th><th>Average</th>
</tr>
</thead>
<tbody>
<tr><td></td><td>Random</td><td></td><td>38.01</td><td>42.09</td><td>36.31</td><td>36.01</td><td>38.07</td><td>38.02</td></tr>
<tr><td rowspan="4">OpenAI</td><td>GPT-4o</td><td style="color: green;">✔</td><td><strong>87.12</strong></td><td><strong>88.12</strong></td><td><strong>83.83</strong></td><td><strong>82.58</strong></td><td><strong>83.09</strong></td><td><strong>84.98</strong></td></tr>
<tr><td>GPT-4</td><td style="color: green;">✔</td><td><ins>82.90</ins></td><td><ins>83.91</ins></td><td>78.80</td><td><ins>77.29</ins></td><td><ins>77.33</ins></td><td><ins>80.11</ins></td></tr>
<tr><td>GPT-4o mini</td><td style="color: green;">✔</td><td>82.03</td><td>81.50</td><td>78.51</td><td>75.67</td><td>76.30</td><td>78.78</td></tr>
<tr><td>GPT-3.5</td><td></td><td>69.62</td><td>71.01</td><td>67.17</td><td>66.70</td><td>63.73</td><td>67.78</td></tr>
<tr><td rowspan="7">Meta</td><td>LLaMA-3.1 (70B)</td><td></td><td>78.75</td><td>82.59</td><td>78.96</td><td>77.20</td><td>75.32</td><td>78.44</td></tr>
<tr><td>LLaMA-3.1 (8B)</td><td></td><td>65.47</td><td>67.17</td><td>64.10</td><td>62.59</td><td>62.13</td><td>64.24</td></tr>
<tr><td>LLaMA-3 (8B)</td><td></td><td>63.93</td><td>66.21</td><td>62.26</td><td>62.97</td><td>61.38</td><td>63.46</td></tr>
<tr><td>LLaMA-2 (13B)</td><td></td><td>45.58</td><td>50.72</td><td>44.13</td><td>44.55</td><td>40.87</td><td>45.26</td></tr>
<tr><td>LLaMA-2 (7B)</td><td></td><td>47.47</td><td>52.74</td><td>48.71</td><td>50.72</td><td>48.19</td><td>49.61</td></tr>
<tr><td>LLaMA-3.2 (3B)</td><td></td><td>58.52</td><td>60.66</td><td>56.65</td><td>54.06</td><td>52.75</td><td>56.45</td></tr>
<tr><td>LLaMA-3.2 (1B)</td><td></td><td>38.88</td><td>43.30</td><td>40.65</td><td>40.56</td><td>39.55</td><td>40.46</td></tr>
<tr><td rowspan="8">Qwen (Alibaba)</td><td>Qwen 2.5 (72B)</td><td></td><td>79.09</td><td>79.95</td><td><ins>80.88</ins></td><td>75.80</td><td>75.05</td><td>77.79</td></tr>
<tr><td>Qwen-2.5 (32B)</td><td></td><td>76.96</td><td>76.70</td><td>79.74</td><td>72.35</td><td>70.88</td><td>74.83</td></tr>
<tr><td>Qwen-2-VL (7B)</td><td style="color: green;">✔</td><td>68.16</td><td>63.62</td><td>67.58</td><td>60.38</td><td>59.08</td><td>63.49</td></tr>
<tr><td>Qwen-2-VL (2B)</td><td style="color: green;">✔</td><td>58.22</td><td>55.56</td><td>57.51</td><td>53.67</td><td>55.10</td><td>55.83</td></tr>
<tr><td>Qwen-1.5 (14B)</td><td></td><td>64.47</td><td>60.64</td><td>61.97</td><td>57.66</td><td>58.05</td><td>60.47</td></tr>
<tr><td>Qwen-1.5 (7B)</td><td></td><td>60.13</td><td>59.14</td><td>58.62</td><td>54.26</td><td>54.67</td><td>57.18</td></tr>
<tr><td>Qwen-1.5 (4B)</td><td></td><td>48.39</td><td>52.01</td><td>51.37</td><td>50.00</td><td>49.10</td><td>49.93</td></tr>
<tr><td>Qwen-1.5 (1.8B)</td><td></td><td>42.70</td><td>43.37</td><td>43.68</td><td>43.12</td><td>44.42</td><td>43.34</td></tr>
<tr><td rowspan="5">Zhipu</td><td>GLM-4-Plus</td><td></td><td>78.04</td><td>75.63</td><td>77.49</td><td>74.07</td><td>72.66</td><td>75.48</td></tr>
<tr><td>GLM-4-Air</td><td></td><td>67.88</td><td>69.56</td><td>70.20</td><td>66.06</td><td>66.18</td><td>67.60</td></tr>
<tr><td>GLM-4-Flash</td><td></td><td>63.52</td><td>65.69</td><td>66.31</td><td>63.21</td><td>63.59</td><td>64.12</td></tr>
<tr><td>GLM-4</td><td></td><td>63.39</td><td>56.72</td><td>54.40</td><td>57.24</td><td>55.00</td><td>58.07</td></tr>
<tr><td>GLM-4<sup>††</sup> (9B)</td><td></td><td>58.51</td><td>60.48</td><td>56.32</td><td>55.04</td><td>53.97</td><td>56.87</td></tr>
<tr><td rowspan="3">Google</td><td>Gemma-2 (9B)</td><td></td><td>75.83</td><td>72.83</td><td>75.07</td><td>69.72</td><td>70.33</td><td>72.51</td></tr>
<tr><td>Gemma (7B)</td><td></td><td>45.53</td><td>50.92</td><td>46.13</td><td>47.33</td><td>46.27</td><td>47.21</td></tr>
<tr><td>Gemma (2B)</td><td></td><td>46.50</td><td>51.15</td><td>49.20</td><td>48.06</td><td>48.79</td><td>48.46</td></tr>
<tr><td rowspan="2">SAIL (Sea)</td><td>Sailor<sup>†</sup> (14B)</td><td></td><td>78.40</td><td>72.88</td><td>69.63</td><td>69.47</td><td>68.67</td><td>72.29</td></tr>
<tr><td>Sailor<sup>†</sup> (7B)</td><td></td><td>74.54</td><td>68.62</td><td>62.79</td><td>64.69</td><td>63.61</td><td>67.58</td></tr>
<tr><td>Cohere for AI</td><td>Command R (32B)</td><td></td><td>71.68</td><td>71.49</td><td>66.68</td><td>67.19</td><td>63.64</td><td>68.47</td></tr>
<tr><td>OpenGVLab</td><td>InternVL2 (40B)</td><td style="color: green;">✔</td><td>70.36</td><td>68.49</td><td>64.88</td><td>65.93</td><td>60.54</td><td>66.51</td></tr>
<tr><td>Damo (Alibaba)</td><td>SeaLLM-v2.5<sup>†</sup> (7B)</td><td></td><td>69.75</td><td>67.94</td><td>65.29</td><td>62.66</td><td>63.61</td><td>65.89</td></tr>
<tr><td rowspan="4">Mistral</td><td>Pixtral (12B)</td><td style="color: green;">✔</td><td>64.81</td><td>62.68</td><td>64.72</td><td>63.93</td><td>59.49</td><td>63.25</td></tr>
<tr><td>Mistral Small (22B)</td><td></td><td>65.19</td><td>65.03</td><td>63.36</td><td>61.58</td><td>59.99</td><td>63.05</td></tr>
<tr><td>Mistral-v0.3 (7B)</td><td></td><td>56.97</td><td>59.29</td><td>57.14</td><td>58.28</td><td>56.56</td><td>57.71</td></tr>
<tr><td>Mistral-v0.2 (7B)</td><td></td><td>56.23</td><td>59.86</td><td>57.10</td><td>56.65</td><td>55.22</td><td>56.92</td></tr>
<tr><td rowspan="2">Microsoft</td><td>Phi-3 (14B)</td><td></td><td>60.07</td><td>58.89</td><td>60.91</td><td>58.73</td><td>55.24</td><td>58.72</td></tr>
<tr><td>Phi-3 (3.8B)</td><td></td><td>52.24</td><td>55.52</td><td>54.81</td><td>53.70</td><td>51.74</td><td>53.43</td></tr>
<tr><td>01.AI</td><td>Yi-1.5 (9B)</td><td></td><td>56.20</td><td>53.36</td><td>57.47</td><td>50.53</td><td>49.75</td><td>53.08</td></tr>
<tr><td rowspan="2">Stability AI</td><td>StableLM 2 (12B)</td><td></td><td>53.40</td><td>54.84</td><td>51.45</td><td>51.79</td><td>50.16</td><td>52.45</td></tr>
<tr><td>StableLM 2 (1.6B)</td><td></td><td>43.92</td><td>51.10</td><td>45.27</td><td>46.14</td><td>46.75</td><td>46.48</td></tr>
<tr><td>Baichuan</td><td>Baichuan-2 (7B)</td><td></td><td>40.41</td><td>47.35</td><td>44.37</td><td>46.33</td><td>43.54</td><td>44.30</td></tr>
<tr><td>Mesolitica</td><td>MaLLaM-v2<sup>†</sup> (5B)</td><td></td><td>42.57</td><td>46.44</td><td>42.24</td><td>40.82</td><td>38.74</td><td>42.08</td></tr>
<tr><td>Yellow.ai</td><td>Komodo<sup>†</sup> (7B)</td><td></td><td>43.62</td><td>45.53</td><td>39.34</td><td>39.75</td><td>39.48</td><td>41.72</td></tr>
</tbody>
</table>
The highest score is <strong>bolded</strong> and the second highest is <ins>underlined</ins>.
† indicates an LLM trained on Southeast Asian datasets.
†† indicates the open-source GLM-4.
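The table reports first-token accuracy: a model's answer counts as correct when the first token it generates matches the letter of the gold option. A minimal sketch of that scoring rule on made-up predictions (not the benchmark's official evaluation code):

```python
# Sketch of first-token accuracy: a prediction is correct when its first
# character matches the gold option letter. Data below is illustrative only.

def first_token_accuracy(predictions, golds):
    """Percentage of examples whose first predicted token equals the gold letter."""
    assert len(predictions) == len(golds)
    correct = sum(
        pred.strip()[:1].upper() == gold.upper()
        for pred, gold in zip(predictions, golds)
    )
    return 100.0 * correct / len(golds)

# Hypothetical model outputs and gold labels for four questions.
preds = ["A. Kuala Lumpur", "b", "C", "D. Ipoh"]
golds = ["A", "B", "D", "D"]

print(f"{first_token_accuracy(preds, golds):.2f}")  # 3 of 4 correct -> 75.00
```

Scoring only the first token avoids parsing free-form continuations, which is why the random baseline row sits near the per-question chance level rather than at zero.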


## Citation

```bibtex
@InProceedings{MalayMMLU2024,
  author    = {Poh, Soon Chang and Yang, Sze Jue and Tan, Jeraelyn Ming Li and Chieng, Lawrence Leroy Tze Yao and Tan, Jia Xuan and Yu, Zhenyu and Foong, Chee Mun and Chan, Chee Seng},
  title     = {MalayMMLU: A Multitask Benchmark for the Low-Resource Malay Language},
  booktitle = {Findings of the Association for Computational Linguistics: EMNLP 2024},
  month     = {November},
  year      = {2024},
}
```

## Feedback
Suggestions and opinions (both positive and negative) are welcome. Please reach out by emailing `cs.chan at um.edu.my`.