soonchang committed
Commit b5d7641 • 1 Parent(s): 08384ad

Update README.md

Files changed (1)
  1. README.md +501 -23
README.md CHANGED
@@ -40,30 +40,508 @@ MalayMMLU is the first multitask language understanding (MLU) for Malay Language
40
 
41
  #### Zero-shot results of LLMs on MalayMMLU (First token accuracy)
42
 
43
- | **Model** | **Language** | **Humanities** | **STEM** | **Social Science** | **Others** | **Average** |
44
- |-------------------------|-------------------|---------------------|---------------|-------------------------|-----------------|------------------|
45
- | Random | 38.01 | 42.09 | 36.31 | 36.01 | 38.07 | 38.02 |
46
- | GPT-4 | **82.90** | **83.91** | **78.80** | **77.29** | **77.33** | **80.11** |
47
- | GPT-3.5 | 69.62 | 71.01 | 67.17 | 66.70 | 63.73 | 67.78 |
48
- | [LLaMA-3 (8B)](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct) | 63.93 | 66.21 | 62.26 | 62.97 | 61.38 | 63.46 |
49
- | [LLaMA-2 (13B)](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf) | 45.58 | 50.72 | 44.13 | 44.55 | 40.87 | 45.26 |
50
- | [LLaMA-2 (7B)](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf) | 47.47 | 52.74 | 48.71 | 50.72 | 48.19 | 49.61 |
51
- | [Mistral-v0.3 (7B)](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.3) | 56.97 | 59.29 | 57.14 | 58.28 | 56.56 | 57.71 |
52
- | [Mistral-v0.2 (7B)](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2) | 56.23 | 59.86 | 57.10 | 56.65 | 55.22 | 56.92 |
53
- | [Sailor (7B)](https://huggingface.co/sail/Sailor-7B-Chat) | 74.54 | 68.62 | 62.79 | 64.69 | 63.61 | 67.58 |
54
- | [SeaLLM-v2.5 (7B)](https://huggingface.co/SeaLLMs/SeaLLM-7B-v2.5) | 69.75 | 67.94 | 65.29 | 62.66 | 63.61 | 65.89 |
55
- | [Phi-3 (14B)](https://huggingface.co/microsoft/Phi-3-medium-4k-instruct) | 60.07 | 58.89 | 60.91 | 58.73 | 55.24 | 58.72 |
56
- | [Phi-3 (3.8B)](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) | 52.24 | 55.52 | 54.81 | 53.70 | 51.74 | 53.43 |
57
- | [GLM-4 (9B)](https://huggingface.co/THUDM/glm-4-9b-chat) | 58.51 | 60.48 | 56.32 | 55.04 | 53.97 | 56.87 |
58
- | [Qwen-1.5 (7B)](https://huggingface.co/Qwen/Qwen1.5-7B-Chat) | 60.13 | 59.14 | 58.62 | 54.26 | 54.67 | 57.18 |
59
- | [Qwen-1.5 (4B)](https://huggingface.co/Qwen/Qwen1.5-4B-Chat) | 48.39 | 52.01 | 51.37 | 50.00 | 49.10 | 49.93 |
60
- | [Qwen-1.5 (1.8B)](https://huggingface.co/Qwen/Qwen1.5-1.8B-Chat) | 42.70 | 43.37 | 43.68 | 43.12 | 44.42 | 43.34 |
61
- | [Gemma (7B)](https://huggingface.co/google/gemma-7b-it) | 45.53 | 50.92 | 46.13 | 47.33 | 46.27 | 47.21 |
62
- | [Gemma (2B)](https://huggingface.co/google/gemma-2b-it) | 46.50 | 51.15 | 49.20 | 48.06 | 48.79 | 48.46 |
63
- | [Baichuan-2 (7B)](https://huggingface.co/baichuan-inc/Baichuan2-7B-Chat) | 40.41 | 47.35 | 44.37 | 46.33 | 43.54 | 44.30 |
64
- | [Komodo (7B)](https://huggingface.co/Yellow-AI-NLP/komodo-7b-base) | 43.62 | 45.53 | 39.34 | 39.75 | 39.48 | 41.72 |
65
- | [MaLLaM-v2 (5B)](https://huggingface.co/mesolitica/mallam-5b-20k-instructions-v2)| 42.56 | 46.42 | 42.16 | 40.81 | 38.81 | 42.07 |
66

67
 
68
 
69
  ## Citation
 
40
 
41
  #### Zero-shot results of LLMs on MalayMMLU (First token accuracy)
42
 
43
+ | **Category** | **Subjects** |
44
+ |----------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
45
+ | **STEM** | Computer Science (Secondary), Biology (Secondary), Chemistry (Secondary), Computer Literacy (Secondary), Mathematics (Primary, Secondary), Additional Mathematics (Secondary), Design and Technology (Primary, Secondary), Core Science (Primary, Secondary), Information and Communication Technology (Primary), Automotive Technology (Secondary) |
46
+ | **Language** | Malay Language (Primary, Secondary) |
47
+ | **Social Science** | Geography (Secondary), Local Studies (Primary), History (Primary, Secondary) |
48
+ | **Others** | Life Skills (Primary, Secondary), Principles of Accounting (Secondary), Economics (Secondary), Business (Secondary), Agriculture (Secondary) |
49
+ | **Humanities** | Quran and Sunnah (Secondary), Islam (Primary, Secondary), Sports Science Knowledge (Secondary) |
50
+
51
+ ## Result
52
+
53
+ #### Zero-shot results of LLMs on MalayMMLU (First token accuracy)
54
 
55
+ <table>
56
+ <thead>
57
+ <tr>
58
+ <th rowspan="2">Organization</th>
59
+ <th rowspan="2">Model</th>
60
+ <th rowspan="2">Vision</th>
61
+ <th colspan="7">Acc.</th>
62
+ </tr>
63
+ <tr>
64
+ <th>Language</th>
65
+ <th>Humanities</th>
66
+ <th>STEM</th>
67
+ <th>Social Science</th>
68
+ <th>Others</th>
69
+ <th>Average</th>
70
+ </tr>
71
+ </thead>
72
+ <tbody>
73
+ <tr>
74
+ <td></td>
75
+ <td>Random</td>
76
+ <td></td>
77
+ <td>38.01</td>
78
+ <td>42.09</td>
79
+ <td>36.31</td>
80
+ <td>36.01</td>
81
+ <td>38.07</td>
82
+ <td>38.02</td>
83
+ </tr>
84
+ <tr>
85
+ <td rowspan="4">OpenAI</td>
86
+ <td>GPT-4o</td>
87
+ <td style="color: green;">βœ”</td>
88
+ <td><strong>87.12</strong></td>
89
+ <td><strong>88.12</strong></td>
90
+ <td><strong>83.83</strong></td>
91
+ <td><strong>82.58</strong></td>
92
+ <td><strong>83.09</strong></td>
93
+ <td><strong>84.98</strong></td>
94
+ </tr>
95
+ <tr>
96
+ <td>GPT-4</td>
97
+ <td style="color: green;">βœ”</td>
98
+ <td><ins>82.90</ins></td>
99
+ <td><ins>83.91</ins></td>
100
+ <td>78.80</td>
101
+ <td><ins>77.29</ins></td>
102
+ <td><ins>77.33</ins></td>
103
+ <td><ins>80.11</ins></td>
104
+ </tr>
105
+ <tr>
106
+ <td>GPT-4o mini</td>
107
+ <td style="color: green;">βœ”</td>
108
+ <td>82.03</td>
109
+ <td>81.50</td>
110
+ <td>78.51</td>
111
+ <td>75.67</td>
112
+ <td>76.30</td>
113
+ <td>78.78</td>
114
+ </tr>
115
+ <tr>
116
+ <td>GPT-3.5</td>
117
+ <td></td>
118
+ <td>69.62</td>
119
+ <td>71.01</td>
120
+ <td>67.17</td>
121
+ <td>66.70</td>
122
+ <td>63.73</td>
123
+ <td>67.78</td>
124
+ </tr>
125
+ <tr>
126
+ <td rowspan="7">Meta</td>
127
+ <td>LLaMA-3.1 (70B)</td>
128
+ <td></td>
129
+ <td>78.75</td>
130
+ <td>82.59</td>
131
+ <td>78.96</td>
132
+ <td>77.20</td>
133
+ <td>75.32</td>
134
+ <td>78.44</td>
135
+ </tr>
136
+ <tr>
137
+ <td>LLaMA-3.1 (8B)</td>
138
+ <td></td>
139
+ <td>65.47</td>
140
+ <td>67.17</td>
141
+ <td>64.10</td>
142
+ <td>62.59</td>
143
+ <td>62.13</td>
144
+ <td>64.24</td>
145
+ </tr>
146
+ <tr>
147
+ <td>LLaMA-3 (8B)</td>
148
+ <td></td>
149
+ <td>63.93</td>
150
+ <td>66.21</td>
151
+ <td>62.26</td>
152
+ <td>62.97</td>
153
+ <td>61.38</td>
154
+ <td>63.46</td>
155
+ </tr>
156
+ <tr>
157
+ <td>LLaMA-2 (13B)</td>
158
+ <td></td>
159
+ <td>45.58</td>
160
+ <td>50.72</td>
161
+ <td>44.13</td>
162
+ <td>44.55</td>
163
+ <td>40.87</td>
164
+ <td>45.26</td>
165
+ </tr>
166
+ <tr>
167
+ <td>LLaMA-2 (7B)</td>
168
+ <td></td>
169
+ <td>47.47</td>
170
+ <td>52.74</td>
171
+ <td>48.71</td>
172
+ <td>50.72</td>
173
+ <td>48.19</td>
174
+ <td>49.61</td>
175
+ </tr>
176
+ <tr>
177
+ <td>LLaMA-3.2 (3B)</td>
178
+ <td></td>
179
+ <td>58.52</td>
180
+ <td>60.66</td>
181
+ <td>56.65</td>
182
+ <td>54.06</td>
183
+ <td>52.75</td>
184
+ <td>56.45</td>
185
+ </tr>
186
+ <tr>
187
+ <td>LLaMA-3.2 (1B)</td>
188
+ <td></td>
189
+ <td>38.88</td>
190
+ <td>43.30</td>
191
+ <td>40.65</td>
192
+ <td>40.56</td>
193
+ <td>39.55</td>
194
+ <td>40.46</td>
195
+ </tr>
196
+ <tr>
197
+ <td rowspan="8">Qwen (Alibaba)</td>
198
+ <td>Qwen-2.5 (72B)</td>
199
+ <td></td>
200
+ <td>79.09</td>
201
+ <td>79.95</td>
202
+ <td><ins>80.88</ins></td>
203
+ <td>75.80</td>
204
+ <td>75.05</td>
205
+ <td>77.79</td>
206
+ </tr>
207
+ <tr>
208
+ <td>Qwen-2.5 (32B)</td>
209
+ <td></td>
210
+ <td>76.96</td>
211
+ <td>76.70</td>
212
+ <td>79.74</td>
213
+ <td>72.35</td>
214
+ <td>70.88</td>
215
+ <td>74.83</td>
216
+ </tr>
217
+ <tr>
218
+ <td>Qwen-2-VL (7B)</td>
219
+ <td style="color: green;">βœ”</td>
220
+ <td>68.16</td>
221
+ <td>63.62</td>
222
+ <td>67.58</td>
223
+ <td>60.38</td>
224
+ <td>59.08</td>
225
+ <td>63.49</td>
226
+ </tr>
227
+ <tr>
228
+ <td>Qwen-2-VL (2B)</td>
229
+ <td style="color: green;">βœ”</td>
230
+ <td>58.22</td>
231
+ <td>55.56</td>
232
+ <td>57.51</td>
233
+ <td>53.67</td>
234
+ <td>55.10</td>
235
+ <td>55.83</td>
236
+ </tr>
237
+ <tr>
238
+ <td>Qwen-1.5 (14B)</td>
239
+ <td></td>
240
+ <td>64.47</td>
241
+ <td>60.64</td>
242
+ <td>61.97</td>
243
+ <td>57.66</td>
244
+ <td>58.05</td>
245
+ <td>60.47</td>
246
+ </tr>
247
+ <tr>
248
+ <td>Qwen-1.5 (7B)</td>
249
+ <td></td>
250
+ <td>60.13</td>
251
+ <td>59.14</td>
252
+ <td>58.62</td>
253
+ <td>54.26</td>
254
+ <td>54.67</td>
255
+ <td>57.18</td>
256
+ </tr>
257
+ <tr>
258
+ <td>Qwen-1.5 (4B)</td>
259
+ <td></td>
260
+ <td>48.39</td>
261
+ <td>52.01</td>
262
+ <td>51.37</td>
263
+ <td>50.00</td>
264
+ <td>49.10</td>
265
+ <td>49.93</td>
266
+ </tr>
267
+ <tr>
268
+ <td>Qwen-1.5 (1.8B)</td>
269
+ <td></td>
270
+ <td>42.70</td>
271
+ <td>43.37</td>
272
+ <td>43.68</td>
273
+ <td>43.12</td>
274
+ <td>44.42</td>
275
+ <td>43.34</td>
276
+ </tr>
277
+ <tr>
278
+ <td rowspan="5">Zhipu</td>
279
+ <td>GLM-4-Plus</td>
280
+ <td></td>
281
+ <td>78.04</td>
282
+ <td>75.63</td>
283
+ <td>77.49</td>
284
+ <td>74.07</td>
285
+ <td>72.66</td>
286
+ <td>75.48</td>
287
+ </tr>
288
+ <tr>
289
+ <td>GLM-4-Air</td>
290
+ <td></td>
291
+ <td>67.88</td>
292
+ <td>69.56</td>
293
+ <td>70.20</td>
294
+ <td>66.06</td>
295
+ <td>66.18</td>
296
+ <td>67.60</td>
297
+ </tr>
298
+ <tr>
299
+ <td>GLM-4-Flash</td>
300
+ <td></td>
301
+ <td>63.52</td>
302
+ <td>65.69</td>
303
+ <td>66.31</td>
304
+ <td>63.21</td>
305
+ <td>63.59</td>
306
+ <td>64.12</td>
307
+ </tr>
308
+ <tr>
309
+ <td>GLM-4</td>
310
+ <td></td>
311
+ <td>63.39</td>
312
+ <td>56.72</td>
313
+ <td>54.40</td>
314
+ <td>57.24</td>
315
+ <td>55.00</td>
316
+ <td>58.07</td>
317
+ </tr>
318
+ <tr>
319
+ <td>GLM-4<sup>††</sup> (9B)</td>
320
+ <td></td>
321
+ <td>58.51</td>
322
+ <td>60.48</td>
323
+ <td>56.32</td>
324
+ <td>55.04</td>
325
+ <td>53.97</td>
326
+ <td>56.87</td>
327
+ </tr>
328
+ <tr>
329
+ <td rowspan="3">Google</td>
330
+ <td>Gemma-2 (9B)</td>
331
+ <td></td>
332
+ <td>75.83</td>
333
+ <td>72.83</td>
334
+ <td>75.07</td>
335
+ <td>69.72</td>
336
+ <td>70.33</td>
337
+ <td>72.51</td>
338
+ </tr>
339
+ <tr>
340
+ <td>Gemma (7B)</td>
341
+ <td></td>
342
+ <td>45.53</td>
343
+ <td>50.92</td>
344
+ <td>46.13</td>
345
+ <td>47.33</td>
346
+ <td>46.27</td>
347
+ <td>47.21</td>
348
+ </tr>
349
+ <tr>
350
+ <td>Gemma (2B)</td>
351
+ <td></td>
352
+ <td>46.50</td>
353
+ <td>51.15</td>
354
+ <td>49.20</td>
355
+ <td>48.06</td>
356
+ <td>48.79</td>
357
+ <td>48.46</td>
358
+ </tr>
359
+ <tr>
360
+ <td rowspan="2">SAIL (Sea)</td>
361
+ <td>Sailor<sup>†</sup> (14B)</td>
362
+ <td></td>
363
+ <td>78.40</td>
364
+ <td>72.88</td>
365
+ <td>69.63</td>
366
+ <td>69.47</td>
367
+ <td>68.67</td>
368
+ <td>72.29</td>
369
+ </tr>
370
+ <tr>
371
+ <td>Sailor<sup>†</sup> (7B)</td>
372
+ <td></td>
373
+ <td>74.54</td>
374
+ <td>68.62</td>
375
+ <td>62.79</td>
376
+ <td>64.69</td>
377
+ <td>63.61</td>
378
+ <td>67.58</td>
379
+ </tr>
380
+ <tr>
381
+ <td>Cohere for AI</td>
382
+ <td>Command R (32B)</td>
383
+ <td></td>
384
+ <td>71.68</td>
385
+ <td>71.49</td>
386
+ <td>66.68</td>
387
+ <td>67.19</td>
388
+ <td>63.64</td>
389
+ <td>68.47</td>
390
+ </tr>
391
+ <tr>
392
+ <td>OpenGVLab</td>
393
+ <td>InternVL2 (40B)</td>
394
+ <td style="color: green;">βœ”</td>
395
+ <td>70.36</td>
396
+ <td>68.49</td>
397
+ <td>64.88</td>
398
+ <td>65.93</td>
399
+ <td>60.54</td>
400
+ <td>66.51</td>
401
+ </tr>
402
+ <tr>
403
+ <td>Damo (Alibaba)</td>
404
+ <td>SeaLLM-v2.5<sup>†</sup> (7B)</td>
405
+ <td></td>
406
+ <td>69.75</td>
407
+ <td>67.94</td>
408
+ <td>65.29</td>
409
+ <td>62.66</td>
410
+ <td>63.61</td>
411
+ <td>65.89</td>
412
+ </tr>
413
+ <tr>
414
+ <td rowspan="4">Mistral</td>
415
+ <td>Pixtral (12B)</td>
416
+ <td style="color: green;">βœ”</td>
417
+ <td>64.81</td>
418
+ <td>62.68</td>
419
+ <td>64.72</td>
420
+ <td>63.93</td>
421
+ <td>59.49</td>
422
+ <td>63.25</td>
423
+ </tr>
424
+ <tr>
425
+ <td>Mistral Small (22B)</td>
426
+ <td></td>
427
+ <td>65.19</td>
428
+ <td>65.03</td>
429
+ <td>63.36</td>
430
+ <td>61.58</td>
431
+ <td>59.99</td>
432
+ <td>63.05</td>
433
+ </tr>
434
+ <tr>
435
+ <td>Mistral-v0.3 (7B)</td>
436
+ <td></td>
437
+ <td>56.97</td>
438
+ <td>59.29</td>
439
+ <td>57.14</td>
440
+ <td>58.28</td>
441
+ <td>56.56</td>
442
+ <td>57.71</td>
443
+ </tr>
444
+ <tr>
445
+ <td>Mistral-v0.2 (7B)</td>
446
+ <td></td>
447
+ <td>56.23</td>
448
+ <td>59.86</td>
449
+ <td>57.10</td>
450
+ <td>56.65</td>
451
+ <td>55.22</td>
452
+ <td>56.92</td>
453
+ </tr>
454
+ <tr>
455
+ <td rowspan="2">Microsoft</td>
456
+ <td>Phi-3 (14B)</td>
457
+ <td></td>
458
+ <td>60.07</td>
459
+ <td>58.89</td>
460
+ <td>60.91</td>
461
+ <td>58.73</td>
462
+ <td>55.24</td>
463
+ <td>58.72</td>
464
+ </tr>
465
+ <tr>
466
+ <td>Phi-3 (3.8B)</td>
467
+ <td></td>
468
+ <td>52.24</td>
469
+ <td>55.52</td>
470
+ <td>54.81</td>
471
+ <td>53.70</td>
472
+ <td>51.74</td>
473
+ <td>53.43</td>
474
+ </tr>
475
+ <tr>
476
+ <td>01.AI</td>
477
+ <td>Yi-1.5 (9B)</td>
478
+ <td></td>
479
+ <td>56.20</td>
480
+ <td>53.36</td>
481
+ <td>57.47</td>
482
+ <td>50.53</td>
483
+ <td>49.75</td>
484
+ <td>53.08</td>
485
+ </tr>
486
+ <tr>
487
+ <td rowspan="2">Stability AI</td>
488
+ <td>StableLM 2 (12B)</td>
489
+ <td></td>
490
+ <td>53.40</td>
491
+ <td>54.84</td>
492
+ <td>51.45</td>
493
+ <td>51.79</td>
494
+ <td>50.16</td>
495
+ <td>52.45</td>
496
+ </tr>
497
+ <tr>
498
+ <td>StableLM 2 (1.6B)</td>
499
+ <td></td>
500
+ <td>43.92</td>
501
+ <td>51.10</td>
502
+ <td>45.27</td>
503
+ <td>46.14</td>
504
+ <td>46.75</td>
505
+ <td>46.48</td>
506
+ </tr>
507
+ <tr>
508
+ <td>Baichuan</td>
509
+ <td>Baichuan-2 (7B)</td>
510
+ <td></td>
511
+ <td>40.41</td>
512
+ <td>47.35</td>
513
+ <td>44.37</td>
514
+ <td>46.33</td>
515
+ <td>43.54</td>
516
+ <td>44.30</td>
517
+ </tr>
518
+ <tr>
519
+ <td>Mesolitica</td>
520
+ <td>MaLLaM-v2<sup>†</sup> (5B)</td>
521
+ <td></td>
522
+ <td>42.57</td>
523
+ <td>46.44</td>
524
+ <td>42.24</td>
525
+ <td>40.82</td>
526
+ <td>38.74</td>
527
+ <td>42.08</td>
528
+ </tr>
529
+ <tr>
530
+ <td>Yellow.ai</td>
531
+ <td>Komodo<sup>†</sup> (7B)</td>
532
+ <td></td>
533
+ <td>43.62</td>
534
+ <td>45.53</td>
535
+ <td>39.34</td>
536
+ <td>39.75</td>
537
+ <td>39.48</td>
538
+ <td>41.72</td>
539
+ </tr>
540
+ </tbody>
541
+ </table>
542
+ Highest scores are <strong>bolded</strong> and second-highest scores are <ins>underlined</ins>.
543
+ † denotes LLMs fine-tuned on Southeast Asian datasets.
544
+ †† denotes the open-source GLM-4.
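First-token accuracy scores each multiple-choice question by the option label to which the model assigns the highest probability for its first generated token, rather than by parsing the full generated answer. Below is a minimal sketch of this scoring with Hugging Face `transformers`; the model name, prompt template, and option labels are illustrative placeholders, not the official MalayMMLU evaluation code.

```python
# Illustrative sketch of first-token accuracy (placeholder model and prompt format).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Meta-Llama-3-8B-Instruct"  # placeholder; any causal LM works
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.float16, device_map="auto"
)

def predict_first_token(question: str, options: list[str],
                        labels=("A", "B", "C", "D", "E")) -> str:
    """Return the option label with the highest logit for the first generated token."""
    labels = labels[: len(options)]
    prompt = (question + "\n"
              + "\n".join(f"{l}. {o}" for l, o in zip(labels, options))
              + "\nJawapan:")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    with torch.no_grad():
        next_token_logits = model(**inputs).logits[0, -1]  # logits for the next token
    # Compare the logit of each label's first sub-token and pick the largest.
    label_ids = [tokenizer.encode(" " + l, add_special_tokens=False)[0] for l in labels]
    return labels[int(torch.argmax(next_token_logits[label_ids]))]

# First-token accuracy = fraction of questions where predict_first_token(...) equals the gold label.
```

Because only the logits of the first answer token are compared, this metric does not depend on how verbose the model's full generated answer would be.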
545
 
546
 
547
  ## Citation