tags:
- medical
---

[Technoculture/MD7b-alpha](https://huggingface.co/Technoculture/MD7b-alpha) adapter merged with its Base Model (Meditron 7B)
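Adapter merging folds the low-rank update into the frozen base weights: W' = W + (α/r)·B·A. A minimal numpy sketch of that operation (shapes and values are illustrative, not the actual Meditron weights):

```python
import numpy as np

# LoRA-style merge: W' = W + (alpha / r) * B @ A
# Shapes and values are illustrative, not the real model's.
rng = np.random.default_rng(0)
d_out, d_in, r, alpha = 8, 8, 2, 4

W = rng.normal(size=(d_out, d_in))   # frozen base weight
A = rng.normal(size=(r, d_in))       # low-rank factor A
B = rng.normal(size=(d_out, r))      # low-rank factor B

W_merged = W + (alpha / r) * (B @ A)  # fold the adapter in

# After merging, the single weight matrix reproduces base + adapter output.
x = rng.normal(size=(d_in,))
assert np.allclose(W_merged @ x, W @ x + (alpha / r) * (B @ (A @ x)))
```

This is roughly what peft's `merge_and_unload()` performs layer by layer when producing a merged checkpoint like this one.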

# Evaluations

## Open LLM Leaderboard

| Model | ARC |HellaSwag| MMLU |TruthfulQA|Winogrande|GSM8K|
|---------------------------------------------------|----:|--------:|--------------------------|---------:|---------:|----:|
|[MT7Bi](https://huggingface.co/Technoculture/MT7Bi)|50.94| 73.24|Error: File does not exist| 43.04| 72.06|22.52|

### ARC: 50.94%
| Task |Version| Metric | Value | |Stderr|
|-------------|-------|--------------------|-------------|---|------|
|arc_challenge|Yaml |acc,none | 0.48| | |
| | |acc_norm_stderr,none| 0.01| | |
| | |alias |arc_challenge| | |
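The reported standard errors can be sanity-checked: for an accuracy p estimated over n questions, the binomial standard error is sqrt(p(1−p)/n). A quick check (n = 1172 is an assumed ARC-Challenge test-split size, not a figure stated in this README):

```python
import math

# Binomial standard error of an accuracy estimate: sqrt(p * (1 - p) / n).
# p = 0.48 comes from the arc_challenge table above; n = 1172 is an
# assumed test-split size, not stated in this README.
p, n = 0.48, 1172
stderr = math.sqrt(p * (1 - p) / n)
print(round(stderr, 2))  # → 0.01, consistent with the stderr reported above
```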

### HellaSwag: 73.24%
| Task |Version| Metric | Value | |Stderr|
|---------|-------|--------------------|---------|---|------|
|hellaswag|Yaml |acc,none | 0.54| | |
| | |acc_norm_stderr,none| 0| | |
| | |alias |hellaswag| | |

### TruthfulQA: 43.04%
| Task |Version| Metric | Value | |Stderr|
|--------------|-------|-----------------------|-----------------|---|------|
|truthfulqa |N/A |bleu_max,none | 16.17| | |
| | |acc_stderr,none | 0.01| | |
| | |alias | - truthfulqa_mc2| | |

### Winogrande: 72.06%
| Task |Version| Metric | Value | |Stderr|
|----------|-------|---------------|----------|---|------|
|winogrande|Yaml |acc,none | 0.72| | |
| | |acc_stderr,none| 0.01| | |
| | |alias |winogrande| | |

### GSM8K: 22.52%
|Task |Version| Metric |Value| |Stderr|
|-----|-------|-----------------------------|-----|---|------|
|gsm8k|Yaml |exact_match,get-answer | 0.23| | |
| | |exact_match_stderr,get-answer| 0.01| | |
| | |alias |gsm8k| | |

Elapsed time: 03:56:55
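The per-task tables use lm-evaluation-harness metric names (`acc,none`, `exact_match,get-answer`), so a run along these lines should reproduce them. This is a sketch assuming the harness's v0.4 Python API (`pip install lm-eval`), not a command taken from this repo; it downloads the model and takes hours, as the elapsed time above suggests.

```python
import lm_eval

# Evaluate the merged model on the Open LLM Leaderboard task set.
# batch_size is an arbitrary choice; tune it to your GPU memory.
results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=Technoculture/MT7Bi",
    tasks=["arc_challenge", "hellaswag", "mmlu",
           "truthfulqa", "winogrande", "gsm8k"],
    batch_size=8,
)
print(results["results"]["arc_challenge"]["acc,none"])
```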