Commit cdbd008
Parent(s): 2e95049

Adding Evaluation Results (#1)

Adding Evaluation Results (7891f7e9f82300705fcb84921ff6a878534c957a)

Co-authored-by: Open LLM Leaderboard PR Bot <[email protected]>
README.md CHANGED

@@ -194,4 +194,17 @@ Quantized models:
 * [TRURL 13b - 8bit](https://huggingface.co/Voicelab/trurl-2-13b-8bit/)
 * [TRURL 7b - 8bit](https://huggingface.co/Voicelab/trurl-2-7b-8bit/)
 
-The work was supported by [#NASK](https://www.nask.pl/)
+The work was supported by [#NASK](https://www.nask.pl/)
+
+# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
+Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_Voicelab__trurl-2-13b-academic)
+
+| Metric                | Value |
+|-----------------------|-------|
+| Avg.                  | 52.7  |
+| ARC (25-shot)         | 57.94 |
+| HellaSwag (10-shot)   | 79.55 |
+| MMLU (5-shot)         | 55.2  |
+| TruthfulQA (0-shot)   | 43.46 |
+| Winogrande (5-shot)   | 76.56 |
+| GSM8K (5-shot)        | 10.92 |
+| DROP (3-shot)         | 45.28 |
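The Avg. row added by this commit is consistent with the unweighted arithmetic mean of the seven benchmark scores; a minimal sketch checking that (the variable names are illustrative, not part of the leaderboard tooling):

```python
# Benchmark scores copied from the table added in this commit
# (Open LLM Leaderboard results for Voicelab/trurl-2-13b-academic).
scores = {
    "ARC (25-shot)": 57.94,
    "HellaSwag (10-shot)": 79.55,
    "MMLU (5-shot)": 55.2,
    "TruthfulQA (0-shot)": 43.46,
    "Winogrande (5-shot)": 76.56,
    "GSM8K (5-shot)": 10.92,
    "DROP (3-shot)": 45.28,
}

# Unweighted mean over all seven benchmarks, rounded as in the table.
avg = round(sum(scores.values()) / len(scores), 2)
print(avg)  # 52.7
```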