---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- mlabonne/Marcoro14-7B-slerp
- mlabonne/NeuralBeagle14-7B
---

![1702099743576398.png](https://cdn-uploads.huggingface.co/production/uploads/65d1f383351255ba48a4f831/qkD9fnyeOVlo5eQod1EOH.png)

# Maverick-7B

This model is a merge of the following models, created with mergekit:
* [mlabonne/Marcoro14-7B-slerp](https://huggingface.co./mlabonne/Marcoro14-7B-slerp)
* [mlabonne/NeuralBeagle14-7B](https://huggingface.co./mlabonne/NeuralBeagle14-7B)


## 🏆 Evaluation

### TruthfulQA

|    Task     |Version|Metric|Value |   |Stderr|
|-------------|------:|------|-----:|---|-----:|
|truthfulqa_mc|      1|mc1   |0.5165|±  |0.0175|
|             |       |mc2   |0.6661|±  |0.0152|

### GPT4All

|    Task     |Version| Metric |Value |   |Stderr|
|-------------|------:|--------|-----:|---|-----:|
|arc_challenge|      0|acc     |0.6442|±  |0.0140|
|             |       |acc_norm|0.6570|±  |0.0139|
|arc_easy     |      0|acc     |0.8645|±  |0.0070|
|             |       |acc_norm|0.8304|±  |0.0077|
|boolq        |      1|acc     |0.8850|±  |0.0056|
|hellaswag    |      0|acc     |0.6813|±  |0.0047|
|             |       |acc_norm|0.8571|±  |0.0035|
|openbookqa   |      0|acc     |0.3640|±  |0.0215|
|             |       |acc_norm|0.4800|±  |0.0224|
|piqa         |      0|acc     |0.8324|±  |0.0087|
|             |       |acc_norm|0.8460|±  |0.0084|
|winogrande   |      0|acc     |0.7869|±  |0.0115|
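
## 💻 Usage

A minimal sketch of running the merged model with 🤗 Transformers. The repository id `your-username/Maverick-7B` is a placeholder assumption (replace it with the actual repo id for this model):

```python
from transformers import AutoTokenizer, pipeline
import torch

# Hypothetical repository id -- replace with the actual repo id of this model.
model_id = "your-username/Maverick-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# Build a chat-style prompt using the tokenizer's chat template.
messages = [{"role": "user", "content": "What is a model merge?"}]
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

# Text-generation pipeline; float16 keeps the 7B model within typical GPU memory.
generator = pipeline(
    "text-generation",
    model=model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

outputs = generator(
    prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_p=0.95
)
print(outputs[0]["generated_text"])
```

Sampling parameters (`temperature`, `top_p`) are illustrative defaults; tune them for your use case.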