---
license: apache-2.0
tags:
  - merge
  - mergekit
  - lazymergekit
  - mlabonne/Marcoro14-7B-slerp
  - mlabonne/NeuralBeagle14-7B
---

![Maverick-7B](1702099593364787.png)

# Maverick-7B

Maverick-7B is a merge of the following models:

- [mlabonne/Marcoro14-7B-slerp](https://huggingface.co/mlabonne/Marcoro14-7B-slerp)
- [mlabonne/NeuralBeagle14-7B](https://huggingface.co/mlabonne/NeuralBeagle14-7B)
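A minimal usage sketch with 🤗 Transformers (the repo id `feeltheAGI/Maverick-7B` and the generation settings are assumptions based on this card, not a confirmed recipe):

```python
# Minimal sketch: load the merged model as a standard causal LM.
# Assumes the usual Mistral-7B-style architecture; adjust dtype/device as needed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "feeltheAGI/Maverick-7B"  # assumed repo id for this card

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Explain what a SLERP model merge is in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```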

## 🏆 Evaluation

### TruthfulQA

| Task          | Version | Metric | Value  | Stderr   |
|---------------|--------:|--------|-------:|----------|
| truthfulqa_mc |       1 | mc1    | 0.5165 | ± 0.0175 |
|               |         | mc2    | 0.6661 | ± 0.0152 |

### GPT4ALL

| Task          | Version | Metric   | Value  | Stderr   |
|---------------|--------:|----------|-------:|----------|
| arc_challenge |       0 | acc      | 0.6442 | ± 0.0140 |
|               |         | acc_norm | 0.6570 | ± 0.0139 |
| arc_easy      |       0 | acc      | 0.8645 | ± 0.0070 |
|               |         | acc_norm | 0.8304 | ± 0.0077 |
| boolq         |       1 | acc      | 0.8850 | ± 0.0056 |
| hellaswag     |       0 | acc      | 0.6813 | ± 0.0047 |
|               |         | acc_norm | 0.8571 | ± 0.0035 |
| openbookqa    |       0 | acc      | 0.3640 | ± 0.0215 |
|               |         | acc_norm | 0.4800 | ± 0.0224 |
| piqa          |       0 | acc      | 0.8324 | ± 0.0087 |
|               |         | acc_norm | 0.8460 | ± 0.0084 |
| winogrande    |       0 | acc      | 0.7869 | ± 0.0115 |

### AGIEval

| Task                           | Version | Metric   | Value  | Stderr   |
|--------------------------------|--------:|----------|-------:|----------|
| agieval_aqua_rat               |       0 | acc      | 0.2717 | ± 0.0280 |
|                                |         | acc_norm | 0.2559 | ± 0.0274 |
| agieval_logiqa_en              |       0 | acc      | 0.3902 | ± 0.0191 |
|                                |         | acc_norm | 0.3856 | ± 0.0191 |
| agieval_lsat_ar                |       0 | acc      | 0.2565 | ± 0.0289 |
|                                |         | acc_norm | 0.2478 | ± 0.0285 |
| agieval_lsat_lr                |       0 | acc      | 0.5118 | ± 0.0222 |
|                                |         | acc_norm | 0.5216 | ± 0.0221 |
| agieval_lsat_rc                |       0 | acc      | 0.6543 | ± 0.0291 |
|                                |         | acc_norm | 0.6506 | ± 0.0291 |
| agieval_sat_en                 |       0 | acc      | 0.7961 | ± 0.0281 |
|                                |         | acc_norm | 0.8010 | ± 0.0279 |
| agieval_sat_en_without_passage |       0 | acc      | 0.4660 | ± 0.0348 |
|                                |         | acc_norm | 0.4709 | ± 0.0349 |
| agieval_sat_math               |       0 | acc      | 0.3227 | ± 0.0316 |
|                                |         | acc_norm | 0.3045 | ± 0.0311 |
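The tables above follow the output format of EleutherAI's lm-evaluation-harness. A sketch of how comparable numbers could be reproduced with the harness's Python API; task names and defaults vary across harness versions, so treat this as an illustration rather than the exact setup used for this card:

```python
# Sketch only: assumes lm-evaluation-harness >= 0.4 installed (pip install lm-eval).
# Task names here are the v0.4 equivalents and may not match the fork
# that produced the tables above.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=feeltheAGI/Maverick-7B",
    tasks=["arc_challenge", "arc_easy", "boolq", "hellaswag",
           "openbookqa", "piqa", "winogrande"],
)
# Per-task acc / acc_norm values with stderr, as in the tables above.
print(results["results"])
```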