Files changed (1)
  1. README.md +392 -2
README.md CHANGED
@@ -10,10 +10,11 @@ tags:
---


+ ## Quantized mode

- ## Examples
-
+ Here is the list of GGUF models quantized from 2 to 8 bits: https://huggingface.co/MaziyarPanahi/Bioxtral-4x7B-v0.1-GGUF

+ ## Examples


A 23-year-old pregnant woman at 22 weeks gestation presents with burning upon urination. She states it started 1 day ago and has been worsening despite drinking more water and taking cranberry extract. She otherwise feels well and is followed by a doctor for her pregnancy. Her temperature is 97.7°F (36.5°C), blood pressure is 122/77 mmHg, pulse is 80/min, respirations are 19/min, and oxygen saturation is 98% on room air. Physical exam is notable for an absence of costovertebral angle tenderness and a gravid uterus. Which of the following is the best treatment for this patient?
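
As a quick reference alongside the GGUF link added above: a minimal sketch of loading one of those quants locally with the llama-cpp-python bindings. The quant file name, context size, and generation settings below are illustrative assumptions, not taken from the model card.

```python
# Minimal sketch, assuming llama-cpp-python is installed and a .gguf file from
# MaziyarPanahi/Bioxtral-4x7B-v0.1-GGUF has been downloaded locally.
# The file name and settings are illustrative, not taken from the model card.
from llama_cpp import Llama

llm = Llama(
    model_path="Bioxtral-4x7B-v0.1.Q4_K_M.gguf",  # assumed local quant file name
    n_ctx=4096,       # context window
    n_gpu_layers=-1,  # offload all layers to GPU when available
)

prompt = "What is the first-line treatment for a urinary tract infection at 22 weeks gestation?"
out = llm(prompt, max_tokens=256, temperature=0.7)
print(out["choices"][0]["text"])
```

Any of the 2- to 8-bit files from the linked GGUF repo can be substituted for the file name above.
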
@@ -134,4 +135,393 @@ Step 4: Solve the subtraction next. Subtracting 8 from 25 gives you 17.
Step 5: Finally, add 3 to the result of the previous step. Adding 3 to 17 gives you a final answer of 20.

So, 25 - 4 * 2 + 3 = 20.</s>
+ ```
+
+
+ ## Eval
+
+ source: https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Bioxtral-4x7B-v0.1
+
+ ```python
+ {
+ "all": {
+ "acc": 0.6390815384774987,
+ "acc_stderr": 0.03233527173865626,
+ "acc_norm": 0.6405373328568302,
+ "acc_norm_stderr": 0.032994557880045274,
+ "mc1": 0.5152998776009792,
+ "mc1_stderr": 0.017495304473187902,
+ "mc2": 0.6845419346695587,
+ "mc2_stderr": 0.014829461272743373
+ },
+ "harness|arc:challenge|25": {
+ "acc": 0.658703071672355,
+ "acc_stderr": 0.01385583128749772,
+ "acc_norm": 0.6834470989761092,
+ "acc_norm_stderr": 0.013592431519068079
+ },
+ "harness|hellaswag|10": {
+ "acc": 0.6946823341963753,
+ "acc_stderr": 0.004596006250433548,
+ "acc_norm": 0.8727345150368453,
+ "acc_norm_stderr": 0.003325890225529856
+ },
+ "harness|hendrycksTest-abstract_algebra|5": {
+ "acc": 0.31,
+ "acc_stderr": 0.04648231987117316,
+ "acc_norm": 0.31,
+ "acc_norm_stderr": 0.04648231987117316
+ },
+ "harness|hendrycksTest-anatomy|5": {
+ "acc": 0.6370370370370371,
+ "acc_stderr": 0.04153948404742397,
+ "acc_norm": 0.6370370370370371,
+ "acc_norm_stderr": 0.04153948404742397
+ },
+ "harness|hendrycksTest-astronomy|5": {
+ "acc": 0.7105263157894737,
+ "acc_stderr": 0.03690677986137283,
+ "acc_norm": 0.7105263157894737,
+ "acc_norm_stderr": 0.03690677986137283
+ },
+ "harness|hendrycksTest-business_ethics|5": {
+ "acc": 0.63,
+ "acc_stderr": 0.04852365870939099,
+ "acc_norm": 0.63,
+ "acc_norm_stderr": 0.04852365870939099
+ },
+ "harness|hendrycksTest-clinical_knowledge|5": {
+ "acc": 0.6943396226415094,
+ "acc_stderr": 0.028353298073322663,
+ "acc_norm": 0.6943396226415094,
+ "acc_norm_stderr": 0.028353298073322663
+ },
+ "harness|hendrycksTest-college_biology|5": {
+ "acc": 0.7222222222222222,
+ "acc_stderr": 0.037455547914624555,
+ "acc_norm": 0.7222222222222222,
+ "acc_norm_stderr": 0.037455547914624555
+ },
+ "harness|hendrycksTest-college_chemistry|5": {
+ "acc": 0.44,
+ "acc_stderr": 0.04988876515698589,
+ "acc_norm": 0.44,
+ "acc_norm_stderr": 0.04988876515698589
+ },
+ "harness|hendrycksTest-college_computer_science|5": {
+ "acc": 0.56,
+ "acc_stderr": 0.049888765156985884,
+ "acc_norm": 0.56,
+ "acc_norm_stderr": 0.049888765156985884
+ },
+ "harness|hendrycksTest-college_mathematics|5": {
+ "acc": 0.29,
+ "acc_stderr": 0.04560480215720684,
+ "acc_norm": 0.29,
+ "acc_norm_stderr": 0.04560480215720684
+ },
+ "harness|hendrycksTest-college_medicine|5": {
+ "acc": 0.6184971098265896,
+ "acc_stderr": 0.03703851193099521,
+ "acc_norm": 0.6184971098265896,
+ "acc_norm_stderr": 0.03703851193099521
+ },
+ "harness|hendrycksTest-college_physics|5": {
+ "acc": 0.43137254901960786,
+ "acc_stderr": 0.04928099597287534,
+ "acc_norm": 0.43137254901960786,
+ "acc_norm_stderr": 0.04928099597287534
+ },
+ "harness|hendrycksTest-computer_security|5": {
+ "acc": 0.78,
+ "acc_stderr": 0.041633319989322605,
+ "acc_norm": 0.78,
+ "acc_norm_stderr": 0.041633319989322605
+ },
+ "harness|hendrycksTest-conceptual_physics|5": {
+ "acc": 0.5829787234042553,
+ "acc_stderr": 0.03223276266711712,
+ "acc_norm": 0.5829787234042553,
+ "acc_norm_stderr": 0.03223276266711712
+ },
+ "harness|hendrycksTest-econometrics|5": {
+ "acc": 0.45614035087719296,
+ "acc_stderr": 0.04685473041907789,
+ "acc_norm": 0.45614035087719296,
+ "acc_norm_stderr": 0.04685473041907789
+ },
+ "harness|hendrycksTest-electrical_engineering|5": {
+ "acc": 0.5310344827586206,
+ "acc_stderr": 0.04158632762097828,
+ "acc_norm": 0.5310344827586206,
+ "acc_norm_stderr": 0.04158632762097828
+ },
+ "harness|hendrycksTest-elementary_mathematics|5": {
+ "acc": 0.42328042328042326,
+ "acc_stderr": 0.025446365634406786,
+ "acc_norm": 0.42328042328042326,
+ "acc_norm_stderr": 0.025446365634406786
+ },
+ "harness|hendrycksTest-formal_logic|5": {
+ "acc": 0.47619047619047616,
+ "acc_stderr": 0.04467062628403273,
+ "acc_norm": 0.47619047619047616,
+ "acc_norm_stderr": 0.04467062628403273
+ },
+ "harness|hendrycksTest-global_facts|5": {
+ "acc": 0.28,
+ "acc_stderr": 0.04512608598542128,
+ "acc_norm": 0.28,
+ "acc_norm_stderr": 0.04512608598542128
+ },
+ "harness|hendrycksTest-high_school_biology|5": {
+ "acc": 0.7516129032258064,
+ "acc_stderr": 0.024580028921481003,
+ "acc_norm": 0.7516129032258064,
+ "acc_norm_stderr": 0.024580028921481003
+ },
+ "harness|hendrycksTest-high_school_chemistry|5": {
+ "acc": 0.4975369458128079,
+ "acc_stderr": 0.03517945038691063,
+ "acc_norm": 0.4975369458128079,
+ "acc_norm_stderr": 0.03517945038691063
+ },
+ "harness|hendrycksTest-high_school_computer_science|5": {
+ "acc": 0.65,
+ "acc_stderr": 0.047937248544110196,
+ "acc_norm": 0.65,
+ "acc_norm_stderr": 0.047937248544110196
+ },
+ "harness|hendrycksTest-high_school_european_history|5": {
+ "acc": 0.7757575757575758,
+ "acc_stderr": 0.032568666616811015,
+ "acc_norm": 0.7757575757575758,
+ "acc_norm_stderr": 0.032568666616811015
+ },
+ "harness|hendrycksTest-high_school_geography|5": {
+ "acc": 0.7878787878787878,
+ "acc_stderr": 0.029126522834586815,
+ "acc_norm": 0.7878787878787878,
+ "acc_norm_stderr": 0.029126522834586815
+ },
+ "harness|hendrycksTest-high_school_government_and_politics|5": {
+ "acc": 0.8808290155440415,
+ "acc_stderr": 0.02338193534812142,
+ "acc_norm": 0.8808290155440415,
+ "acc_norm_stderr": 0.02338193534812142
+ },
+ "harness|hendrycksTest-high_school_macroeconomics|5": {
+ "acc": 0.6666666666666666,
+ "acc_stderr": 0.02390115797940254,
+ "acc_norm": 0.6666666666666666,
+ "acc_norm_stderr": 0.02390115797940254
+ },
+ "harness|hendrycksTest-high_school_mathematics|5": {
+ "acc": 0.3333333333333333,
+ "acc_stderr": 0.028742040903948485,
+ "acc_norm": 0.3333333333333333,
+ "acc_norm_stderr": 0.028742040903948485
+ },
+ "harness|hendrycksTest-high_school_microeconomics|5": {
+ "acc": 0.6890756302521008,
+ "acc_stderr": 0.030066761582977927,
+ "acc_norm": 0.6890756302521008,
+ "acc_norm_stderr": 0.030066761582977927
+ },
+ "harness|hendrycksTest-high_school_physics|5": {
+ "acc": 0.36423841059602646,
+ "acc_stderr": 0.03929111781242742,
+ "acc_norm": 0.36423841059602646,
+ "acc_norm_stderr": 0.03929111781242742
+ },
+ "harness|hendrycksTest-high_school_psychology|5": {
+ "acc": 0.8165137614678899,
+ "acc_stderr": 0.01659525971039931,
+ "acc_norm": 0.8165137614678899,
+ "acc_norm_stderr": 0.01659525971039931
+ },
+ "harness|hendrycksTest-high_school_statistics|5": {
+ "acc": 0.5,
+ "acc_stderr": 0.034099716973523674,
+ "acc_norm": 0.5,
+ "acc_norm_stderr": 0.034099716973523674
+ },
+ "harness|hendrycksTest-high_school_us_history|5": {
+ "acc": 0.803921568627451,
+ "acc_stderr": 0.027865942286639318,
+ "acc_norm": 0.803921568627451,
+ "acc_norm_stderr": 0.027865942286639318
+ },
+ "harness|hendrycksTest-high_school_world_history|5": {
+ "acc": 0.7932489451476793,
+ "acc_stderr": 0.02636165166838909,
+ "acc_norm": 0.7932489451476793,
+ "acc_norm_stderr": 0.02636165166838909
+ },
+ "harness|hendrycksTest-human_aging|5": {
+ "acc": 0.6816143497757847,
+ "acc_stderr": 0.03126580522513713,
+ "acc_norm": 0.6816143497757847,
+ "acc_norm_stderr": 0.03126580522513713
+ },
+ "harness|hendrycksTest-human_sexuality|5": {
+ "acc": 0.7480916030534351,
+ "acc_stderr": 0.03807387116306085,
+ "acc_norm": 0.7480916030534351,
+ "acc_norm_stderr": 0.03807387116306085
+ },
+ "harness|hendrycksTest-international_law|5": {
+ "acc": 0.7851239669421488,
+ "acc_stderr": 0.037494924487096966,
+ "acc_norm": 0.7851239669421488,
+ "acc_norm_stderr": 0.037494924487096966
+ },
+ "harness|hendrycksTest-jurisprudence|5": {
+ "acc": 0.7314814814814815,
+ "acc_stderr": 0.042844679680521934,
+ "acc_norm": 0.7314814814814815,
+ "acc_norm_stderr": 0.042844679680521934
+ },
+ "harness|hendrycksTest-logical_fallacies|5": {
+ "acc": 0.7484662576687117,
+ "acc_stderr": 0.03408997886857529,
+ "acc_norm": 0.7484662576687117,
+ "acc_norm_stderr": 0.03408997886857529
+ },
+ "harness|hendrycksTest-machine_learning|5": {
+ "acc": 0.44642857142857145,
+ "acc_stderr": 0.047184714852195886,
+ "acc_norm": 0.44642857142857145,
+ "acc_norm_stderr": 0.047184714852195886
+ },
+ "harness|hendrycksTest-management|5": {
+ "acc": 0.7669902912621359,
+ "acc_stderr": 0.04185832598928315,
+ "acc_norm": 0.7669902912621359,
+ "acc_norm_stderr": 0.04185832598928315
+ },
+ "harness|hendrycksTest-marketing|5": {
+ "acc": 0.8717948717948718,
+ "acc_stderr": 0.02190190511507333,
+ "acc_norm": 0.8717948717948718,
+ "acc_norm_stderr": 0.02190190511507333
+ },
+ "harness|hendrycksTest-medical_genetics|5": {
+ "acc": 0.74,
+ "acc_stderr": 0.04408440022768079,
+ "acc_norm": 0.74,
+ "acc_norm_stderr": 0.04408440022768079
+ },
+ "harness|hendrycksTest-miscellaneous|5": {
+ "acc": 0.8186462324393359,
+ "acc_stderr": 0.013778693778464074,
+ "acc_norm": 0.8186462324393359,
+ "acc_norm_stderr": 0.013778693778464074
+ },
+ "harness|hendrycksTest-moral_disputes|5": {
+ "acc": 0.7254335260115607,
+ "acc_stderr": 0.02402774515526502,
+ "acc_norm": 0.7254335260115607,
+ "acc_norm_stderr": 0.02402774515526502
+ },
+ "harness|hendrycksTest-moral_scenarios|5": {
+ "acc": 0.47374301675977654,
+ "acc_stderr": 0.016699427672784768,
+ "acc_norm": 0.47374301675977654,
+ "acc_norm_stderr": 0.016699427672784768
+ },
+ "harness|hendrycksTest-nutrition|5": {
+ "acc": 0.7058823529411765,
+ "acc_stderr": 0.026090162504279053,
+ "acc_norm": 0.7058823529411765,
+ "acc_norm_stderr": 0.026090162504279053
+ },
+ "harness|hendrycksTest-philosophy|5": {
+ "acc": 0.7009646302250804,
+ "acc_stderr": 0.02600330111788514,
+ "acc_norm": 0.7009646302250804,
+ "acc_norm_stderr": 0.02600330111788514
+ },
+ "harness|hendrycksTest-prehistory|5": {
+ "acc": 0.7098765432098766,
+ "acc_stderr": 0.025251173936495033,
+ "acc_norm": 0.7098765432098766,
+ "acc_norm_stderr": 0.025251173936495033
+ },
+ "harness|hendrycksTest-professional_accounting|5": {
+ "acc": 0.4645390070921986,
+ "acc_stderr": 0.02975238965742705,
+ "acc_norm": 0.4645390070921986,
+ "acc_norm_stderr": 0.02975238965742705
+ },
+ "harness|hendrycksTest-professional_law|5": {
+ "acc": 0.42894393741851367,
+ "acc_stderr": 0.012640625443067358,
+ "acc_norm": 0.42894393741851367,
+ "acc_norm_stderr": 0.012640625443067358
+ },
+ "harness|hendrycksTest-professional_medicine|5": {
+ "acc": 0.6727941176470589,
+ "acc_stderr": 0.028501452860396553,
+ "acc_norm": 0.6727941176470589,
+ "acc_norm_stderr": 0.028501452860396553
+ },
+ "harness|hendrycksTest-professional_psychology|5": {
+ "acc": 0.6437908496732027,
+ "acc_stderr": 0.019373332420724507,
+ "acc_norm": 0.6437908496732027,
+ "acc_norm_stderr": 0.019373332420724507
+ },
+ "harness|hendrycksTest-public_relations|5": {
+ "acc": 0.7090909090909091,
+ "acc_stderr": 0.04350271442923243,
+ "acc_norm": 0.7090909090909091,
+ "acc_norm_stderr": 0.04350271442923243
+ },
+ "harness|hendrycksTest-security_studies|5": {
+ "acc": 0.7061224489795919,
+ "acc_stderr": 0.02916273841024977,
+ "acc_norm": 0.7061224489795919,
+ "acc_norm_stderr": 0.02916273841024977
+ },
+ "harness|hendrycksTest-sociology|5": {
+ "acc": 0.8009950248756219,
+ "acc_stderr": 0.028231365092758406,
+ "acc_norm": 0.8009950248756219,
+ "acc_norm_stderr": 0.028231365092758406
+ },
+ "harness|hendrycksTest-us_foreign_policy|5": {
+ "acc": 0.88,
+ "acc_stderr": 0.03265986323710906,
+ "acc_norm": 0.88,
+ "acc_norm_stderr": 0.03265986323710906
+ },
+ "harness|hendrycksTest-virology|5": {
+ "acc": 0.5542168674698795,
+ "acc_stderr": 0.03869543323472101,
+ "acc_norm": 0.5542168674698795,
+ "acc_norm_stderr": 0.03869543323472101
+ },
+ "harness|hendrycksTest-world_religions|5": {
+ "acc": 0.8362573099415205,
+ "acc_stderr": 0.028380919596145866,
+ "acc_norm": 0.8362573099415205,
+ "acc_norm_stderr": 0.028380919596145866
+ },
+ "harness|truthfulqa:mc|0": {
+ "mc1": 0.5152998776009792,
+ "mc1_stderr": 0.017495304473187902,
+ "mc2": 0.6845419346695587,
+ "mc2_stderr": 0.014829461272743373
+ },
+ "harness|winogrande|5": {
+ "acc": 0.8287292817679558,
+ "acc_stderr": 0.010588417294962524
+ },
+ "harness|gsm8k|5": {
+ "acc": 0.5663381349507203,
+ "acc_stderr": 0.013650728047064688
+ }
+ }
+
```
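
Not part of the diff itself, but a quick way to sanity-check the Eval numbers above: a minimal sketch that loads a results dict with the same "harness|..." key layout and recomputes the mean accuracy over the MMLU (hendrycksTest) subtasks. The file name results.json is a hypothetical local copy of the dict shown in the Eval section.

```python
# Minimal sketch, assuming the dict from the Eval section has been saved locally
# as results.json (hypothetical file name) with the "harness|..." keys shown above.
import json

with open("results.json") as f:
    results = json.load(f)

# Collect per-task accuracy for the MMLU (hendrycksTest) subtasks and average it.
mmlu = {
    task: scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}
print(f"{len(mmlu)} MMLU subtasks, mean acc = {sum(mmlu.values()) / len(mmlu):.4f}")
```
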