Column schema (name, dtype, observed length range or value count):

| Column | Dtype | Range / classes |
|---|---|---|
| eval_name | string | length 12 – 111 |
| Precision | string | 3 classes |
| Type | string | 6 classes |
| T | string | 6 classes |
| Weight type | string | 3 classes |
| Architecture | string | 59 classes |
| Model | string | length 355 – 689 |
| fullname | string | length 4 – 102 |
| Model sha | string | length 0 – 40 |
| Average ⬆️ | float64 | 1.03 – 52 |
| Hub License | string | 25 classes |
| Hub ❤️ | int64 | 0 – 5.96k |
| #Params (B) | float64 | -1 – 141 |
| Available on the hub | bool | 2 classes |
| MoE | bool | 2 classes |
| Flagged | bool | 2 classes |
| Chat Template | bool | 2 classes |
| CO₂ cost (kg) | float64 | 0.03 – 107 |
| IFEval Raw | float64 | 0 – 0.9 |
| IFEval | float64 | 0 – 90 |
| BBH Raw | float64 | 0.27 – 0.75 |
| BBH | float64 | 0.81 – 63.5 |
| MATH Lvl 5 Raw | float64 | 0 – 0.51 |
| MATH Lvl 5 | float64 | 0 – 50.7 |
| GPQA Raw | float64 | 0.22 – 0.44 |
| GPQA | float64 | 0 – 24.9 |
| MUSR Raw | float64 | 0.29 – 0.6 |
| MUSR | float64 | 0 – 38.5 |
| MMLU-PRO Raw | float64 | 0.1 – 0.73 |
| MMLU-PRO | float64 | 0 – 70 |
| Merged | bool | 2 classes |
| Official Providers | bool | 2 classes |
| Upload To Hub Date | string | 457 classes |
| Submission Date | string | 200 classes |
| Generation | int64 | 0 – 10 |
| Base Model | string | length 4 – 102 |
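Each benchmark appears twice in the schema: a Raw accuracy in [0, 1] and a scaled score. On the Open LLM Leaderboard the scaled score is, as far as the data shows, the raw accuracy rescaled so the random-guessing baseline maps to 0 and a perfect score maps to 100. A minimal sketch of that rescaling, checked against the first record below; the baselines 0.25 (GPQA, 4-way multiple choice) and 0.1 (MMLU-PRO, 10-way) are assumptions from the answer format, and MUSR mixes per-subtask baselines so this simple formula does not reproduce it:

```python
def normalize(raw_acc: float, baseline: float) -> float:
    """Rescale raw accuracy so the random-guess baseline maps to 0
    and a perfect score maps to 100; below-baseline scores clamp to 0."""
    return max(0.0, (raw_acc - baseline) / (1.0 - baseline)) * 100.0

# Checked against the zelk12/MT5-Gen2-gemma-2-9B row:
ifeval   = normalize(0.796244, 0.0)    # no guessing baseline -> ~79.62
gpqa     = normalize(0.35151, 0.25)    # 4-way multiple choice -> ~13.53
mmlu_pro = normalize(0.437916, 0.10)   # 10-way multiple choice -> ~37.55
```

The clamp matters: a below-chance raw accuracy (e.g. GPQA raw 0.22) would otherwise go negative, which matches the scaled GPQA minimum of 0 in the schema.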
The records, reconstructed as one table (one row per model evaluation):

| eval_name | Precision | Type | T | Weight type | Architecture | Model | fullname | Model sha | Average ⬆️ | Hub License | Hub ❤️ | #Params (B) | Available on the hub | MoE | Flagged | Chat Template | CO₂ cost (kg) | IFEval Raw | IFEval | BBH Raw | BBH | MATH Lvl 5 Raw | MATH Lvl 5 | GPQA Raw | GPQA | MUSR Raw | MUSR | MMLU-PRO Raw | MMLU-PRO | Merged | Official Providers | Upload To Hub Date | Submission Date | Generation | Base Model |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| zelk12_MT5-Gen2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT5-Gen2-gemma-2-9B](https://huggingface.co./zelk12/MT5-Gen2-gemma-2-9B) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__MT5-Gen2-gemma-2-9B-details) | zelk12/MT5-Gen2-gemma-2-9B | 3ee2822fcba6708bd9032b79249a2789e5996b6a | 32.600392 | | 1 | 10.159 | false | false | false | true | 1.858381 | 0.796244 | 79.624397 | 0.610541 | 44.113215 | 0.103474 | 10.347432 | 0.35151 | 13.534676 | 0.416292 | 10.436458 | 0.437916 | 37.546173 | false | false | 2024-11-23 | 2024-11-23 | 1 | zelk12/MT5-Gen2-gemma-2-9B (Merge) |
| zelk12_MT5-Gen3-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT5-Gen3-gemma-2-9B](https://huggingface.co./zelk12/MT5-Gen3-gemma-2-9B) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__MT5-Gen3-gemma-2-9B-details) | zelk12/MT5-Gen3-gemma-2-9B | 4b3811c689fec5c9cc483bb1ed696734e5e88fcf | 32.801838 | | 0 | 10.159 | false | false | false | true | 1.937333 | 0.78253 | 78.253035 | 0.609049 | 43.885913 | 0.115559 | 11.555891 | 0.35151 | 13.534676 | 0.423052 | 12.08151 | 0.4375 | 37.5 | false | false | 2024-12-08 | 2024-12-08 | 1 | zelk12/MT5-Gen3-gemma-2-9B (Merge) |
| zelk12_MT5-Gen4-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT5-Gen4-gemma-2-9B](https://huggingface.co./zelk12/MT5-Gen4-gemma-2-9B) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__MT5-Gen4-gemma-2-9B-details) | zelk12/MT5-Gen4-gemma-2-9B | 2f826d76460a5b7f150622a57f2d5419adfc464f | 33.765135 | gemma | 0 | 10.159 | true | false | false | true | 1.82172 | 0.783455 | 78.345457 | 0.613106 | 44.323211 | 0.170695 | 17.069486 | 0.353188 | 13.758389 | 0.422833 | 11.354167 | 0.439661 | 37.7401 | true | false | 2024-12-20 | 2024-12-20 | 1 | zelk12/MT5-Gen4-gemma-2-9B (Merge) |
| zelk12_MT5-Gen5-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT5-Gen5-gemma-2-9B](https://huggingface.co./zelk12/MT5-Gen5-gemma-2-9B) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__MT5-Gen5-gemma-2-9B-details) | zelk12/MT5-Gen5-gemma-2-9B | d1f68652d7dda810da8207a371d26376c6a6e847 | 32.091454 | gemma | 1 | 10.159 | true | false | false | true | 1.891645 | 0.79472 | 79.472023 | 0.611166 | 44.115081 | 0.073263 | 7.326284 | 0.348154 | 13.087248 | 0.419115 | 11.55599 | 0.432929 | 36.992095 | true | false | 2024-12-29 | 2024-12-29 | 1 | zelk12/MT5-Gen5-gemma-2-9B (Merge) |
| zelk12_MT5-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MT5-gemma-2-9B](https://huggingface.co./zelk12/MT5-gemma-2-9B) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__MT5-gemma-2-9B-details) | zelk12/MT5-gemma-2-9B | b627ae7d796b1ae85b59c55e0e043b8d3ae73d83 | 32.595305 | | 0 | 10.159 | false | false | false | true | 3.26983 | 0.804787 | 80.478685 | 0.611223 | 44.271257 | 0.095166 | 9.516616 | 0.343121 | 12.416107 | 0.420385 | 11.48151 | 0.436669 | 37.407654 | false | false | 2024-10-19 | 2024-10-21 | 1 | zelk12/MT5-gemma-2-9B (Merge) |
| zelk12_MTM-Merge-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/MTM-Merge-gemma-2-9B](https://huggingface.co./zelk12/MTM-Merge-gemma-2-9B) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__MTM-Merge-gemma-2-9B-details) | zelk12/MTM-Merge-gemma-2-9B | 843f23c68cf50f5bdc0206f93e72ce0f9feeca6e | 33.758993 | gemma | 2 | 10.159 | true | false | false | true | 1.793346 | 0.779808 | 77.980758 | 0.613335 | 44.380677 | 0.166163 | 16.616314 | 0.354866 | 13.982103 | 0.426771 | 11.946354 | 0.43883 | 37.647754 | true | false | 2025-01-01 | 2025-01-01 | 1 | zelk12/MTM-Merge-gemma-2-9B (Merge) |
| zelk12_Rv0.4DMv1t0.25-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/Rv0.4DMv1t0.25-gemma-2-9B](https://huggingface.co./zelk12/Rv0.4DMv1t0.25-gemma-2-9B) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__Rv0.4DMv1t0.25-gemma-2-9B-details) | zelk12/Rv0.4DMv1t0.25-gemma-2-9B | 23e7337dabbf023177c25ded4923286a2e3936fc | 33.585317 | | 0 | 10.159 | false | false | false | true | 1.918645 | 0.749658 | 74.965758 | 0.606971 | 43.664764 | 0.194109 | 19.410876 | 0.345638 | 12.751678 | 0.430927 | 12.932552 | 0.440076 | 37.786274 | false | false | 2024-12-31 | 2024-12-31 | 1 | zelk12/Rv0.4DMv1t0.25-gemma-2-9B (Merge) |
| zelk12_Rv0.4DMv1t0.25Tt0.25-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/Rv0.4DMv1t0.25Tt0.25-gemma-2-9B](https://huggingface.co./zelk12/Rv0.4DMv1t0.25Tt0.25-gemma-2-9B) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__Rv0.4DMv1t0.25Tt0.25-gemma-2-9B-details) | zelk12/Rv0.4DMv1t0.25Tt0.25-gemma-2-9B | 28fbcc2fa23f46aaaed327984784251527c78815 | 32.304 | gemma | 0 | 10.159 | true | false | false | true | 1.912583 | 0.76462 | 76.46201 | 0.609786 | 43.914819 | 0.112538 | 11.253776 | 0.342282 | 12.304251 | 0.428292 | 12.703125 | 0.434674 | 37.186022 | true | false | 2024-12-31 | 2024-12-31 | 1 | zelk12/Rv0.4DMv1t0.25Tt0.25-gemma-2-9B (Merge) |
| zelk12_Rv0.4MT4g2-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/Rv0.4MT4g2-gemma-2-9B](https://huggingface.co./zelk12/Rv0.4MT4g2-gemma-2-9B) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__Rv0.4MT4g2-gemma-2-9B-details) | zelk12/Rv0.4MT4g2-gemma-2-9B | ef595241d2c62203c27d84e6643d384a7cf99bd4 | 33.004199 | gemma | 1 | 10.159 | true | false | false | true | 1.853253 | 0.732022 | 73.202215 | 0.60412 | 43.199046 | 0.179758 | 17.975831 | 0.353188 | 13.758389 | 0.423083 | 11.91875 | 0.441739 | 37.970966 | true | false | 2025-01-04 | 2025-01-04 | 1 | zelk12/Rv0.4MT4g2-gemma-2-9B (Merge) |
| zelk12_T31122024203920-gemma-2-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/T31122024203920-gemma-2-9B](https://huggingface.co./zelk12/T31122024203920-gemma-2-9B) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__T31122024203920-gemma-2-9B-details) | zelk12/T31122024203920-gemma-2-9B | 25cb58c73a3adf43cee33b50238b1d332b5ccc13 | 33.101317 | gemma | 0 | 10.159 | true | false | false | true | 1.866369 | 0.767618 | 76.76177 | 0.609563 | 43.728997 | 0.138973 | 13.897281 | 0.350671 | 13.422819 | 0.432198 | 13.32474 | 0.437251 | 37.472296 | true | false | 2024-12-31 | 2024-12-31 | 1 | zelk12/T31122024203920-gemma-2-9B (Merge) |
| zelk12_Test01012025155054_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/Test01012025155054](https://huggingface.co./zelk12/Test01012025155054) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__Test01012025155054-details) | zelk12/Test01012025155054 | c607186b0b079975e3305e0223e0a55f0cbc19e5 | 3.591417 | | 0 | 3.817 | false | false | false | true | 0.700474 | 0.155523 | 15.55229 | 0.28295 | 1.280547 | 0 | 0 | 0.241611 | 0 | 0.367021 | 3.710937 | 0.109043 | 1.004728 | false | false | 2025-01-01 | 2025-01-01 | 1 | zelk12/Test01012025155054 (Merge) |
| zelk12_Test01012025155054t0.5_gemma-2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/Test01012025155054t0.5_gemma-2](https://huggingface.co./zelk12/Test01012025155054t0.5_gemma-2) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__Test01012025155054t0.5_gemma-2-details) | zelk12/Test01012025155054t0.5_gemma-2 | 14fcae0d420d303df84bd9b9c8744a6f0fa147fb | 3.591417 | | 0 | 3.817 | false | false | false | true | 0.697964 | 0.155523 | 15.55229 | 0.28295 | 1.280547 | 0 | 0 | 0.241611 | 0 | 0.367021 | 3.710937 | 0.109043 | 1.004728 | false | false | 2025-01-01 | 2025-01-01 | 1 | zelk12/Test01012025155054t0.5_gemma-2 (Merge) |
| zelk12_gemma-2-S2MTM-9B_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/gemma-2-S2MTM-9B](https://huggingface.co./zelk12/gemma-2-S2MTM-9B) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__gemma-2-S2MTM-9B-details) | zelk12/gemma-2-S2MTM-9B | fd6860743943114eeca6fc2e800e27c87873bcc5 | 31.148621 | gemma | 0 | 10.159 | true | false | false | true | 1.765103 | 0.782256 | 78.225553 | 0.606084 | 43.115728 | 0.04003 | 4.003021 | 0.345638 | 12.751678 | 0.421844 | 12.163802 | 0.429688 | 36.631944 | true | false | 2024-12-11 | 2024-12-11 | 1 | zelk12/gemma-2-S2MTM-9B (Merge) |
| zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1](https://huggingface.co./zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-details) | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1 | b4208ddf6c741884c16c77b9433d9ead8f216354 | 30.344893 | | 2 | 10.159 | false | false | false | true | 3.443191 | 0.764895 | 76.489492 | 0.607451 | 43.706516 | 0.013595 | 1.359517 | 0.349832 | 13.310962 | 0.413625 | 10.303125 | 0.432098 | 36.899749 | false | false | 2024-10-03 | 2024-10-03 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1 (Merge) |
| zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25](https://huggingface.co./zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25-details) | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 | e652c9e07265526851dad994f4640aa265b9ab56 | 33.300246 | | 1 | 10.159 | false | false | false | true | 3.194991 | 0.770665 | 77.066517 | 0.607543 | 43.85035 | 0.155589 | 15.558912 | 0.343121 | 12.416107 | 0.43226 | 13.132552 | 0.439993 | 37.777039 | false | false | 2024-10-04 | 2024-10-04 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.25 (Merge) |
| zelk12_recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75](https://huggingface.co./zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75-details) | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75 | eb0e589291630ba20328db650f74af949d217a97 | 28.421762 | | 0 | 10.159 | false | false | false | true | 3.751453 | 0.720806 | 72.080635 | 0.59952 | 42.487153 | 0 | 0 | 0.349832 | 13.310962 | 0.395115 | 7.75599 | 0.414063 | 34.895833 | false | false | 2024-10-04 | 2024-10-04 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.1-t0.75 (Merge) |
| zelk12_recoilme-gemma-2-Ataraxy-9B-v0.2_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2](https://huggingface.co./zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ataraxy-9B-v0.2-details) | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2 | 76f56b25bf6d8704282f8c77bfda28ca384883bc | 30.113979 | | 1 | 10.159 | false | false | false | true | 3.413675 | 0.759999 | 75.999902 | 0.606626 | 43.633588 | 0.012085 | 1.208459 | 0.348154 | 13.087248 | 0.410958 | 9.836458 | 0.432264 | 36.918218 | false | false | 2024-10-07 | 2024-10-11 | 1 | zelk12/recoilme-gemma-2-Ataraxy-9B-v0.2 (Merge) |
| zelk12_recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1](https://huggingface.co./zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1-details) | zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1 | 1e3e623e9f0b386bfd967c629dd39c87daef5bed | 31.626376 | | 1 | 10.159 | false | false | false | true | 6.461752 | 0.761523 | 76.152276 | 0.609878 | 43.941258 | 0.073263 | 7.326284 | 0.341443 | 12.192394 | 0.431021 | 13.310937 | 0.431516 | 36.835106 | false | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-Gutenberg-Doppel-9B-v0.1 (Merge) |
| zelk12_recoilme-gemma-2-Ifable-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-Ifable-9B-v0.1](https://huggingface.co./zelk12/recoilme-gemma-2-Ifable-9B-v0.1) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-Ifable-9B-v0.1-details) | zelk12/recoilme-gemma-2-Ifable-9B-v0.1 | 8af6620b39c9a36239879b6b2bd88f66e9e9d930 | 32.254423 | | 0 | 10.159 | false | false | false | true | 6.542869 | 0.794396 | 79.439554 | 0.60644 | 43.39057 | 0.09139 | 9.138973 | 0.35151 | 13.534676 | 0.420229 | 11.095313 | 0.432347 | 36.927453 | false | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-Ifable-9B-v0.1 (Merge) |
| zelk12_recoilme-gemma-2-psy10k-mental_healt-9B-v0.1_bfloat16 | bfloat16 | 🤝 base merges and moerges | 🤝 | Original | Gemma2ForCausalLM | [zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1](https://huggingface.co./zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zelk12__recoilme-gemma-2-psy10k-mental_healt-9B-v0.1-details) | zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1 | ced039b03be6f65ac0f713efcee76c6534e65639 | 32.448061 | | 1 | 10.159 | false | false | false | true | 3.13222 | 0.744537 | 74.453672 | 0.597759 | 42.132683 | 0.180514 | 18.05136 | 0.34396 | 12.527964 | 0.429469 | 12.183594 | 0.418052 | 35.339096 | false | false | 2024-10-07 | 2024-10-07 | 1 | zelk12/recoilme-gemma-2-psy10k-mental_healt-9B-v0.1 (Merge) |
| zetasepic_Qwen2.5-32B-Instruct-abliterated-v2_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [zetasepic/Qwen2.5-32B-Instruct-abliterated-v2](https://huggingface.co./zetasepic/Qwen2.5-32B-Instruct-abliterated-v2) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zetasepic__Qwen2.5-32B-Instruct-abliterated-v2-details) | zetasepic/Qwen2.5-32B-Instruct-abliterated-v2 | 5894fbf0a900e682dfc0ed794db337093bd8d26b | 36.969237 | apache-2.0 | 4 | 32.764 | true | false | false | true | 6.744789 | 0.833413 | 83.341312 | 0.693402 | 56.533818 | 0 | 0 | 0.36745 | 15.659955 | 0.435427 | 14.928385 | 0.562168 | 51.35195 | false | false | 2024-10-11 | 2024-12-07 | 2 | Qwen/Qwen2.5-32B |
| zetasepic_Qwen2.5-72B-Instruct-abliterated_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | Qwen2ForCausalLM | [zetasepic/Qwen2.5-72B-Instruct-abliterated](https://huggingface.co./zetasepic/Qwen2.5-72B-Instruct-abliterated) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zetasepic__Qwen2.5-72B-Instruct-abliterated-details) | zetasepic/Qwen2.5-72B-Instruct-abliterated | af94b3c05c9857dbac73afb1cbce00e4833ec9ef | 45.293139 | other | 16 | 72.706 | true | false | false | false | 18.809182 | 0.715261 | 71.526106 | 0.715226 | 59.912976 | 0.46148 | 46.148036 | 0.406879 | 20.917226 | 0.471917 | 19.122917 | 0.587184 | 54.131575 | false | false | 2024-10-01 | 2024-11-08 | 2 | Qwen/Qwen2.5-72B |
| zhengr_MixTAO-7Bx2-MoE-v8.1_bfloat16 | bfloat16 | 🔶 fine-tuned on domain-specific datasets | 🔶 | Original | MixtralForCausalLM | [zhengr/MixTAO-7Bx2-MoE-v8.1](https://huggingface.co./zhengr/MixTAO-7Bx2-MoE-v8.1) [📑](https://huggingface.co./datasets/open-llm-leaderboard/zhengr__MixTAO-7Bx2-MoE-v8.1-details) | zhengr/MixTAO-7Bx2-MoE-v8.1 | 828e963abf2db0f5af9ed0d4034e538fc1cf5f40 | 17.168311 | apache-2.0 | 55 | 12.879 | true | true | false | true | 0.92739 | 0.418781 | 41.878106 | 0.420194 | 19.176907 | 0.066465 | 6.646526 | 0.298658 | 6.487696 | 0.397625 | 8.303125 | 0.284658 | 20.517509 | false | false | 2024-02-26 | 2024-06-27 | 0 | zhengr/MixTAO-7Bx2-MoE-v8.1 |
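Once the records are in tabular form they can be queried directly. A minimal pure-Python sketch, using a handful of (fullname, Average ⬆️, #Params (B)) values transcribed from the rows above:

```python
# (fullname, Average ⬆️, #Params (B)) triples transcribed from the table above.
rows = [
    ("zelk12/MT5-Gen4-gemma-2-9B", 33.765135, 10.159),
    ("zetasepic/Qwen2.5-32B-Instruct-abliterated-v2", 36.969237, 32.764),
    ("zetasepic/Qwen2.5-72B-Instruct-abliterated", 45.293139, 72.706),
    ("zhengr/MixTAO-7Bx2-MoE-v8.1", 17.168311, 12.879),
]

# Highest average score overall.
best_name, best_avg, _ = max(rows, key=lambda r: r[1])

# Average score per billion parameters, a rough efficiency ranking.
by_efficiency = sorted(
    ((name, avg / params) for name, avg, params in rows),
    key=lambda pair: pair[1],
    reverse=True,
)
```

Among these rows the 72B abliterated Qwen leads on raw average, while the 9B gemma merge wins on score per parameter.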