# WizardLM-2-4x7B-MoE-exl2-5_0bpw / mergekit_moe_config.yml
# Builds a 4x7B mixture-of-experts model from four copies of WizardLM-2-7B.
base_model: models/WizardLM-2-7B   # supplies the shared (non-expert) weights
gate_mode: random                  # router gates are randomly initialized
dtype: float16
experts_per_token: 4               # every token is routed through all four experts
experts:
  - source_model: models/WizardLM-2-7B
  - source_model: models/WizardLM-2-7B
  - source_model: models/WizardLM-2-7B
  - source_model: models/WizardLM-2-7B
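
# Usage sketch (an assumption for illustration, not part of the original file):
# with mergekit installed and the WizardLM-2-7B weights available under models/,
# the merge would typically be run with mergekit's MoE script, e.g.:
#   mergekit-moe mergekit_moe_config.yml ./WizardLM-2-4x7B-MoE
# The output directory name here is hypothetical.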