# WizardLM-2-4x7B-MoE-exl2-8_0bpw / mergekit_moe_config.yml
# Shared weights (attention, embeddings, norms) come from the base model.
base_model: models/WizardLM-2-7B
# Router weights are randomly initialized; "random" gate mode needs no positive_prompts.
gate_mode: random
dtype: float16
# Number of experts routed per token; 4 means every expert is active for each token.
experts_per_token: 4
# Four identical copies of WizardLM-2-7B supply the MLP (expert) weights.
experts:
- source_model: models/WizardLM-2-7B
- source_model: models/WizardLM-2-7B
- source_model: models/WizardLM-2-7B
- source_model: models/WizardLM-2-7B
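
# A minimal usage sketch, not part of the original config. Assumptions: mergekit
# is installed, the models/WizardLM-2-7B checkpoint exists locally, and the
# output directory name is illustrative; exact CLI flags may differ by version.
#
#   mergekit-moe mergekit_moe_config.yml ./WizardLM-2-4x7B-MoE
#
# Because gate_mode is random, the router carries no learned routing signal;
# a merge like this is generally meant as an initialization for further
# fine-tuning rather than as a finished model.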