# frqwen2.5-from7b-duable4layers-it / mergekit_config.yml
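# Passthrough layer-stacking ("frankenmerge") config: the three slices below are
# concatenated in order, so the four layers in the [10, 14] range of
# Qwen/Qwen2.5-7B-Instruct appear twice and the merged model ends up with
# 14 + 4 + 14 = 32 layers versus the 28 covered by [0, 28] in the base model.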
slices:
  - sources:
      - layer_range: [0, 14]
        model: Qwen/Qwen2.5-7B-Instruct
  - sources:
      - layer_range: [10, 14]
        model: Qwen/Qwen2.5-7B-Instruct
  - sources:
      - layer_range: [14, 28]
        model: Qwen/Qwen2.5-7B-Instruct
merge_method: passthrough
dtype: bfloat16
tokenizer_source: "Qwen/Qwen2.5-7B-Instruct"
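# Example invocation (a sketch, assuming the standard mergekit CLI and that this
# file is saved as mergekit_config.yml; the output directory name is illustrative):
#   mergekit-yaml mergekit_config.yml ./frqwen2.5-from7b-duable4layers-it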