GreenScorpius2-xb / mergekit_config.yml
slices:
- sources:
  - model: viethq188/LeoScorpius-7B-Chat-DPO
    layer_range: [0, 26]
- sources:
  - model: GreenNode/GreenNode-mini-7B-multilingual-v1olet
    layer_range: [8, 32]
merge_method: passthrough
tokenizer_source: union
dtype: float16
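
The config above describes a passthrough (layer-stacking) merge: assuming mergekit's end-exclusive layer_range semantics, layers 0-25 of viethq188/LeoScorpius-7B-Chat-DPO are concatenated with layers 8-31 of GreenNode/GreenNode-mini-7B-multilingual-v1olet, and the result is written in float16 with a union tokenizer. The sketch below is not part of the original file; it shows one way to apply this config with mergekit's Python API. The output path is a placeholder, and the import paths and option names follow the mergekit README, so they may differ across mergekit versions.

import yaml
import torch

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "mergekit_config.yml"   # this file
OUTPUT_PATH = "./GreenScorpius2-xb"  # placeholder output directory

# Parse the YAML config into a MergeConfiguration object.
with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

# Run the passthrough merge defined above: a slice of LeoScorpius-7B-Chat-DPO
# layers stacked in front of a slice of GreenNode-mini-7B-multilingual-v1olet
# layers, saved in float16.
run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # include tokenizer files in the output folder
    ),
)

The same merge can also be driven from the command line with mergekit's mergekit-yaml entry point, pointing it at this config file and an output directory.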