llama-3-experiment-v1-9B / mergekit_config.yml
grimjim · Initial release (db86ef1)
slices:
  - sources:
      - model: C:./text-generation-webui/models/meta-llama_Meta-Llama-3-8B-Instruct
        layer_range: [0, 12]
  - sources:
      - model: C:./text-generation-webui/models/meta-llama_Meta-Llama-3-8B-Instruct
        layer_range: [8, 32]
merge_method: passthrough
dtype: bfloat16
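For orientation, a minimal sketch of the layer arithmetic implied by this passthrough merge. It assumes mergekit treats `layer_range` as half-open `[start, end)`, so the two slices concatenate layers 0–11 and 8–31 of the same 8B model, duplicating layers 8–11; the function name `merged_layer_count` is illustrative, not part of mergekit's API.

```python
# Sketch: total layer count of the passthrough merge defined above,
# assuming each layer_range [start, end] is end-exclusive.
def merged_layer_count(slices):
    """Sum the lengths of each slice's layer_range."""
    return sum(end - start for start, end in slices)

# The two slices from the config: [0, 12] and [8, 32].
slices = [(0, 12), (8, 32)]
print(merged_layer_count(slices))  # 36 layers, vs. 32 in Llama-3-8B
```

Stacking 36 layers instead of the base model's 32 is what grows the merged model toward the ~9B parameters in the repo name.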