---
base_model:
- meta-llama/Llama-3.2-3B
- meta-llama/Llama-3.2-3B-Instruct
- PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B
library_name: transformers
tags:
- mergekit
- merge
license: llama3.2
---
# LLaMa-3.2-Instruct-JankMixBread-v0.1-3B

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details

### Merge Method

This model was merged using the breadcrumbs_ties merge method, with [meta-llama/Llama-3.2-3B](https://huggingface.co./meta-llama/Llama-3.2-3B) as the base. Roughly speaking, breadcrumbs_ties sparsifies each model's delta weights relative to the base (Model Breadcrumbs) and resolves sign conflicts between the deltas (TIES): `density: 0.9` retains 90% of each delta, and `gamma: 0.01` drops the largest 1% as outliers.

### Models Merged

The following models were included in the merge:

* [meta-llama/Llama-3.2-3B-Instruct](https://huggingface.co./meta-llama/Llama-3.2-3B-Instruct)
* [PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B](https://huggingface.co./PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
merge_method: breadcrumbs_ties
base_model: meta-llama/Llama-3.2-3B
tokenizer_source: PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B
dtype: bfloat16
parameters:
  normalize: true
models:
  - model: meta-llama/Llama-3.2-3B-Instruct
    parameters:
      weight: 1
      density: 0.9
      gamma: 0.01
  - model: PJMixers-Dev/LLaMa-3.2-Instruct-JankMix-v0.1-SFT-3B
    parameters:
      weight: 1
      density: 0.9
      gamma: 0.01
```
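A merge like this can in principle be reproduced from the config above via mergekit's Python API. The sketch below follows the usage pattern in mergekit's README and assumes the YAML is saved as `config.yaml`; the output path is only illustrative, and the exact API surface may differ between mergekit versions.

```python
# Minimal sketch, assuming mergekit is installed (pip install mergekit)
# and the YAML config above is saved as config.yaml. Follows the usage
# shown in mergekit's README; details may vary by version.
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./LLaMa-3.2-Instruct-JankMixBread-v0.1-3B",  # illustrative output dir
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU if one is available
        copy_tokenizer=True,             # copy the tokenizer_source files to the output
    ),
)
```

Equivalently, the `mergekit-yaml` command-line entry point accepts the same config file. Note that the Llama 3.2 base models are gated, so access must be granted on the Hub first.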
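Once merged (or downloaded), the result loads like any other Llama 3.2 checkpoint. A minimal usage sketch with transformers follows; the repo id is assumed from this card's title, so substitute a local path if you produced the merge yourself.

```python
# Minimal sketch; the repo id below is assumed from this card's title.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "PJMixers-Dev/LLaMa-3.2-Instruct-JankMixBread-v0.1-3B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the merge's dtype
    device_map="auto",
)

messages = [{"role": "user", "content": "Summarize what a model merge is in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```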