---
base_model:
- davidkim205/ko-gemma-2-9b-it
- valeriojob/MedGPT-Gemma2-9B-BA-v.1
- Shaleen123/gemma2-9b-medical
- anthracite-org/magnum-v3-9b-customgemma2
- ChuGyouk/ko-med-gemma-2-9b-it-merge2
library_name: transformers
tags:
- mergekit
- merge
---

# merge
This is a merge of pre-trained language models created with [mergekit](https://github.com/cg123/mergekit).
## Merge Details

### Merge Method
This model was merged using the `breadcrumbs_ties` merge method, with anthracite-org/magnum-v3-9b-customgemma2 as the base.
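In `breadcrumbs_ties`, each model's difference from the base (its "task vector") is sparsified before the TIES-style sign-consensus merge: the smallest differences and a small fraction of outlier, largest-magnitude differences are discarded, controlled by `density` and `gamma` respectively in the configuration below. The following is only a rough sketch of that sparsification idea, not mergekit's actual implementation; the function name and exact thresholding are assumptions.

```python
import torch

def breadcrumbs_mask(delta: torch.Tensor, density: float, gamma: float) -> torch.Tensor:
    # Hypothetical illustration of breadcrumbs-style sparsification of a task
    # vector (fine-tuned weights minus base weights). Not mergekit's code.
    flat = delta.abs().flatten()
    n = flat.numel()
    k_outliers = int(n * gamma)   # largest-magnitude entries to discard as outliers
    k_keep = int(n * density)     # total entries to retain

    # Sort by magnitude, skip the outlier band, keep the next-largest band.
    order = torch.argsort(flat, descending=True)
    keep_idx = order[k_outliers:k_outliers + k_keep]

    mask = torch.zeros(n, dtype=torch.bool, device=delta.device)
    mask[keep_idx] = True
    return mask.view_as(delta)

# Example: zero out everything outside the retained band of a task vector.
delta = torch.randn(42, 42)
pruned = delta * breadcrumbs_mask(delta, density=0.42, gamma=0.03)
```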
### Models Merged
The following models were included in the merge:
- davidkim205/ko-gemma-2-9b-it
- valeriojob/MedGPT-Gemma2-9B-BA-v.1
- Shaleen123/gemma2-9b-medical
- ChuGyouk/ko-med-gemma-2-9b-it-merge2
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: davidkim205/ko-gemma-2-9b-it
    layer_range: [0, 42]
    parameters:
      weight: 1
      density: 0.7
      gamma: 0.03
  - model: ChuGyouk/ko-med-gemma-2-9b-it-merge2
    layer_range: [0, 42]
    parameters:
      weight: 1
      density: 0.42
      gamma: 0.03
  - model: Shaleen123/gemma2-9b-medical
    layer_range: [0, 42]
    parameters:
      weight: 1
      density: 0.42
      gamma: 0.03
  - model: valeriojob/MedGPT-Gemma2-9B-BA-v.1
    layer_range: [0, 42]
    parameters:
      weight: 1
      density: 0.42
      gamma: 0.03
merge_method: breadcrumbs_ties
base_model: anthracite-org/magnum-v3-9b-customgemma2 #rtzr/ko-gemma-2-9b-it+ghost613/gemma9_on_korean_summary_events # lora model loading
dtype: float16
```
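To reproduce the merge, save the configuration above to a file and pass it to mergekit's `mergekit-yaml` command (e.g. `mergekit-yaml config.yaml ./merged-model`). The resulting checkpoint loads like any other Gemma 2 model with transformers; the sketch below assumes a local output path (`./merged-model` is a placeholder) and an arbitrary example prompt.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# "./merged-model" is a placeholder for wherever mergekit wrote the merged weights.
model = AutoModelForCausalLM.from_pretrained(
    "./merged-model", torch_dtype=torch.float16, device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("./merged-model")

# Example Korean prompt ("Hello, what seems to be bothering you?").
inputs = tokenizer("안녕하세요, 어디가 불편하신가요?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```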