WomboCombo-R1-14B-Preview
This is a merge of pre-trained language models created using mergekit.
Merge Details
Merge Method
This model was merged using the SCE merge method, with Qwen/Qwen2.5-14B-Instruct as the base model.
Models Merged
The following models were included in the merge:
- deepseek-ai/DeepSeek-R1-Distill-Qwen-14B
- arcee-ai/Virtuoso-Small
- Krystalan/DRT-o1-14B
- qingy2024/Fusion4-14B-Instruct
Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  # Pivot model
  - model: Qwen/Qwen2.5-14B-Instruct
  # Target models
  - model: deepseek-ai/DeepSeek-R1-Distill-Qwen-14B
  - model: Krystalan/DRT-o1-14B
  - model: arcee-ai/Virtuoso-Small
  - model: qingy2024/Fusion4-14B-Instruct
merge_method: sce
base_model: Qwen/Qwen2.5-14B-Instruct
parameters:
  select_topk: 1.0
dtype: bfloat16
```
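To reproduce the merge, the configuration above can be saved to a file and passed to mergekit (for example, `mergekit-yaml config.yaml ./output-directory`). Below is a minimal, untested sketch of loading and prompting the merged model with Hugging Face Transformers; it assumes the repository id RDson/WomboCombo-R1-14B-Preview, that the tokenizer and chat template are inherited from the Qwen2.5 base, and that enough GPU memory is available for a 14B model in bfloat16.

```python
# Sketch: load and prompt the merged model with Transformers.
# Assumptions: repo id from this model card, Qwen2.5-style chat template,
# and a GPU (or GPUs) with enough memory for bfloat16 weights.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RDson/WomboCombo-R1-14B-Preview"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Format a single-turn prompt with the model's chat template.
messages = [{"role": "user", "content": "Briefly explain what a model merge is."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Generate and print only the newly produced tokens.
output = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```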