nemo-12b-rp-merge

This is a merge of pre-trained language models created using mergekit.

Merge Details

Merge Method

This model was merged using the della_linear merge method, with /www/mistralai/Mistral-Nemo-Base-2407 as the base model.
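della_linear combines density-based pruning of each model's task vector (its delta from the base) with a weighted linear sum onto the base weights. The sketch below is only a rough, hypothetical illustration of that idea on a single tensor, not mergekit's implementation: DELLA's epsilon parameter spreads per-parameter keep probabilities by magnitude rank, which is simplified here to a flat keep probability equal to density.

```python
# Conceptual sketch only: density-based drop-and-rescale of task deltas
# followed by a weighted linear sum, the core idea behind della_linear.
# NOT mergekit's implementation; epsilon's magnitude-ranked keep
# probabilities are approximated by a uniform `density`.
import numpy as np

def della_linear_sketch(base, finetuned, weights, densities, lam=1.0, seed=0):
    rng = np.random.default_rng(seed)
    merged_delta = np.zeros_like(base)
    for ft, w, d in zip(finetuned, weights, densities):
        delta = ft - base                        # task vector for this model
        keep = rng.random(delta.shape) < d       # keep ~`density` of the entries
        pruned = np.where(keep, delta / d, 0.0)  # rescale survivors by 1/density
        merged_delta += w * pruned               # weighted linear accumulation
    return base + lam * merged_delta             # lambda scales the merged delta

# Toy tensors standing in for one weight matrix from each checkpoint.
base = np.random.randn(4, 4)
models = [base + 0.1 * np.random.randn(4, 4), base + 0.2 * np.random.randn(4, 4)]
merged = della_linear_sketch(base, models, weights=[0.3, 0.7], densities=[0.5, 0.8])
```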

Models Merged

The following models were included in the merge:

  • /www/nemo-12b-rp/checkpoint-154
  • /www/mistralai/Mistral-Nemo-Instruct-2407

Configuration

The following YAML configuration was used to produce this model:

models:
  - model: /www/nemo-12b-rp/checkpoint-154
    parameters:
      weight: 0.3
      density: 0.5
  - model: /www/mistralai/Mistral-Nemo-Instruct-2407
    parameters:
      weight: 0.7
      density: 0.8
merge_method: della_linear
base_model: /www/mistralai/Mistral-Nemo-Base-2407
parameters:
  epsilon: 0.05
  lambda: 1
  int8_mask: true
dtype: bfloat16
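
To reproduce the merge, the configuration above can be passed to mergekit, either via the mergekit-yaml CLI (mergekit-yaml merge_config.yaml ./output-dir) or through its Python API. The sketch below assumes the YAML is saved as merge_config.yaml and that the /www/... model paths exist locally; the exact MergeOptions fields can vary between mergekit versions.

```python
# Hedged sketch: running the merge through mergekit's Python API.
# Assumes the YAML above is saved as merge_config.yaml and the local
# model paths are valid; MergeOptions fields may differ across versions.
import yaml
import torch
from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("merge_config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    "./nemo-12b-rp-merge",               # output directory for the merged model
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # run the merge on GPU if one is present
        copy_tokenizer=True,             # carry the tokenizer into the output dir
        lazy_unpickle=True,              # lower peak RAM while reading shards
    ),
)
```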
Model size: 12.2B parameters (Safetensors, tensor type BF16)

Model tree: taozi555/nemo-12b-rp-merge (2 quantized variants available)
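
For inference, the merged checkpoint can be loaded at its native BF16 precision with transformers. A minimal example, assuming the Hugging Face repo id from the model tree above (device_map="auto" additionally requires accelerate):

```python
# Minimal loading/inference example for the merged model in bfloat16.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "taozi555/nemo-12b-rp-merge"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    torch_dtype=torch.bfloat16,  # matches the BF16 safetensors shards
    device_map="auto",           # requires accelerate; shards layers across devices
)

inputs = tokenizer("Hello,", return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```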