
SyntheticMoist-v2

RP model based on Solar. A higher merge density plus LimaRP led to better performance. Use the Alpaca or Vicuna prompt format.
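For reference, here is a minimal Alpaca-style prompt built in Python; the instruction text is just a placeholder, and the Vicuna format uses `USER:`/`ASSISTANT:` turns instead:

```python
# Standard Alpaca template; swap in your own instruction text.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n"
    "Introduce your character in a short scene.\n\n"
    "### Response:\n"
)
```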


Thanks to mradermacher for the quants!

Quants

merge

This is a merge of pre-trained language models created using mergekit.

Merge Details

Merge Method

This model was merged using the DARE TIES merge method, with Sao10K/Fimbulvetr-11B-v2 as the base model. DARE TIES randomly drops a fraction (1 - density) of each fine-tuned model's delta weights, rescales the survivors, and combines the pruned deltas with TIES-style sign election, which is why each model in the configuration below carries its own weight and density.
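As a rough illustration of the drop-and-rescale step, here is a minimal PyTorch sketch; it is not mergekit's implementation, and `delta` stands for a task vector (fine-tuned weights minus base weights):

```python
import torch

def dare_drop_and_rescale(delta: torch.Tensor, density: float) -> torch.Tensor:
    # Keep each delta entry with probability `density`, zero out the rest,
    # then rescale survivors by 1/density so the expected magnitude is preserved.
    # mergekit's dare_ties additionally elects signs across models (TIES)
    # and combines the pruned deltas using the per-model weights.
    mask = (torch.rand_like(delta) < density).to(delta.dtype)
    return delta * mask / density
```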

Models Merged

The following models were included in the merge:

- Himitsui/MedMitsu-Instruct-11B
- Himitsui/Kaiju-11B
- migtissera/Synthia-v3.0-11B + jeiku/Re-Host_Limarp_Mistral (LoRA)
- TheDrummer/Moistral-11B-v3

Configuration

The following YAML configuration was used to produce this model:

```yaml
models:
  - model: Himitsui/MedMitsu-Instruct-11B
    parameters:
      weight: 0.13
      density: 0.60
  - model: Himitsui/Kaiju-11B
    parameters:
      weight: 0.22
      density: 0.73
  - model: migtissera/Synthia-v3.0-11B+jeiku/Re-Host_Limarp_Mistral
    parameters:
      weight: 0.28
      density: 0.80
  - model: TheDrummer/Moistral-11B-v3
    parameters:
      weight: 0.37
      density: 0.85
merge_method: dare_ties
base_model: Sao10K/Fimbulvetr-11B-v2
parameters:
  int8_mask: true
dtype: bfloat16
```
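A minimal loading sketch with Hugging Face transformers, using the bfloat16 dtype from the config above; `device_map="auto"` requires accelerate, and the prompt is a placeholder:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "v000000/SyntheticMoist-11B-v2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches `dtype: bfloat16` in the merge config
    device_map="auto",
)

prompt = "### Instruction:\nIntroduce your character in a short scene.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.8)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```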