Driftwood-12B

This is a merge of pre-trained language models created using mergekit.

Merge Details

Basically: how many models can you merge into one and still stay coherent? 24 is the answer (though I might do a bigger one).

Quants

FP8: https://huggingface.co./SanXM1/Driftwood-12b-FP8/
EXL2: https://huggingface.co./NewEden/Delta-Vector_driftwood-exl2

Merge Method

This model was merged using the Model Stock merge method using IntervitensInc/Mistral-Nemo-Base-2407-chatml as a base.
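For intuition, Model Stock averages the fine-tuned checkpoints and then interpolates that average back toward the base model, with the ratio derived from the angle between each fine-tuned model's weights and the base. Below is a rough NumPy sketch of the per-layer rule from the Model Stock paper; the function name and the scalar `cos_theta` input are simplifications for illustration (mergekit estimates the geometry tensor-by-tensor from the actual checkpoints):

```python
import numpy as np

def model_stock_layer(base, finetuned, cos_theta):
    """Merge one layer's weights with the Model Stock rule.

    base      -- base-model weight tensor
    finetuned -- list of k fine-tuned weight tensors
    cos_theta -- estimated cosine of the angle between the
                 (finetuned - base) task vectors
    """
    k = len(finetuned)
    w_avg = np.mean(finetuned, axis=0)
    # Interpolation ratio t from the Model Stock paper:
    # t = k*cos(theta) / (1 + (k-1)*cos(theta))
    t = k * cos_theta / (1 + (k - 1) * cos_theta)
    # Pull the average of the fine-tunes back toward the base.
    return t * w_avg + (1 - t) * base

# Toy example: 3 "models", all offsets roughly aligned.
base = np.zeros(4)
fts = [np.ones(4) * s for s in (0.9, 1.0, 1.1)]
merged = model_stock_layer(base, fts, cos_theta=0.5)
```

With 24 models, `t` approaches 1 for well-aligned fine-tunes, so the merge stays close to the plain average while the base acts as an anchor against drift.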

Models Merged

The 24 models listed under `models:` in the YAML configuration below were merged into the base.

Configuration

The following YAML configuration was used to produce this model:

models:
  - model: Delta-Vector/Rei-12B
  - model: natong19/Mistral-Nemo-Instruct-2407-abliterated
  - model: Nitral-AI/Captain-Eris_Violet-GRPO-v0.420
  - model: Nitral-AI/Wayfarer_Eris_Noctis-12B
  - model: LatitudeGames/Wayfarer-12B
  - model: PygmalionAI/Pygmalion-3-12B
  - model: allura-org/Bigger-Body-12b
  - model: allura-org/MN-12b-RP-Ink
  - model: PocketDoc/Dans-SakuraKaze-V1.0.0-12b
  - model: PocketDoc/Dans-DangerousWinds-V1.1.0-12b
  - model: PocketDoc/Dans-PersonalityEngine-V1.1.0-12b
  - model: Delta-Vector/Ohashi-NeMo-12B
  - model: Delta-Vector/Francois-Huali-12B  
  - model: anthracite-org/magnum-v4-12b
  - model: Undi95/LocalC-12B-e2.0
  - model: NeverSleep/Lumimaid-v0.2-12B
  - model: Fizzarolli/MN-12b-Sunrose
  - model: anthracite-org/magnum-v2.5-12b-kto
  - model: elinas/Chronos-Gold-12B-1.0
  - model: nbeerbower/mistral-nemo-bophades-12B
  - model: nbeerbower/mistral-nemo-gutenberg-12B-v4
  - model: nbeerbower/mistral-nemo-wissenschaft-12B
  - model: nbeerbower/Mistral-Nemo-Prism-12B
  - model: nbeerbower/Lyra4-Gutenberg2-12B
merge_method: model_stock
base_model: IntervitensInc/Mistral-Nemo-Base-2407-chatml
normalize: false
int8_mask: true
dtype: bfloat16
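To reproduce the merge, the configuration above can be saved to a file and fed to mergekit's `mergekit-yaml` entry point. A minimal sketch (file name is arbitrary; the actual run is commented out because it downloads all 24 checkpoints and needs substantial disk and RAM):

```python
import pathlib
import textwrap

# Abbreviated copy of the recipe above; in practice, paste the full
# 24-entry model list from the configuration section.
config = textwrap.dedent("""\
    models:
      - model: Delta-Vector/Rei-12B
      # ... remaining 23 models as listed above ...
    merge_method: model_stock
    base_model: IntervitensInc/Mistral-Nemo-Base-2407-chatml
    normalize: false
    int8_mask: true
    dtype: bfloat16
""")

path = pathlib.Path("driftwood.yaml")
path.write_text(config)

# With mergekit installed (pip install mergekit), run:
#   mergekit-yaml driftwood.yaml ./Driftwood-12B --cuda
```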
