Kosmos-8B-v1

The serenity of infinity is not the end.

(Model logo: KosmosLogo256.png)

This is an interesting merge of 14 cool models, created using mergekit. Enjoy exploring :)

Merge Details

Method

This model was produced through a multi-step process: several intermediate merges were built first and then re-merged, with some model variations along the way, for the best result.
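The multi-step structure can be sketched as a small pipeline check. This is an illustrative Python sketch, not part of the actual merge tooling: the stage names mirror the YAML comments below, and the helper simply verifies that every intermediate merge is produced before it is consumed.

```python
# Hypothetical sketch of the multi-step merge pipeline described above.
# Stage names mirror the YAML comments; this is illustration, not mergekit code.

# Each stage: (output_name, base_model, list_of_input_models)
PIPELINE = [
    ("Cursed-UnalignedCosmicSaiga-8B-v1", "CursedMatrix-8B-v9",
     ["LLAMA-3_8B_Unaligned_BETA", "CosmicBun-8B-DPO", "saiga_llama3_8b"]),
    ("Cursed-BlueRainbowMaid-8B-v1", "CursedMatrix-8B-v9",
     ["L3-8B-BlueSerpentine", "L3.1-EtherealRainbow-v1.0-rc1-8B",
      "L3-SthenoMaidBlackroot-8B-V1"]),
    ("Cursed-AverageLunaFusion-8B-v1", "CursedMatrix-8B-v9",
     ["Average_Normie_v3.69_8B", "L3-Luna-8B", "Llama3.1-TheiaFire-DarkFusion-8B"]),
    ("InfectedKosmos-8B-v1", "CursedMatrix-8B-v9",
     ["Cursed-UnalignedCosmicSaiga-8B-v1", "Cursed-BlueRainbowMaid-8B-v1",
      "Cursed-AverageLunaFusion-8B-v1"]),
    ("ZeroArkana-A", "InfectedKosmos-8B-v1",
     ["L3SAO-Mix-SuperHermes-NovaPurosani-8B", "LexiMaid-L3-8B"]),
    ("ZeroArkana-B", "InfectedKosmos-8B-v1",
     ["Llama-3-Aetheric-Hermes-Lexi-Smaug-8B", "Mythorica-L3-8B"]),
    ("Kosmos-8B-v1", "InfectedKosmos-8B-v1",
     ["ZeroArkana-A", "ZeroArkana-B"]),
]

def check_order(pipeline):
    """Verify each stage only consumes outputs of earlier stages
    (or external, pre-existing models)."""
    stage_names = {name for name, _, _ in pipeline}
    produced = set()
    for name, base, inputs in pipeline:
        for dep in inputs + [base]:
            # Any dependency that is itself a stage must already exist.
            assert dep not in stage_names or dep in produced, dep
        produced.add(name)
    return produced

print(len(check_order(PIPELINE)))  # 7 merge stages
```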

Models

The following 14 models were included in the merge (as listed in the configurations below):

- Khetterman/CursedMatrix-8B-v9 (base model)
- SicariusSicariiStuff/LLAMA-3_8B_Unaligned_BETA
- aloobun/CosmicBun-8B-DPO
- IlyaGusev/saiga_llama3_8b
- v000000/L3-8B-BlueSerpentine
- invisietch/L3.1-EtherealRainbow-v1.0-rc1-8B
- bluuwhale/L3-SthenoMaidBlackroot-8B-V1
- jeiku/Average_Normie_v3.69_8B
- Casual-Autopsy/L3-Luna-8B
- ZeroXClem/Llama3.1-TheiaFire-DarkFusion-8B
- ZeroXClem/L3SAO-Mix-SuperHermes-NovaPurosani-8B
- Arkana08/LexiMaid-L3-8B
- ZeroXClem/Llama-3-Aetheric-Hermes-Lexi-Smaug-8B
- Arkana08/Mythorica-L3-8B

Configuration

The following YAML configurations were used to produce this model:

# Cursed-UnalignedCosmicSaiga-8B-v1
models:
  - model: SicariusSicariiStuff/LLAMA-3_8B_Unaligned_BETA
  - model: aloobun/CosmicBun-8B-DPO
  - model: IlyaGusev/saiga_llama3_8b
merge_method: model_stock
base_model: Khetterman/CursedMatrix-8B-v9
dtype: bfloat16

# Cursed-BlueRainbowMaid-8B-v1
models:
  - model: v000000/L3-8B-BlueSerpentine
  - model: invisietch/L3.1-EtherealRainbow-v1.0-rc1-8B
  - model: bluuwhale/L3-SthenoMaidBlackroot-8B-V1
merge_method: model_stock
base_model: Khetterman/CursedMatrix-8B-v9
dtype: bfloat16

# Cursed-AverageLunaFusion-8B-v1
models:
  - model: jeiku/Average_Normie_v3.69_8B
  - model: Casual-Autopsy/L3-Luna-8B
  - model: ZeroXClem/Llama3.1-TheiaFire-DarkFusion-8B
merge_method: model_stock
base_model: Khetterman/CursedMatrix-8B-v9
dtype: bfloat16

# InfectedKosmos-8B-v1
models:
  - model: F:/Cursed-UnalignedCosmicSaiga-8B-v1
  - model: F:/Cursed-BlueRainbowMaid-8B-v1
  - model: F:/Cursed-AverageLunaFusion-8B-v1
merge_method: model_stock
base_model: Khetterman/CursedMatrix-8B-v9
dtype: bfloat16

# ZeroArkana-A
models:
  - model: ZeroXClem/L3SAO-Mix-SuperHermes-NovaPurosani-8B
    parameters:
      weight:  [0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50]
      density: [0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50]
  - model: Arkana08/LexiMaid-L3-8B
    parameters:
      weight:  [0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50]
      density: [0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50]
merge_method: della
parameters:
  epsilon: 0.1
  lambda:  1.0
base_model: F:/InfectedKosmos-8B-v1
dtype: bfloat16

# ZeroArkana-B
models:
  - model: ZeroXClem/Llama-3-Aetheric-Hermes-Lexi-Smaug-8B
    parameters:
      weight:  [0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50]
      density: [0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50]
  - model: Arkana08/Mythorica-L3-8B
    parameters:
      weight:  [0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50]
      density: [0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20, 0.25, 0.35, 0.50]
merge_method: della
parameters:
  epsilon: 0.1
  lambda:  1.0
base_model: F:/InfectedKosmos-8B-v1
dtype: bfloat16

# Kosmos-8B-v1
models:
  - model: F:/ZeroArkana-A
  - model: F:/ZeroArkana-B
merge_method: model_stock
base_model: F:/InfectedKosmos-8B-v1
dtype: bfloat16
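The weight and density gradients in the two della stages are deliberately symmetric: at every position, the two models' weights sum to 1.0, and within each model weight and density are mirror images of each other, so the models trade influence back and forth across the depth of the network. A minimal sketch below checks that symmetry and resamples the 25-point list to one value per layer; the linear interpolation is an assumption about how such gradient lists are spread over the layers, and the layer count of 32 is that of Llama-3-8B.

```python
# Illustrative check of the gradient lists used in the ZeroArkana della merges.
w_a = [0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35, 0.25, 0.20,
       0.25, 0.35, 0.50, 0.65, 0.75, 0.80, 0.75, 0.65, 0.50, 0.35,
       0.25, 0.20, 0.25, 0.35, 0.50]
w_b = [round(1.0 - w, 2) for w in w_a]  # reproduces the second model's list

# The two curves are complementary: the models trade influence point by point.
assert all(abs((a + b) - 1.0) < 1e-9 for a, b in zip(w_a, w_b))

def interpolate(points, n_layers=32):
    """Linearly resample a gradient list to one value per layer
    (assumed interpolation behavior, for illustration only)."""
    out = []
    for layer in range(n_layers):
        t = layer / (n_layers - 1) * (len(points) - 1)
        i = min(int(t), len(points) - 2)
        frac = t - i
        out.append(points[i] * (1 - frac) + points[i + 1] * frac)
    return out

per_layer = interpolate(w_a)
print(len(per_layer), per_layer[0], per_layer[-1])  # 32 0.5 0.5
```

The effect is an oscillating blend: neither model dominates the whole stack, and the density schedule runs opposite to the weight schedule, so a model's parameters are pruned most aggressively exactly where its mixing weight peaks.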

My thanks go to the authors of the original models; your work is incredible. Have a good time 🖤

Model size: 8.03B parameters · Safetensors · BF16