# Marcoro14-7B-slerp

Marcoro14-7B-slerp is a merge of the following models using mergekit:

* meta-llama/Meta-Llama-3-8B (base model)
* mistralai/Mistral-7B-Instruct-v0.1

## 🧩 Configuration

```yaml
models:
  - model: meta-llama/Meta-Llama-3-8B
    # no parameters necessary for base model
  - model: mistralai/Mistral-7B-Instruct-v0.1
    parameters:
      density: 0.5
      weight: 0.5
merge_method: ties
base_model: meta-llama/Meta-Llama-3-8B
parameters:
  normalize: true
dtype: float16
```
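## 💻 Usage

Assuming the configuration above is saved as `config.yaml`, the merge can be executed with mergekit's CLI, e.g. `mergekit-yaml config.yaml ./merged-model --copy-tokenizer`. Once the merged weights are available, they load like any other causal language model. The sketch below uses the transformers pipeline API; the repo id `mlabonne/Marcoro14-7B-slerp` is an assumption and should be replaced with wherever the merged model is actually hosted.

```python
# Minimal loading sketch for the merged model.
# Assumption: the merged weights live at "mlabonne/Marcoro14-7B-slerp" on the Hub.
import torch
from transformers import AutoTokenizer, pipeline

model_id = "mlabonne/Marcoro14-7B-slerp"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
generator = pipeline(
    "text-generation",
    model=model_id,
    tokenizer=tokenizer,
    torch_dtype=torch.float16,  # matches the dtype in the merge config
    device_map="auto",
)

output = generator("What is a model merge?", max_new_tokens=128, do_sample=True)
print(output[0]["generated_text"])
```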