OpenHermes-2.5-neural-chat-v3-3-openchat-3.5-1210-Slerp
This is OpenHermes-2.5-neural-chat-v3-3-openchat-3.5-1210-Slerp, a merge of Weyaxi/OpenHermes-2.5-neural-chat-v3-3-Slerp and openchat/openchat-3.5-1210. I used mergekit to merge the models.
YAML Config
slices:
  - sources:
      - model: Weyaxi/OpenHermes-2.5-neural-chat-v3-3-Slerp
        layer_range: [0, 32]
      - model: openchat/openchat-3.5-1210
        layer_range: [0, 32]
merge_method: slerp
base_model: mistralai/Mistral-7B-v0.1
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5 # fallback for rest of tensors
tokenizer_source: union
dtype: bfloat16
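
Usage

A minimal sketch for loading the merged model with the Hugging Face transformers library. This assumes transformers, torch, and accelerate are installed; the repo id is the one this card describes, and the prompt is only a placeholder.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "Weyaxi/OpenHermes-2.5-neural-chat-v3-3-openchat-3.5-1210-Slerp"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",           # requires accelerate
)

prompt = "Write a short haiku about model merging."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```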