# Remix
GGUF quants, thanks to Lewdiculous: https://huggingface.co./Lewdiculous/Eris_Remix_7B-GGUF-IQ-Imatrix

EXL2 5bpw quant: https://huggingface.co./Test157t/ChaoticNeutrals-Eris_Remix_7B-exl2-5bpw
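If you prefer the unquantized weights, a minimal loading sketch with 🤗 Transformers is shown below. The repository id and generation settings here are assumptions for illustration; adjust them to your setup.

```python
# Minimal sketch: load the merged model with Transformers.
# The repo id below is assumed; point it at the actual repository.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "ChaoticNeutrals/Eris_Remix_7B"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype
    device_map="auto",
)

prompt = "Write a short greeting."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```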
## Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
  - sources:
      - model: SpecialEdition
        layer_range: [0, 32]
      - model: Remix
        layer_range: [0, 32]
merge_method: slerp
base_model: SpecialEdition
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
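A config like this is normally handed to mergekit's CLI (e.g. `mergekit-yaml`). For readers unfamiliar with how the `t` lists above are applied, the sketch below is a rough illustration of slerp-style weight interpolation, not mergekit's actual implementation: each filter's value list is interpolated across layer depth to give every layer its own `t`, with `self_attn` and `mlp` tensors following their respective schedules and all other tensors falling back to the default of 0.5.

```python
# Illustrative sketch of slerp-style weight merging (not mergekit's code).
import numpy as np

def slerp(t: float, a: np.ndarray, b: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two weight tensors."""
    a_flat, b_flat = a.ravel(), b.ravel()
    a_n = a_flat / (np.linalg.norm(a_flat) + eps)
    b_n = b_flat / (np.linalg.norm(b_flat) + eps)
    dot = np.clip(np.dot(a_n, b_n), -1.0, 1.0)
    omega = np.arccos(dot)
    if omega < eps:  # nearly colinear: fall back to plain linear interpolation
        return (1 - t) * a + t * b
    so = np.sin(omega)
    out = (np.sin((1 - t) * omega) / so) * a_flat + (np.sin(t * omega) / so) * b_flat
    return out.reshape(a.shape)

def t_for_layer(layer: int, n_layers: int, schedule: list[float]) -> float:
    """Interpolate a per-filter t schedule across layer depth (0 .. n_layers-1)."""
    pos = layer / max(n_layers - 1, 1) * (len(schedule) - 1)
    lo, hi = int(np.floor(pos)), int(np.ceil(pos))
    frac = pos - lo
    return (1 - frac) * schedule[lo] + frac * schedule[hi]

# Example: the t used for the self_attn tensors of layer 8 in a 32-layer model
self_attn_schedule = [0, 0.5, 0.3, 0.7, 1]
t = t_for_layer(8, 32, self_attn_schedule)
# merged_weight = slerp(t, special_edition_weight, remix_weight)
```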