---
base_model: []
library_name: transformers
tags:
- mergekit
- merge
- alpaca
- mistral
- not-for-all-audiences
- nsfw
- exl2
license: cc-by-nc-4.0
---
# IceCocoaRP-7b-8bpw-exl2
8bpw-exl2 quant of icefog72/IceCocoaRP-7b
The rules-lorebook and settings I'm using can be found here.
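To pull the quantized weights locally (for loading with ExLlamaV2 or a front end such as text-generation-webui), a minimal download sketch using `huggingface_hub` is below; the repo id and local directory are assumptions inferred from this card's title, not stated in it.

```python
# Minimal sketch: fetch the 8bpw exl2 weights from the Hub.
# repo_id and local_dir are assumptions inferred from this card's title.
from huggingface_hub import snapshot_download

model_dir = snapshot_download(
    repo_id="icefog72/IceCocoaRP-7b-8bpw-exl2",  # assumed repo id
    local_dir="models/IceCocoaRP-7b-8bpw-exl2",
)
print(f"exl2 weights saved to: {model_dir}")
```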
## Merge Details
The best one so far for me.
### Merge Method
This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with NeuralBeagleJaskier as the base.
### Models Merged
The following models were included in the merge:
- NeuralBeagleJaskier
- IceBlendedCoffeeRP-7b (slerp bfloat16)
- IceCoffeeRP-7b
- IceBlendedLatteRP-7b base
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: NeuralBeagleJaskier
    parameters:
      density: 0.9
      weight: 0.5
  - model: IceBlendedCoffeeRP-7b
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: NeuralBeagleJaskier
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
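If you want to reproduce the merge itself rather than use the quant, a hedged sketch is below: it writes the config above to disk and calls mergekit's `mergekit-yaml` command through `subprocess`. It assumes mergekit is installed (`pip install mergekit`), that the model names in the config resolve to reachable checkpoints, and that you have disk and memory for the 7B models involved; the paths are placeholders.

```python
# Sketch: save the YAML config from this card and run mergekit on it.
# Assumes `pip install mergekit` and that the model names resolve to
# local paths or Hugging Face repos; paths below are placeholders.
import subprocess
from pathlib import Path

CONFIG = """\
models:
  - model: NeuralBeagleJaskier
    parameters:
      density: 0.9
      weight: 0.5
  - model: IceBlendedCoffeeRP-7b
    parameters:
      density: 0.5
      weight: 0.3
merge_method: ties
base_model: NeuralBeagleJaskier
parameters:
  normalize: true
  int8_mask: true
dtype: float16
"""

config_path = Path("icecocoa-ties.yml")
config_path.write_text(CONFIG)

# mergekit-yaml takes the config path and an output directory; extra flags
# such as --cuda (see mergekit's documentation) can speed things up on GPU.
subprocess.run(
    ["mergekit-yaml", str(config_path), "IceCocoaRP-7b-merge"],
    check=True,
)
```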