---
base_model:
- flammenai/Mahou-1.5-mistral-nemo-12B
- nbeerbower/Mistral-Nemo-12B-abliterated-LORA
library_name: transformers
tags:
- mergekit
- merge
---
# Mahou-1.5-mistral-nemo-12B-lorablated
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [task arithmetic](https://arxiv.org/abs/2212.04089) merge method, with [flammenai/Mahou-1.5-mistral-nemo-12B](https://huggingface.co./flammenai/Mahou-1.5-mistral-nemo-12B) + [nbeerbower/Mistral-Nemo-12B-abliterated-LORA](https://huggingface.co./nbeerbower/Mistral-Nemo-12B-abliterated-LORA) as the base (the `+` denotes the LoRA being applied on top of the base model before merging).
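In task arithmetic, each fine-tuned model contributes a task vector (its weights minus the base weights), and the merge adds the weighted task vectors back onto the base. A minimal sketch of the idea on toy tensors (the tensor names and values here are illustrative stand-ins, not the actual checkpoints):

```python
import torch

# Toy stand-ins for real checkpoints: in practice these would be the
# state dicts of the base model and the fine-tuned model.
base = {"linear.weight": torch.randn(4, 4)}
finetuned = {"linear.weight": base["linear.weight"] + 0.1 * torch.randn(4, 4)}

def task_arithmetic(base_sd, tuned_sd, weight=1.0):
    """Merge by adding the weighted task vector (tuned - base) onto the base."""
    merged = {}
    for name, base_tensor in base_sd.items():
        task_vector = tuned_sd[name] - base_tensor  # the "task vector"
        merged[name] = base_tensor + weight * task_vector
    return merged

merged = task_arithmetic(base, finetuned, weight=1.0)
# With weight=1.0 and a single source model, the merge reproduces the
# fine-tuned weights exactly; other weights scale the task vector.
print(torch.allclose(merged["linear.weight"], finetuned["linear.weight"]))
```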
### Models Merged
Only the base model listed above (with the LoRA applied) was used; no additional models were merged on top of it.
### Configuration
The following YAML configuration was used to produce this model:
```yaml
base_model: flammenai/Mahou-1.5-mistral-nemo-12B+nbeerbower/Mistral-Nemo-12B-abliterated-LORA
dtype: bfloat16
merge_method: task_arithmetic
parameters:
  normalize: false
slices:
- sources:
  - layer_range: [0, 32]
    model: flammenai/Mahou-1.5-mistral-nemo-12B+nbeerbower/Mistral-Nemo-12B-abliterated-LORA
    parameters:
      weight: 1.0
```
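To reproduce the merge, save the YAML above to a file and run mergekit's `mergekit-yaml` CLI on it (e.g. `mergekit-yaml config.yml ./output`). The resulting model loads like any other transformers checkpoint; a minimal sketch, assuming the repo id matches this card's title:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Repo id inferred from the card title; adjust if the model lives elsewhere.
model_id = "nbeerbower/Mahou-1.5-mistral-nemo-12B-lorablated"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# bfloat16 matches the dtype declared in the merge config above.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

prompt = "Write a haiku about merging models."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```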