---
base_model:
- mistralai/Mistral-Nemo-Instruct-2407
- natong19/Mistral-Nemo-Instruct-2407-abliterated
library_name: transformers
tags:
- mergekit
- peft
license: apache-2.0
---

# Mistral-Nemo-12B-abliterated-LORA

This is a LoRA adapter extracted from a language model using [mergekit](https://github.com/arcee-ai/mergekit).

## LoRA Details

This LoRA adapter was extracted from [natong19/Mistral-Nemo-Instruct-2407-abliterated](https://huggingface.co./natong19/Mistral-Nemo-Instruct-2407-abliterated) and uses [mistralai/Mistral-Nemo-Instruct-2407](https://huggingface.co./mistralai/Mistral-Nemo-Instruct-2407) as a base.

### Parameters

The following command was used to extract this LoRA adapter:

```sh
mergekit-extract-lora natong19/Mistral-Nemo-Instruct-2407-abliterated mistralai/Mistral-Nemo-Instruct-2407 OUTPUT_PATH --rank=64
```