---
language:
- en
- ru
model_creator: LakoMoor
model_type: mistral
license: cc-by-nc-4.0
library_name: transformers
tags:
- 4-bit
- AWQ
- text-generation
- autotrain_compatible
- endpoints_compatible
- not-for-all-audiences
- nsfw
model_name: Silicon-Alice-7B
base_model:
- LakoMoor/Silicon-Masha-7B
pipeline_tag: text-generation
inference: false
quantized_by: Suparious
---
# LakoMoor/Silicon-Alice-7B AWQ
- Model creator: [LakoMoor](https://huggingface.co./LakoMoor)
- Original model: [Silicon-Alice-7B](https://huggingface.co./LakoMoor/Silicon-Alice-7B)
![Silicon-Alice-7B](https://huggingface.co./LakoMoor/Silicon-Alice-7B/resolve/main/assets/alice.png)
## Model Summary
Silicon-Alice-7B is a model based on [Silicon-Masha-7B](https://huggingface.co./LakoMoor/Silicon-Masha-7B) that aims to be strong in RP, smart, **and** able to understand Russian, while following character maps very well. This model understands Russian better than the previous one and is suitable for RP/ERP and general use.
## Prompt Template (Alpaca)
I found the best SillyTavern results using the Noromaid template, but please try other templates! Let me know if you find anything good.
SillyTavern config files: [Context](https://huggingface.co./LakoMoor/Silicon-Alice-7B/resolve/main/assets/context.json), [Instruct](https://huggingface.co./LakoMoor/Silicon-Alice-7B/resolve/main/assets/instruct.json).
Additionally, here is my highly recommended [Text Completion preset](https://huggingface.co./LakoMoor/Silicon-Alice-7B/resolve/main/assets/MinP.json). You can tweak it by raising the temperature or lowering Min P to boost creativity, or by raising Min P to increase stability. You shouldn't need to touch anything else!
```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
{prompt}

### Response:
```
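
For reference, below is a minimal sketch of loading this AWQ quant with 🤗 Transformers and generating from the Alpaca template above. The repo id, the example instruction, and the sampling values are assumptions/placeholders, not settings from the original card; swap in the actual repository path and tune temperature / Min P per the preset notes.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "Silicon-Alice-7B-AWQ"  # assumption: replace with this quant's actual repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
# Transformers can load AWQ checkpoints directly when the `autoawq` package is installed.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Alpaca-style prompt, following the template shown above.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n"
    "Introduce yourself in two sentences.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# Illustrative sampling values only; adjust temperature / min_p as described above.
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.8,
    min_p=0.05,
)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```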