---
library_name: transformers
license: apache-2.0
base_model: mistralai/Mistral-7B-v0.1
datasets:
- andysalerno/ansalern-nectar-inputoutput
---

This is [mistralai/Mistral-7B-v0.1](https://huggingface.co./mistralai/Mistral-7B-v0.1) with the ChatML special tokens added, then lightly fine-tuned with supervised fine-tuning (SFT) on a ChatML-formatted dataset: [andysalerno/ansalern-nectar-inputoutput](https://huggingface.co./datasets/andysalerno/ansalern-nectar-inputoutput).
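
Because the ChatML special tokens are already in the tokenizer, prompts should use the `<|im_start|>` / `<|im_end|>` turn markers. Below is a minimal sketch of prompting with the ChatML format via `transformers`; the repo id is a placeholder, not this model's actual Hub id:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: substitute this model's actual Hub repo id or a local path.
model_id = "path/to/this-model"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# ChatML wraps each turn in <|im_start|>{role} ... <|im_end|> markers.
prompt = (
    "<|im_start|>system\n"
    "You are a helpful assistant.<|im_end|>\n"
    "<|im_start|>user\n"
    "What is the capital of France?<|im_end|>\n"
    "<|im_start|>assistant\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```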
The training was very light, so while this model correctly follows ChatML formatting, it is not intended to be a chat model.
Rather, it is intended as a base for further fine-tuning of models that will use ChatML.
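
As a rough illustration of that workflow (not part of the original training setup), the sketch below continues ChatML-style supervised fine-tuning from this checkpoint with TRL's `SFTTrainer`. The repo id, output directory, and toy dataset are placeholders, and TRL argument names vary between releases:

```python
from datasets import Dataset
from trl import SFTConfig, SFTTrainer

# Placeholder: substitute this model's actual Hub repo id or a local path.
model_id = "path/to/this-model"

# Toy ChatML-formatted examples; replace with a real ChatML dataset.
train_dataset = Dataset.from_dict({
    "text": [
        "<|im_start|>user\nSay hello.<|im_end|>\n"
        "<|im_start|>assistant\nHello!<|im_end|>\n",
    ]
})

trainer = SFTTrainer(
    model=model_id,                        # the trainer loads the model and tokenizer from this id
    train_dataset=train_dataset,           # uses the "text" column by default
    args=SFTConfig(output_dir="./chatml-sft"),
)
trainer.train()
```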