# Mistral 7B - Holodeck

## Model Description

Mistral 7B-Holodeck is a finetune of Mistral's 7B base model.

## Training data

The training data contains around 3000 ebooks in various genres. Most parts of the dataset have been prepended with the following text: `[Genre: <genre1>, <genre2>]`
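Because the training data carries this genre tag, prompts that reproduce it tend to steer the output toward the named genres. A minimal sketch of the prompt convention is below; the helper name `build_prompt` and the example genres are illustrative, not part of the model card, and the commented-out loading code assumes the standard Hugging Face `transformers` API:

```python
def build_prompt(genres, text):
    """Prepend the genre tag in the format used in the training data."""
    tag = "[Genre: " + ", ".join(genres) + "]"
    return tag + "\n" + text

prompt = build_prompt(["horror", "mystery"], "The lights in the corridor flickered.")
# → "[Genre: horror, mystery]\nThe lights in the corridor flickered."

# Generating with the model itself (downloads roughly 14 GB of FP16 weights):
# from transformers import AutoModelForCausalLM, AutoTokenizer
# tokenizer = AutoTokenizer.from_pretrained("KoboldAI/Mistral-7B-Holodeck-1")
# model = AutoModelForCausalLM.from_pretrained("KoboldAI/Mistral-7B-Holodeck-1")
# inputs = tokenizer(prompt, return_tensors="pt")
# output = model.generate(**inputs, max_new_tokens=100, do_sample=True)
# print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The same tag format works in frontends such as KoboldAI by placing it at the top of the story context.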

### Limitations and Biases
Based on known problems with NLP technology, potentially relevant factors include biases related to gender, profession, race, and religion.