DataLLM v2 Mixtral 8x7B

This is a fine-tuned version of mistralai/Mixtral-8x7B-v0.1, trained for a single epoch on mostlyai/datallm-instructs-v2 to efficiently answer row-completion prompts.

See https://github.com/mostly-ai/datallm for more details.
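
The snippet below is a minimal sketch of loading the model with the Hugging Face transformers library and sending it a prompt. The column names and prompt layout are illustrative assumptions only; the actual row-completion prompt template is defined in the DataLLM repository linked above.

```python
# Minimal sketch: load the model in BF16 and generate a row completion.
# The prompt format shown here is a hypothetical example, not the official
# DataLLM template (see the linked repository for the real one).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mostlyai/datallm-v2-mixtral-8x7b-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # weights are stored in BF16
    device_map="auto",           # requires enough GPU memory for 46.7B params
)

# Hypothetical row-completion prompt: given some column values,
# ask the model to fill in the missing cell.
prompt = "age: 34, country: Austria, occupation:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=8)

# Print only the newly generated tokens (the completed cell value).
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```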

Model size: 46.7B params · Tensor type: BF16 · Format: Safetensors

Model: mostlyai/datallm-v2-mixtral-8x7b-v0.1 (2 quantized versions available)
Dataset used to train: mostlyai/datallm-instructs-v2