Model Card for mistral7b-v0.3-ultrachat200k

Trained using LoRA (r=32) for 1 epoch on 208k chat examples from HuggingFaceH4/ultrachat_200k with max_seq_len 16384.
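A minimal sketch of how such a run could be reproduced with peft and trl's SFTTrainer. Only the base model family, the dataset, r=32, 1 epoch, and max_seq_len 16384 come from this card; the remaining hyperparameters (lora_alpha, dropout, target modules, batch size defaults) and the exact training stack are assumptions for illustration.

```python
# Sketch only: assumes trl with SFTConfig/SFTTrainer, peft, datasets, and the
# base model mistralai/Mistral-7B-v0.3. Values not stated in the card are
# marked as assumed. Parameter names (e.g. max_seq_length) vary across trl versions.
from datasets import load_dataset
from peft import LoraConfig
from trl import SFTConfig, SFTTrainer

# UltraChat 200k supervised fine-tuning split (~208k examples).
dataset = load_dataset("HuggingFaceH4/ultrachat_200k", split="train_sft")

peft_config = LoraConfig(
    r=32,                      # LoRA rank stated in the card
    lora_alpha=64,             # assumed, not stated in the card
    lora_dropout=0.05,         # assumed, not stated in the card
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed
    task_type="CAUSAL_LM",
)

training_args = SFTConfig(
    output_dir="mistral7b-v0.3-ultrachat200k",
    num_train_epochs=1,        # 1 epoch, as stated in the card
    max_seq_length=16384,      # max_seq_len, as stated in the card
    bf16=True,                 # matches the BF16 weights of the released model
)

trainer = SFTTrainer(
    model="mistralai/Mistral-7B-v0.3",  # assumed base checkpoint
    train_dataset=dataset,
    peft_config=peft_config,
    args=training_args,
)
trainer.train()
```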

Format: Safetensors
Model size: 7.25B params
Tensor type: BF16