Tags: Text Generation · Transformers · Safetensors · mistral · medical · text-generation-inference · Inference Endpoints

This is an ongoing experiment in training and retraining boundaries.

The model is currently overtrained, deliberately so, in order to investigate paths out of overtraining.

This is purely an experiment in the depths (and depravity) of repetitive training, so don't bother messing around with it much.

Model size: 248M params · Tensor type: F32 · Format: Safetensors
Inference Examples

The serverless Inference API is not available because the repository is disabled, so the model has to be loaded locally.
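For local experimentation, a minimal sketch using the Hugging Face transformers text-generation pipeline, assuming the checkpoint can still be pulled from the Hub; the prompt and sampling settings below are illustrative, not part of the model card:

```python
# Minimal local inference sketch; assumes transformers and torch are installed
# and the jtatman/tinymistral-mediqa-248m weights are accessible.
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline

model_id = "jtatman/tinymistral-mediqa-248m"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)  # weights ship as F32 safetensors

generator = pipeline("text-generation", model=model, tokenizer=tokenizer)

# Illustrative prompt; expect repetitive output given the deliberately
# overtrained state of the checkpoint.
result = generator(
    "What are common symptoms of dehydration?",
    max_new_tokens=64,
    do_sample=True,
)
print(result[0]["generated_text"])
```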

Datasets used to train jtatman/tinymistral-mediqa-248m