
The official base model weights for the paper "Efficient Continual Pre-training by Mitigating the Stability Gap".

The model was continually pretrained on a high-quality medical sub-corpus drawn from the RefinedWeb dataset.
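A minimal usage sketch for loading the weights with the Hugging Face `transformers` library. This assumes the standard `AutoModelForCausalLM`/`AutoTokenizer` API and a machine with enough memory for 8B parameters; the prompt is an illustrative placeholder, not from the model card.

```python
# Minimal sketch: load the base model and generate a continuation.
# Assumes `transformers` (and a suitable torch backend) is installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "YiDuo1999/Llama-3-Physician-8B-Base"


def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Load the model and return the decoded continuation of `prompt`."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


if __name__ == "__main__":
    # Example prompt; the base model does plain text completion,
    # not instruction following.
    print(generate("A common symptom of hypertension is"))
```

Note that this is a base (non-instruction-tuned) model, so it is best used for text completion or as a starting point for further fine-tuning.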


Model: YiDuo1999/Llama-3-Physician-8B-Base
