Overview

DerivationGeneration8B is a QLoRA fine-tune of a quantised Llama-3.1-8B checkpoint, trained on 15K synthetic mathematical derivations in LaTeX (each containing 4–10 equations) via a custom script, with ROUGE as the validation metric for early stopping (6 epochs in total). This approach outperforms MathT5 on both the in-distribution and the perturbed evaluation cases presented in related work.

How to use

A notebook for inference is available here. The model expects prompts formatted with the ChatML template (full details in the notebook).

Example prompt

```python
prompt = "Given \\cos{(q)} = \\theta{(q)}, then derive - \\sin{(q)} = \\frac{d}{d q} \\theta{(q)}, then obtain (- \\sin{(q)})^{q} (\\frac{d}{d q} \\cos{(q)})^{q} = (- \\sin{(q)})^{2 q}"
```
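The notebook handles the ChatML wrapping before generation. As a rough sketch of what that formatting looks like (the exact system message and template are defined in the notebook, so the details below, including the `to_chatml` helper and its default system message, are assumptions for illustration):

```python
def to_chatml(user_prompt: str,
              system: str = "You are a mathematical derivation assistant.") -> str:
    """Wrap a prompt in ChatML turns, leaving the assistant turn open
    so the model continues from there during generation."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user_prompt}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = "Given \\cos{(q)} = \\theta{(q)}, then derive - \\sin{(q)} = \\frac{d}{d q} \\theta{(q)}"
print(to_chatml(prompt))
```

The formatted string can then be tokenised and passed to the model as usual; tokenisers that ship a chat template can produce the same wrapping via `apply_chat_template`.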

Framework versions

  • PEFT 0.12.0