---
base_model: unsloth/Meta-Llama-3.1-8B-bnb-4bit
library_name: peft
---
**Overview**
DerivationGeneration8B is a QLoRA fine-tune of a quantised Llama-3.1-8B checkpoint, trained on 15K synthetic mathematical derivations written in LaTeX (each containing 4–10 equations) with a custom early-stopping script that uses ROUGE as the validation metric. The LoRA was trained for 6 epochs, and the approach outperforms [MathT5](https://huggingface.co./jmeadows17/MathT5-large) in both the in-distribution and perturbed evaluation settings presented in the related work (https://arxiv.org/abs/2307.09998).
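The snippet below is a minimal sketch of this kind of setup, not the released training script (see the repository linked under *How to use*): the LoRA hyperparameters, target modules, ROUGE variant (rougeL), patience, and the `train_ds`/`val_ds` variables are illustrative assumptions; only the base checkpoint, the 6-epoch budget, and ROUGE-based early stopping come from the description above.

```python
# Minimal QLoRA sketch with ROUGE-based early stopping.
# Assumptions: tokenised `train_ds` / `val_ds` already exist; LoRA
# hyperparameters are illustrative, not the authors' exact settings.
import numpy as np
import evaluate
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, EarlyStoppingCallback)
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

base = "unsloth/Meta-Llama-3.1-8B-bnb-4bit"
tok = AutoTokenizer.from_pretrained(base)
tok.pad_token = tok.eos_token  # Llama tokenizers ship without a pad token

model = AutoModelForCausalLM.from_pretrained(base, device_map="auto")
model = prepare_model_for_kbit_training(model)  # required before k-bit training
model = get_peft_model(model, LoraConfig(
    r=16, lora_alpha=32, lora_dropout=0.05, task_type="CAUSAL_LM",
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"]))

rouge = evaluate.load("rouge")

def preprocess_logits(logits, labels):
    # keep only predicted token ids so evaluation doesn't store full logits
    return logits.argmax(dim=-1)

def compute_metrics(eval_pred):
    preds, labels = eval_pred
    labels = np.where(labels != -100, labels, tok.pad_token_id)  # undo label masking
    pred_str = tok.batch_decode(preds, skip_special_tokens=True)
    label_str = tok.batch_decode(labels, skip_special_tokens=True)
    return {"rougeL": rouge.compute(predictions=pred_str,
                                    references=label_str)["rougeL"]}

args = TrainingArguments(
    output_dir="DerivationGeneration8B",
    num_train_epochs=6,  # stated in the model card
    eval_strategy="epoch", save_strategy="epoch",
    load_best_model_at_end=True,
    metric_for_best_model="rougeL", greater_is_better=True)

trainer = Trainer(
    model=model, args=args,
    train_dataset=train_ds, eval_dataset=val_ds,  # assumed tokenised datasets
    compute_metrics=compute_metrics,
    preprocess_logits_for_metrics=preprocess_logits,
    callbacks=[EarlyStoppingCallback(early_stopping_patience=2)])
trainer.train()
```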
**How to use**
A notebook for inference is available [here](https://github.com/jmeadows17/deriving-equations-with-LLMs/blob/main/llama_evaluation.ipynb), and the training scripts are available in the same repository.
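For quick experimentation outside the notebook, the sketch below loads the 4-bit base checkpoint and attaches the adapter with PEFT. The adapter id `jmeadows17/DerivationGeneration8B` is an assumption based on this card's title; substitute the actual Hub id if it differs.

```python
# Sketch: load the 4-bit base checkpoint, then attach this LoRA adapter.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base = "unsloth/Meta-Llama-3.1-8B-bnb-4bit"
adapter = "jmeadows17/DerivationGeneration8B"  # assumed Hub id for this card

tok = AutoTokenizer.from_pretrained(base)
model = AutoModelForCausalLM.from_pretrained(base, device_map="auto")
model = PeftModel.from_pretrained(model, adapter)  # wraps the base model
model.eval()
```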
**Example prompt**
```python
prompt = "Given \\cos{(q)} = \\theta{(q)}, then derive - \\sin{(q)} = \\frac{d}{d q} \\theta{(q)}, then obtain (- \\sin{(q)})^{q} (\\frac{d}{d q} \\cos{(q)})^{q} = (- \\sin{(q)})^{2 q}"
```
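A derivation can then be generated from this prompt with the model loaded above; greedy decoding and the token budget here are illustrative assumptions, not settings from the paper.

```python
# Continuing from the loading sketch above: generate the derivation.
import torch

inputs = tok(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=256, do_sample=False)
# strip the prompt tokens and print only the newly generated derivation
print(tok.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```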
**Framework versions**
- PEFT 0.12.0