---
license: openrail
pipeline_tag: text-generation
---
**Overview**

MathT5-large is a version of FLAN-T5-large fine-tuned for 25 epochs on 15K synthetic mathematical derivations in LaTeX (each containing 5-9 equations), generated using a symbolic solver (SymPy).
It outperforms GPT-4 and ChatGPT (paper link coming soon) on a derivation generation task in terms of ROUGE, BLEU, BLEURT, and GLEU scores, and shows some generalisation capabilities.
It was trained on 155 physics symbols, but struggles with out-of-vocabulary symbols.
**Example prompt:**

```python
prompt = "Given \\cos{(q)} = \\theta{(q)}, then derive - \\sin{(q)} = \\frac{d}{d q} \\theta{(q)}, then obtain (- \\sin{(q)})^{q} (\\frac{d}{d q} \\cos{(q)})^{q} = (- \\sin{(q)})^{2 q}"
```
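
To run this prompt through the model directly with the `transformers` library (rather than the helper script described below), a minimal sketch like the following should work; it assumes the checkpoint loads as a standard FLAN-T5-style seq2seq model, and the generation settings are illustrative rather than taken from the paper:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumption: the checkpoint is a standard T5-style seq2seq model on the Hub.
tokenizer = AutoTokenizer.from_pretrained("jmeadows17/MathT5-large")
model = AutoModelForSeq2SeqLM.from_pretrained("jmeadows17/MathT5-large")

# `prompt` is the LaTeX derivation prompt defined in the example above.
inputs = tokenizer(prompt, return_tensors="pt")

# max_new_tokens is an illustrative choice, not a value from the paper.
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```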
**To use MathT5 easily:**

1. Download ```MathT5.py``` to your working directory.
2. ```from MathT5 import load_model, inference```
3. ```tokenizer, model = load_model("jmeadows17/MathT5-large")```
4. ```inference(prompt, tokenizer, model)``` (a combined sketch follows below)
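
Putting these steps together, a minimal end-to-end sketch (assuming ```MathT5.py``` from this repo is in your working directory and that `inference` returns the generated derivation as a string):

```python
from MathT5 import load_model, inference  # helper script from this repository

# Load the tokenizer and fine-tuned model from the Hugging Face Hub.
tokenizer, model = load_model("jmeadows17/MathT5-large")

# The example prompt from above: a premise followed by derivation instructions in LaTeX.
prompt = "Given \\cos{(q)} = \\theta{(q)}, then derive - \\sin{(q)} = \\frac{d}{d q} \\theta{(q)}, then obtain (- \\sin{(q)})^{q} (\\frac{d}{d q} \\cos{(q)})^{q} = (- \\sin{(q)})^{2 q}"

# Generate the derivation (assumed here to be returned as a LaTeX string).
print(inference(prompt, tokenizer, model))
```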