---
license: apache-2.0
---
This repository contains files for a Transformer model that answers 5-digit, 6-digit, etc. addition and/or subtraction questions (e.g. 123450-345670=-0123230).
The model can have 1, 2 or 3 layers and 3 or 4 attention heads, with d_model = 510 and d_head = 170. It can be trained to do addition, subtraction, or both (aka "mixed"). An untrained mixed model can be initialised from a previously trained addition model.
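As a rough illustration of such a configuration, here is a minimal sketch assuming the model is built with TransformerLens (an assumption; the training Colab defines the actual setup, and the context length and vocabulary size below are hypothetical):

```python
from transformer_lens import HookedTransformer, HookedTransformerConfig

# Hypothetical configuration matching the hyperparameters described above:
# 2 layers, 3 heads, d_model = 510, d_head = 170 (3 * 170 = 510).
cfg = HookedTransformerConfig(
    n_layers=2,
    n_heads=3,
    d_model=510,
    d_head=170,
    n_ctx=22,      # assumed: long enough for a 6-digit question plus its answer
    d_vocab=14,    # assumed: digits 0-9 plus '+', '-', '=' and a padding token
    act_fn="relu",
)
model = HookedTransformer(cfg)
```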
The files are generated by two Colab notebooks (a sketch of loading their outputs follows the list):
- One is used to train the model, outputting a ".pt" file: https://github.com/apartresearch/Verified_addition/blob/main/assets/Accurate_Math_Train.ipynb
- One is used to analyse the model, outputting a ".json" file: https://github.com/apartresearch/Verified_addition/blob/main/assets/Accurate_Math_Analyse.ipynb
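A minimal sketch of loading the two outputs with plain PyTorch and the standard library (the file names below are hypothetical; use whatever names the Colabs actually save):

```python
import json
import torch

# Trained weights written by Accurate_Math_Train.ipynb (hypothetical file name).
state_dict = torch.load("add_d6_l2_h3.pt", map_location="cpu")
print(list(state_dict.keys())[:5])  # inspect the stored parameter names

# Analysis results written by Accurate_Math_Analyse.ipynb (hypothetical file name).
with open("add_d6_l2_h3.json") as f:
    analysis = json.load(f)
print(list(analysis.keys()))
```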