---
language:
- en
tags:
- text generation
- pytorch
- causal-lm
- gpt_neox
license: mit
datasets:
- hoskinson-center/proof-pile
---
# ProofGPT-v0.1
## Model Description
ProofGPT-v0.1 is a 1.3B parameter language model based on the GPT-NeoX architecture and trained on the [proof-pile](https://huggingface.co./datasets/hoskinson-center/proof-pile) (v1.1).
The model is initialized with [pythia-1.3b](https://huggingface.co./EleutherAI/pythia-1.3b) weights. ProofGPT-v0.1's Weights & Biases training log is viewable [here](https://wandb.ai/zhangir-azerbayev/math-lm/groups/1.3B%20preliminary_227qly9c/workspace?workspace=user-zhangir-azerbayev).
Detailed evaluations coming soon :)
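As a sketch of how the model might be loaded for inference with the `transformers` library — the repo ID below is assumed from this card's title, so verify the exact name on the Hub before use:

```python
# Minimal generation sketch for ProofGPT-v0.1 (illustrative, not official usage).
# REPO_ID is assumed from this card's title; confirm it on the Hugging Face Hub.
REPO_ID = "hoskinson-center/proofGPT-v0.1"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    """Continue `prompt` with greedy decoding. Requires `transformers` and `torch`."""
    # Imported lazily so the module loads without the heavy dependencies installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
    model = AutoModelForCausalLM.from_pretrained(REPO_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Theorem. The sum of two even integers is even.\nProof."))
```

Note that the 1.3B-parameter checkpoint is several gigabytes, so the first call downloads and caches the weights.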
**Note**: Commit `3bcdc4e` replaced the weights with a model trained on proof-pile v1.1; previous commits were trained on v1.0. Commit `9695b51` updated the tokenizer to include bos, eos, and unk special tokens.