---
language:
- en
tags:
- text generation
- pytorch
- causal-lm
- gpt_neox
license: mit
datasets:
- hoskinson-center/proof-pile
---
|
|
|
# ProofGPT-v0.1 |
|
|
|
## Model Description
|
ProofGPT-v0.1 is a 1.3B parameter language model based on the GPT-NeoX architecture and trained on the [proof-pile](https://huggingface.co./datasets/hoskinson-center/proof-pile) (v1.1). |
|
The model is initialized with [pythia-1.3b](https://huggingface.co./EleutherAI/pythia-1.3b) weights. ProofGPT-v0.1's Weights & Biases training log is viewable [here](https://wandb.ai/zhangir-azerbayev/math-lm/groups/1.3B%20preliminary_227qly9c/workspace?workspace=user-zhangir-azerbayev). |
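The model can be loaded with the standard `transformers` causal-LM API. A minimal sketch (the repository ID `hoskinson-center/proofGPT-v0.1` and the example prompt are assumptions for illustration):

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Repository ID is assumed; substitute the actual model ID on the Hub.
model_id = "hoskinson-center/proofGPT-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Prompt the model with the opening of a proof.
prompt = "Theorem. The sum of two even integers is even.\nProof."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since the model was trained on the proof-pile, prompts styled as mathematical text (theorem statements, LaTeX) tend to be a natural fit.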
|
|
|
Detailed evaluations coming soon :) |
|
|
|
**Note**: Commit `9695b51` updated the tokenizer to include BOS, EOS, and UNK tokens.