---
language:
- en
tags:
- text generation
- pytorch
- causal-lm
- gpt_neox
license: mit
datasets:
- hoskinson-center/proof-pile
---
# ProofGPT-v0.1
## Model Description
ProofGPT-v0.1 is a 1.3B parameter language model based on the GPT-NeoX architecture and trained on the [proof-pile](https://huggingface.co./datasets/hoskinson-center/proof-pile) dataset.
The model is initialized with [pythia-1.3b](https://huggingface.co./EleutherAI/pythia-1.3b) weights. ProofGPT-v0.1's Weights & Biases training log is viewable [here](https://wandb.ai/zhangir-azerbayev/math-lm/groups/1.3B%20preliminary_227qly9c/workspace?workspace=user-zhangir-azerbayev).
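Since the model uses a standard GPT-NeoX architecture, it should load with the 🤗 Transformers auto classes. A minimal usage sketch is below; the repo id `hoskinson-center/proofGPT-v0.1` is assumed from the dataset's namespace, and the prompt is only illustrative.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Assumed repo id (inferred from the proof-pile dataset namespace);
# adjust if the model lives under a different organization.
model_id = "hoskinson-center/proofGPT-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Sample a short continuation of a mathematical prompt.
prompt = "Theorem. Every finite integral domain is a field.\nProof."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```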
Detailed evaluations coming soon :)