---
language:
- en
tags:
- text generation
- pytorch
- causal-lm
- gpt_neox
license: mit
datasets:
- hoskinson-center/proof-pile
---

# ProofGPT-v0.1

## Model Description
ProofGPT-v0.1 is a 6.7B-parameter language model based on the GPT-NeoX architecture and trained on the [proof-pile](https://huggingface.co./datasets/hoskinson-center/proof-pile) (v1.1).
We initialized training with pythia-6.7b weights, a precursor to the [pythia-6.9b](https://huggingface.co./EleutherAI/pythia-6.9b) model that has roughly equivalent performance.
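Since the model follows the standard GPT-NeoX causal-LM architecture, it should load through the usual `transformers` auto classes. A minimal sketch (the Hub repo id used below is an assumption — check this model page for the canonical name):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical repo id; substitute the actual id shown on this model page.
model_name = "hoskinson-center/proofGPT-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Greedy completion of a short mathematical prompt.
prompt = "Theorem. The sum of two even integers is even.\nProof."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that loading the full 6.7B-parameter checkpoint requires substantial memory; passing `torch_dtype=torch.float16` to `from_pretrained` roughly halves the footprint on GPU.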

Detailed evaluations coming soon :)