GPT2 Models (collection)
All GPT-2 models in this collection were trained from scratch.
This is a Polish GPT-2 model with the small architecture.
The model was released on 30.11.2023 and is the newest version of radlab/polish-gpt2-small
(https://huggingface.co./radlab/polish-gpt2-small)
Data used to train this model:
In total this is about 30.5 GB of data, three times more than the previous version.
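A minimal usage sketch for the checkpoint named above, assuming the `transformers` library is installed and the model can be downloaded from the Hugging Face Hub (the prompt string is only an illustrative example):

```python
# Sketch: load radlab/polish-gpt2-small via the transformers text-generation
# pipeline and generate a short Polish continuation.
from transformers import pipeline

# The model identifier matches the Hub link in the card above.
generator = pipeline("text-generation", model="radlab/polish-gpt2-small")

# Generate a continuation for a sample Polish prompt.
result = generator("Dzisiaj w Warszawie", max_new_tokens=20, num_return_sequences=1)
print(result[0]["generated_text"])
```

The pipeline handles tokenization and decoding; for finer control you can load `AutoTokenizer` and `AutoModelForCausalLM` with the same identifier instead.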