# ai-msgbot GPT2-L + daily dialogues
_NOTE: this model card is a WIP_
GPT2-L (774M parameters) was first trained on the Wizard of Wikipedia dataset for 40k steps with 34 of 36 layers frozen, using `aitextgen`. It was then fine-tuned on the [Daily Dialogues](http://yanran.li/dailydialog) dataset for a further 40k steps, this time with **35** of 36 layers frozen.
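For reference, the layer-freezing step can be sketched with Hugging Face `transformers` (the training itself used `aitextgen`, which wraps this). A tiny randomly initialised config stands in for GPT2-L here so the snippet runs quickly; the real model uses `n_layer=36` with much larger embeddings.

```python
# Hedged sketch of freezing 34 of 36 GPT-2 transformer blocks,
# assuming the standard Hugging Face GPT-2 architecture.
from transformers import GPT2Config, GPT2LMHeadModel

# Tiny stand-in config: same 36-block depth as gpt2-large, small width.
config = GPT2Config(n_layer=36, n_embd=64, n_head=4)
model = GPT2LMHeadModel(config)

# Freeze the first 34 blocks; only the last two receive gradient updates.
for block in model.transformer.h[:34]:
    for param in block.parameters():
        param.requires_grad = False

frozen = sum(
    1
    for block in model.transformer.h
    if all(not p.requires_grad for p in block.parameters())
)
print(frozen)  # → 34
```

For the second training stage, the slice would simply change to `model.transformer.h[:35]`, leaving a single trainable block.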
Designed for use with [ai-msgbot](https://github.com/pszemraj/ai-msgbot).