Update README.md
README.md CHANGED

@@ -30,7 +30,7 @@ print(pipe(news_text))
 
 ## Training Details
 
-The News2Topic T5-base model was trained on a 21K sample of the "
+The News2Topic T5-base model was trained on a 21K sample of the "Newsroom" dataset (https://lil.nlp.cornell.edu/newsroom/index.html) annotated with synthetic data generated by GPT-3.5-turbo
 
 The model was trained for 3 epochs, with a learning rate of 0.00001, a maximum sequence length of 512, and a training batch size of 12.
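For readers who want to reproduce this setup, the hyperparameters stated in the diff map onto a Hugging Face `Seq2SeqTrainingArguments` configuration roughly as sketched below. This is an illustrative sketch, not the authors' actual training script: the output directory and checkpoint name are assumptions, and applying the 512-token maximum length at tokenization time is the conventional place for it, not something the README specifies.

```python
from transformers import AutoTokenizer, Seq2SeqTrainingArguments

# Hyperparameters as stated in the Training Details section;
# output_dir is illustrative.
args = Seq2SeqTrainingArguments(
    output_dir="news2topic-t5-base",
    num_train_epochs=3,               # "trained for 3 epochs"
    learning_rate=1e-5,               # 0.00001
    per_device_train_batch_size=12,   # "training batch size of 12"
)

# The 512-token maximum sequence length is applied when tokenizing
# the input articles (checkpoint name assumed to be t5-base):
tokenizer = AutoTokenizer.from_pretrained("t5-base")
news_text = "Example news article text ..."
inputs = tokenizer(news_text, max_length=512, truncation=True)
```

These arguments would then be passed to a `Seq2SeqTrainer` together with the model and the tokenized Newsroom sample.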