---
license: mit
language:
- en
pipeline_tag: text2text-generation
---
# Model Card for News2Topic-T5-base
## Model Details
### Model Description
- **Model type:** Text-to-Text Generation
- **Language(s) (NLP):** English
- **License:** MIT License
- **Finetuned from model:** T5 Base Model (Google AI)
## Uses
The News2Topic T5-base model is designed for automatic generation of topic names from news articles or news-like text. It can be integrated into news aggregation platforms, content management systems, or used for enhancing news browsing and searching experiences by providing concise topics.
## How to Get Started with the Model

```python
from transformers import pipeline

# Load the text-to-text generation pipeline with the News2Topic model
pipe = pipeline("text2text-generation", model="textgain/News2Topic-T5-base")

# Example usage
news_text = "Your news text here."
print(pipe(news_text))
```
## Training Details
### Training Data
The News2Topic T5-base model was trained on a 21K-article sample of the "newsroom" dataset, annotated with synthetic topic labels generated by GPT-3.5-turbo.
### Training Procedure
The model was trained for 3 epochs with a learning rate of 1e-5, a maximum sequence length of 512, and a training batch size of 12.
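The hyperparameters above can be sketched as a fine-tuning configuration. This is a minimal illustration using the Hugging Face `Seq2SeqTrainingArguments` API, not the authors' actual training script; the output directory and base-model checkpoint name are assumptions.

```python
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    Seq2SeqTrainingArguments,
)

# Hyperparameters reported in the model card; everything else is illustrative.
training_args = Seq2SeqTrainingArguments(
    output_dir="news2topic-t5-base",   # hypothetical output path
    num_train_epochs=3,                # 3 epochs
    learning_rate=1e-5,                # 0.00001
    per_device_train_batch_size=12,    # batch size 12
)

# Base checkpoint (T5 Base); inputs would be truncated to 512 tokens.
tokenizer = AutoTokenizer.from_pretrained("t5-base", model_max_length=512)
model = AutoModelForSeq2SeqLM.from_pretrained("t5-base")
```

A `Seq2SeqTrainer` would then combine these arguments with the tokenized dataset; that part is omitted here since the card does not describe it.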
## Citation
**BibTeX:**
```bibtex
@article{Kosar_DePauw_Daelemans_2024,
  title={Comparative Evaluation of Topic Detection: Humans vs. LLMs},
  volume={13},
  url={https://www.clinjournal.org/clinj/article/view/173},
  journal={Computational Linguistics in the Netherlands Journal},
  author={Kosar, Andriy and De Pauw, Guy and Daelemans, Walter},
  year={2024},
  month={Mar.},
  pages={91–120}
}
```