MARTINI_enrich_BERTopic_afldscc
This is a BERTopic model. BERTopic is a flexible and modular topic modeling framework that generates easily interpretable topics from large document collections.
Usage
To use this model, please install BERTopic:
```
pip install -U bertopic
```
You can use the model as follows:
```python
from bertopic import BERTopic

topic_model = BERTopic.load("AIDA-UPM/MARTINI_enrich_BERTopic_afldscc")

topic_model.get_topic_info()
```
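Beyond reading the topic table, the loaded model can assign topics to unseen text. A minimal sketch, assuming `topic_model` has been loaded as above; the example documents are placeholders, not part of the training data:

```python
# Placeholder documents for illustration only
new_docs = [
    "Physicians testify against the proposed vaccine mandate bill.",
    "Homeschool resources and curricula for parents.",
]

# transform() maps new documents onto the learned topics;
# per-topic probabilities are returned because calculate_probabilities=True
topics, probs = topic_model.transform(new_docs)
print(topics)
```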
Topic overview
- Number of topics: 14
- Number of training documents: 1099
The table below lists all topics; a short snippet for inspecting them programmatically follows the table.
| Topic ID | Topic Keywords | Topic Frequency | Label |
|---|---|---|---|
| -1 | vaccine - cdc - ivermectin - zelenko - 2022 | 24 | -1_vaccine_cdc_ivermectin_zelenko |
| 0 | physicians - rilegislature - california - unconstitutional - hb2280 | 549 | 0_physicians_rilegislature_california_unconstitutional |
| 1 | vaccine - reinstated - mandates - cuomo - refusing | 76 | 1_vaccine_reinstated_mandates_cuomo |
| 2 | freedrgold - simone - supporters - pma - sentencing | 57 | 2_freedrgold_simone_supporters_pma |
| 3 | reawaken - stateline - clark - event - speedway | 55 | 3_reawaken_stateline_clark_event |
| 4 | freedom - days - injustices - flyer - defendants | 50 | 4_freedom_days_injustices_flyer |
| 5 | scotus - redress - tyranny - senators - brunson | 47 | 5_scotus_redress_tyranny_senators |
| 6 | vaccine - myocarditis - paxlovid - deaths - 2021 | 42 | 6_vaccine_myocarditis_paxlovid_deaths |
| 7 | citizencorps - aflds - meeting - dana - joined | 38 | 7_citizencorps_aflds_meeting_dana |
| 8 | pfizer - fauci - publicis - disinformation - fbi | 38 | 8_pfizer_fauci_publicis_disinformation |
| 9 | homeschool - educate - resources - christa - explore | 37 | 9_homeschool_educate_resources_christa |
| 10 | novavax - fda - injections - infants - 2022 | 31 | 10_novavax_fda_injections_infants |
| 11 | lockdowns - masks - effects - harmful - kaiser | 29 | 11_lockdowns_masks_effects_harmful |
| 12 | pandemics - stopthewho - sovereignty - amendments - geneva | 26 | 12_pandemics_stopthewho_sovereignty_amendments |
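The keywords above come from the model's topic representation. A minimal sketch for inspecting a single topic, assuming the model has been loaded as in the Usage section (topic 0 is used only as an example):

```python
# Top keywords and their c-TF-IDF scores for topic 0
print(topic_model.get_topic(0))

# Full per-topic summary as a pandas DataFrame
info = topic_model.get_topic_info()
print(info[["Topic", "Count", "Name"]].head())
```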
Training hyperparameters
- calculate_probabilities: True
- language: None
- low_memory: False
- min_topic_size: 10
- n_gram_range: (1, 1)
- nr_topics: None
- seed_topic_list: None
- top_n_words: 10
- verbose: False
- zeroshot_min_similarity: 0.7
- zeroshot_topic_list: None
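For reference, these hyperparameters correspond to arguments of the `BERTopic` constructor. A minimal sketch of how a model with this configuration could be instantiated; the embedding, dimensionality-reduction, and clustering components are not specified in this card, so library defaults are assumed, and `docs` is a placeholder for the 1099 training documents:

```python
from bertopic import BERTopic

# Configuration mirroring the hyperparameters listed above.
# language=None typically means an explicit embedding_model was supplied at
# training time; none is named in this card, so defaults are used here.
topic_model = BERTopic(
    calculate_probabilities=True,
    language=None,
    low_memory=False,
    min_topic_size=10,
    n_gram_range=(1, 1),
    nr_topics=None,
    seed_topic_list=None,
    top_n_words=10,
    verbose=False,
    zeroshot_min_similarity=0.7,
    zeroshot_topic_list=None,
)

# docs would be the list of 1099 training documents (not distributed with this card)
# topics, probs = topic_model.fit_transform(docs)
```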
Framework versions
- Numpy: 1.26.4
- HDBSCAN: 0.8.40
- UMAP: 0.5.7
- Pandas: 2.2.3
- Scikit-Learn: 1.5.2
- Sentence-transformers: 3.3.1
- Transformers: 4.46.3
- Numba: 0.60.0
- Plotly: 5.24.1
- Python: 3.10.12