---
license: cc-by-sa-4.0
language:
- de
- en
- es
- da
- pl
- sv
- nl
metrics:
- accuracy
pipeline_tag: text-classification
tags:
- partypress
- political science
- parties
- press releases
---
# PARTYPRESS multilingual
Fine-tuned model in seven languages on texts from nine countries, based on [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased). It is used in Erfort et al. (2023).
## Model description

tbd
## Model variations
tbd (monolingual)
## Intended uses & limitations
tbd
### How to use
tbd
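A minimal usage sketch with the 🤗 Transformers `pipeline` API, which matches this card's `text-classification` pipeline tag. The Hub identifier `partypress/partypress-multilingual` and the example sentence are assumptions; substitute the actual repository name and your own press release text.

```python
# A minimal sketch, not the authors' documented usage. The Hub identifier
# below is an assumption; replace it with this model's actual repository.
from transformers import pipeline

model_id = "partypress/partypress-multilingual"  # assumed identifier

# Build a text-classification pipeline from the fine-tuned checkpoint.
classifier = pipeline("text-classification", model=model_id)

# A hypothetical German press-release snippet (the model covers seven languages).
text = "Die Bundesregierung muss mehr in den Ausbau erneuerbarer Energien investieren."
print(classifier(text))
# e.g. [{'label': '...', 'score': 0.97}] -- labels depend on the fine-tuning task
```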
### Limitations and bias
tbd
## Training data

For the training data, please refer to [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased).
## Training procedure

### Preprocessing

For the preprocessing, please refer to [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased).
### Pretraining

For the pretraining, please refer to [bert-base-multilingual-cased](https://huggingface.co/bert-base-multilingual-cased).
## Evaluation results
Fine-tuned on our downstream task, this model achieves the following results:
## BibTeX entry and citation info

```bibtex
@article{erfort_partypress_2023,
  author  = {Cornelius Erfort and Lukas F. Stoetzer and Heike Klüver},
  title   = {The PARTYPRESS Database: A New Comparative Database of Parties’ Press Releases},
  journal = {Research and Politics},
  volume  = {forthcoming},
  year    = {2023},
}
```