The National Library of Sweden / KBLab releases a robust, multi-label sentiment classifier finetuned from Megatron-BERT-large-165k. The model was trained on approximately 75K Swedish texts drawn from multiple linguistic domains and datasets.
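Below is a minimal usage sketch with the 🤗 Transformers pipeline API. The model identifier is an assumption and should be replaced with this repository's actual name on the Hub; the example texts are illustrative only.

```python
from transformers import pipeline

# Assumed model id -- replace with the actual Hub repository name.
classifier = pipeline(
    "text-classification",
    model="KBLab/robust-swedish-sentiment-multiclass",
    top_k=None,  # return scores for every label (multi-label output)
)

texts = [
    "Jag älskar den här filmen!",     # "I love this movie!"
    "Det var en riktigt dålig dag.",  # "It was a really bad day."
]

# The pipeline returns one list of {label, score} dicts per input text.
for text, scores in zip(texts, classifier(texts)):
    print(text)
    for item in scores:
        print(f"  {item['label']}: {item['score']:.3f}")
```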
There is a post on the KBLab blog describing the model in further detail.
@misc{hägglöf2023a,
  author = {Hägglöf, Hillevi},
  title  = {The KBLab Blog: A robust, multi-label sentiment classifier for Swedish},
  url    = {https://kb-labb.github.io/posts/2023-06-16-a-robust-multi-label-sentiment-classifier-for-swedish/},
  year   = {2023}
}