BERT for Sentiment Analysis of Japanese Twitter
This model was fine-tuned from BERT for Japanese Twitter, which was adapted from Tohoku NLP's Japanese BERT by continuing masked language modeling (MLM) pretraining on a Twitter corpus.
It was fine-tuned on Japanese Twitter Sentiment 1k (JTS1k), omitting the mixed-sentiment examples.
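The exact training script is not included in this card; the sketch below shows one plausible way to reproduce the fine-tuning with the Transformers Trainer API. The base-model hub id, the dataset id, and the column and label names are assumptions, not taken from this card; the label ids match the table in the next section.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base_model = "LoneWolfgang/bert-for-japanese-twitter"  # assumed hub id of the base model
label2id = {"negative": 0, "neutral": 1, "positive": 2}

tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(
    base_model,
    num_labels=3,
    id2label={v: k for k, v in label2id.items()},
    label2id=label2id,
)

# Hypothetical dataset id and column names ("text", string "label").
dataset = load_dataset("japanese-twitter-sentiment-1k")
dataset = dataset.filter(lambda ex: ex["label"] in label2id)        # drop the mixed examples
dataset = dataset.map(lambda ex: {"label": label2id[ex["label"]]})  # encode string labels as ids

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

dataset = dataset.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-for-japanese-twitter-sentiment", num_train_epochs=3),
    train_dataset=dataset["train"],
    tokenizer=tokenizer,  # enables dynamic padding via the default data collator
)
trainer.train()
```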
Labels
0 -> Negative; 1 -> Neutral; 2 -> Positive
Example Pipeline
```python
from transformers import pipeline

sentiment = pipeline("sentiment-analysis", model="LoneWolfgang/bert-for-japanese-twitter-sentiment")

# "The service at this café was disappointing. I don't think I'll go back… 😞 #disappointed"
sentiment("こちらのカフェ、サービスが残念でした。二度と行かないかな…😞 #がっかり")
# [{'label': 'negative', 'score': 0.8242}]
```
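If you want the raw class ids rather than the pipeline's string labels, the following sketch runs the same example without the pipeline and maps the predicted id through the label table above. Only the model id and the example tweet come from this card; the rest is standard Transformers and PyTorch usage.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "LoneWolfgang/bert-for-japanese-twitter-sentiment"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "こちらのカフェ、サービスが残念でした。二度と行かないかな…😞 #がっかり"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

probs = logits.softmax(dim=-1).squeeze()
pred_id = int(probs.argmax())
print(pred_id, model.config.id2label[pred_id], round(float(probs[pred_id]), 4))
# Expected: 0, the "negative" label, with a score of roughly 0.82 as in the pipeline output above.
```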