rttl-ai/SentyBert
Model Details
Model Description: This model is a fine-tuned checkpoint of bert-large-uncased, fine-tuned on SST-2. It reaches an accuracy of 99.92% on the dev set.
- Developed by: rttl-ai
- Model Type: Text Classification
- Language(s): English
- License: Apache-2.0
- Resources for more information:
- The model was pre-trained with task-adaptive pre-training (TAPT) using an increased masking rate, no corruption strategy, and whole-word masking (WWM), following this paper
- fine-tuned on SST with subtrees
- fine-tuned on SST-2
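As a usage sketch, the checkpoint can be loaded through the standard transformers text-classification pipeline (an assumption: the exact label names and output format depend on the checkpoint's config, which is not shown in this card):

```python
# Hypothetical usage sketch for rttl-ai/SentyBert, assuming the checkpoint
# follows the standard transformers text-classification interface.
from transformers import pipeline

# Downloads the model from the Hugging Face Hub on first use.
classifier = pipeline("text-classification", model="rttl-ai/SentyBert")

# The pipeline returns a list with one dict per input, each containing
# a predicted label and a confidence score.
result = classifier("A gorgeous, witty, seductive movie.")[0]
print(result["label"], round(result["score"], 3))
```

The label string returned (e.g. positive vs. negative) follows whatever `id2label` mapping the model config defines.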
Datasets used to train rttl-ai/bert-large-uncased-sentiment
Evaluation results
- F1 Macro on SST-2 validation set (self-reported): 0.999
- Accuracy on SST-2 validation set (self-reported): 0.999