---
language:
- en
thumbnail: >-
  https://cdn.theatlantic.com/thumbor/lx3Vy9ojT2A0mHVooAUARLRpUEc=/0x215:3500x2184/976x549/media/img/mt/2018/03/RTR4F51Y/original.jpg
tags:
- text-classification
- sentiment-analysis
- poem-sentiment-detection
- poem-sentiment
license: apache-2.0
datasets:
- poem_sentiment
metrics:
- accuracy
- f1
---
## nickwong64/bert-base-uncased-poems-sentiment
BERT is a bidirectional Transformer encoder architecture pretrained with a masked language modeling (MLM) objective. This model is bert-base-uncased fine-tuned on the poem_sentiment dataset using the Hugging Face Trainer with the training parameters below:
- learning_rate: 2e-5
- batch_size: 8
- num_train_epochs: 8
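The card does not include the training script itself; the following is a minimal sketch of how this fine-tune could be reproduced with the hyperparameters above. The `verse_text`/`label` column names come from the poem_sentiment dataset, while the output directory, evaluation strategy, and the weighted F1 averaging are illustrative assumptions rather than the author's exact setup.

```python
# Sketch of reproducing the fine-tune; output_dir, evaluation strategy,
# and weighted-F1 averaging are assumptions, not the author's exact config.
import numpy as np
from datasets import load_dataset
from sklearn.metrics import accuracy_score, f1_score
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

dataset = load_dataset('poem_sentiment')  # splits: train / validation / test
tokenizer = AutoTokenizer.from_pretrained('bert-base-uncased')

def tokenize(batch):
    # poem_sentiment stores each verse in the "verse_text" column
    return tokenizer(batch['verse_text'], truncation=True)

encoded = dataset.map(tokenize, batched=True)
model = AutoModelForSequenceClassification.from_pretrained(
    'bert-base-uncased', num_labels=4)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {'accuracy': accuracy_score(labels, preds),
            'f1': f1_score(labels, preds, average='weighted')}

args = TrainingArguments(
    output_dir='bert-poems-sentiment',   # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    num_train_epochs=8,
    evaluation_strategy='epoch')         # renamed eval_strategy in newer releases

trainer = Trainer(model=model, args=args,
                  train_dataset=encoded['train'],
                  eval_dataset=encoded['validation'],
                  tokenizer=tokenizer,   # enables dynamic padding via the default collator
                  compute_metrics=compute_metrics)
trainer.train()
```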
## Model Performance

| Epoch | Training Loss | Validation Loss | Accuracy | F1 |
|---|---|---|---|---|
| 8 | 0.468200 | 0.458632 | 0.904762 | 0.899756 |
## How to Use the Model

```python
from transformers import pipeline

nlp = pipeline(task='text-classification',
               model='nickwong64/bert-base-uncased-poems-sentiment')
p1 = "No man is an island, Entire of itself, Every man is a piece of the continent, A part of the main."
p2 = "Ten years, dead and living dim and draw apart. I don’t try to remember, But forgetting is hard."
p3 = "My mind to me a kingdom is; Such present joys therein I find,That it excels all other bliss"
print(nlp(p1))
print(nlp(p2))
print(nlp(p3))
"""
output:
[{'label': 'no_impact', 'score': 0.9982421398162842}]
[{'label': 'negative', 'score': 0.9856176972389221}]
[{'label': 'positive', 'score': 0.9931322932243347}]
"""
```
## Dataset

The model was fine-tuned on [poem_sentiment](https://huggingface.co/datasets/poem_sentiment).

## Labels

```python
{0: 'negative', 1: 'positive', 2: 'no_impact', 3: 'mixed'}
```
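The same mapping can be read directly from the model configuration on the Hub, assuming (as is standard for Hub checkpoints) that the hosted config carries an `id2label` entry:

```python
# Quick check of the label mapping from the published config.
from transformers import AutoConfig

config = AutoConfig.from_pretrained('nickwong64/bert-base-uncased-poems-sentiment')
print(config.id2label)  # expected: {0: 'negative', 1: 'positive', 2: 'no_impact', 3: 'mixed'}
```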
## Evaluation

```python
{'test_loss': 0.4359096586704254,
 'test_accuracy': 0.9142857142857143,
 'test_f1': 0.9120554830816401,
 'test_runtime': 0.5689,
 'test_samples_per_second': 184.582,
 'test_steps_per_second': 24.611}
```
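For reference, here is a hedged sketch of recomputing the test-split accuracy and F1 with the published checkpoint. The label-to-id mapping is taken from the Labels section above, and weighted F1 averaging is an assumption, so the figures may differ slightly from the Trainer's own evaluation loop.

```python
# Recompute test-set metrics with the public model; weighted F1 is an assumption.
from datasets import load_dataset
from sklearn.metrics import accuracy_score, f1_score
from transformers import pipeline

test = load_dataset('poem_sentiment', split='test')
nlp = pipeline(task='text-classification',
               model='nickwong64/bert-base-uncased-poems-sentiment')

# Mapping from the Labels section above
label2id = {'negative': 0, 'positive': 1, 'no_impact': 2, 'mixed': 3}
preds = [label2id[out['label']] for out in nlp(test['verse_text'])]

print('accuracy:', accuracy_score(test['label'], preds))
print('f1 (weighted):', f1_score(test['label'], preds, average='weighted'))
```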