Model Trained Using AutoTrain

  • Problem type: Text Classification

Usage Example

from transformers import pipeline

# Pass device=0 to load the model and tokenizer on the first GPU; by default the pipeline runs on the CPU
emotion_classifier = pipeline(
    "text-classification",
    model="XuehangCang/Emotion-Classification",
    # device=0  # Use the first GPU device
)

texts = [
    "I'm so happy today!",
    "This is really sad.",
    "I'm a bit nervous about what's going to happen.",
    "This news makes me angry."
]

for text in texts:
    result = emotion_classifier(text)
    print(f"Text: {text}")
    print(f"Emotion classification result: {result}\n")

"""
Device set to use cpu
Text: I'm so happy today!
Emotion classification result: [{'label': 'joy', 'score': 0.9994311928749084}]

Text: This is really sad.
Emotion classification result: [{'label': 'sadness', 'score': 0.9989039897918701}]

Text: I'm a bit nervous about what's going to happen.
Emotion classification result: [{'label': 'fear', 'score': 0.998763918876648}]

Text: This news makes me angry.
Emotion classification result: [{'label': 'anger', 'score': 0.9977891445159912}]
"""

Validation Metrics

  • loss: 0.13341853022575378
  • f1_macro: 0.9169826832623412
  • f1_micro: 0.943
  • f1_weighted: 0.9427985114313238
  • precision_macro: 0.9227534317185495
  • precision_micro: 0.943
  • precision_weighted: 0.9430912986498113
  • recall_macro: 0.9119580961776227
  • recall_micro: 0.943
  • recall_weighted: 0.943
  • accuracy: 0.943
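
The macro, micro, and weighted variants above differ only in how per-class scores are aggregated; they can be reproduced with scikit-learn from any set of gold and predicted labels (a minimal sketch with made-up labels, not the actual validation split):

from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score

# Hypothetical gold labels and predictions, for illustration only.
y_true = ["joy", "sadness", "anger", "fear", "joy", "anger"]
y_pred = ["joy", "sadness", "anger", "anger", "joy", "fear"]

print("accuracy:", accuracy_score(y_true, y_pred))

# macro: unweighted mean over classes; micro: computed globally over all samples;
# weighted: mean over classes weighted by each class's support.
for avg in ("macro", "micro", "weighted"):
    print(f"f1_{avg}:", f1_score(y_true, y_pred, average=avg))
    print(f"precision_{avg}:", precision_score(y_true, y_pred, average=avg, zero_division=0))
    print(f"recall_{avg}:", recall_score(y_true, y_pred, average=avg, zero_division=0))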

License

CC0

Model Size

109M parameters (Safetensors, F32)