
roberta-base

This model is a fine-tuned version of roberta-base, trained using Amazon SageMaker and the Hugging Face Deep Learning Container.

  • Problem type: Multi-class Text Classification (emotion detection).
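For quick reference, below is a minimal inference sketch using the transformers pipeline API. The Hub id Jorgeutd/sagemaker-roberta-base-emotion is the repository this card describes; the example sentence is illustrative only.

from transformers import pipeline

# Load the fine-tuned emotion classifier from the Hugging Face Hub.
classifier = pipeline(
    "text-classification",
    model="Jorgeutd/sagemaker-roberta-base-emotion",
)

# Returns the predicted emotion label and its score for the input text.
print(classifier("I am so happy with how this turned out!"))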

The model achieves the following results on the evaluation set (full-precision values are listed under Validation Metrics below):

  • Loss: 0.1613
  • F1: 0.9413

Hyperparameters

{
    "epochs": 10,
    "train_batch_size": 16,
    "learning_rate": 3e-05,
    "weight_decay": 0.01,
    "model_name": "roberta-base",
    "do_eval": true,
    "load_best_model_at_end": true
}
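On SageMaker, hyperparameters like these are typically passed to the training job through the HuggingFace estimator. The sketch below shows that pattern; the entry script name, instance type, framework versions, and S3 path are assumptions for illustration and are not stated in this card.

import sagemaker
from sagemaker.huggingface import HuggingFace

# Hyperparameters mirror the values listed above.
hyperparameters = {
    "epochs": 10,
    "train_batch_size": 16,
    "learning_rate": 3e-05,
    "weight_decay": 0.01,
    "model_name": "roberta-base",
    "do_eval": True,
    "load_best_model_at_end": True,
}

# Entry script, instance type, and framework versions are assumptions,
# not values taken from this card.
huggingface_estimator = HuggingFace(
    entry_point="train.py",
    instance_type="ml.p3.2xlarge",
    instance_count=1,
    role=sagemaker.get_execution_role(),
    transformers_version="4.6",
    pytorch_version="1.7",
    py_version="py36",
    hyperparameters=hyperparameters,
)

# Launch the training job; the channel name and S3 path are placeholders.
huggingface_estimator.fit({"train": "s3://<your-bucket>/emotion/train"})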

Validation Metrics

Key              Value
eval_accuracy    0.941
eval_f1          0.9413321705151999
eval_loss        0.1613253802061081
eval_recall      0.941
eval_precision   0.9419519436781406
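Metrics of this kind are usually produced by a compute_metrics callback passed to the transformers Trainer. A hedged sketch follows; the weighted averaging scheme is an assumption, since the card does not state how F1, precision, and recall were averaged.

import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair the Trainer passes in.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # "weighted" averaging is an assumption; the card does not specify it.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, predictions, average="weighted"
    )
    return {
        "accuracy": accuracy_score(labels, predictions),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }

During evaluation the Trainer prefixes these keys with eval_, which matches the key names in the table above.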
