hos_sentiment_bert

This model is a fine-tuned version of google-bert/bert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2168
  • F1: 0.9326
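
The card does not document usage, but the model name and the F1 metric suggest a sentence-level sentiment classifier. A minimal inference sketch with the transformers pipeline, assuming the repository id hts98/hos_sentiment_bert shown in the model tree below and a standard text-classification head (the label names depend on the undocumented training data):

```python
from transformers import pipeline

# Assumption: the model exposes a standard text-classification head; the label
# names come from the undocumented training data and may be generic LABEL_* ids.
classifier = pipeline("text-classification", model="hts98/hos_sentiment_bert")

print(classifier("The staff were friendly and the room was spotless."))
# e.g. [{'label': 'LABEL_1', 'score': 0.98}]  (label mapping unknown)
```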

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 3e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 40.0
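
For reference, the hyperparameters above map onto a TrainingArguments configuration roughly like the following sketch; the dataset, number of labels, and metric function are not documented in this card, so they are left as assumptions or placeholders:

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Sketch only: num_labels and the datasets are assumptions, since the card
# does not document the training data.
base = "google-bert/bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=2)

training_args = TrainingArguments(
    output_dir="hos_sentiment_bert",
    learning_rate=3e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch",          # AdamW with betas=(0.9, 0.999), epsilon=1e-08 (library defaults)
    lr_scheduler_type="linear",
    num_train_epochs=40,
    eval_strategy="epoch",        # the results table reports one evaluation per epoch
)

# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=..., eval_dataset=..., compute_metrics=...)
# trainer.train()
```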

Training results

| Training Loss | Epoch | Step  | Validation Loss | F1     |
|:-------------:|:-----:|:-----:|:---------------:|:------:|
| No log        | 1.0   | 331   | 0.2183          | 0.9258 |
| 0.239         | 2.0   | 662   | 0.2168          | 0.9326 |
| 0.239         | 3.0   | 993   | 0.2527          | 0.9282 |
| 0.1255        | 4.0   | 1324  | 0.2896          | 0.9288 |
| 0.0662        | 5.0   | 1655  | 0.3389          | 0.9266 |
| 0.0662        | 6.0   | 1986  | 0.3793          | 0.9294 |
| 0.0453        | 7.0   | 2317  | 0.4110          | 0.9252 |
| 0.0257        | 8.0   | 2648  | 0.4656          | 0.9205 |
| 0.0257        | 9.0   | 2979  | 0.4953          | 0.9263 |
| 0.0196        | 10.0  | 3310  | 0.5412          | 0.9265 |
| 0.0125        | 11.0  | 3641  | 0.5528          | 0.9245 |
| 0.0125        | 12.0  | 3972  | 0.5527          | 0.9262 |
| 0.0141        | 13.0  | 4303  | 0.5683          | 0.9276 |
| 0.0097        | 14.0  | 4634  | 0.5835          | 0.9239 |
| 0.0097        | 15.0  | 4965  | 0.5905          | 0.9280 |
| 0.0107        | 16.0  | 5296  | 0.5799          | 0.9298 |
| 0.009         | 17.0  | 5627  | 0.6127          | 0.9266 |
| 0.009         | 18.0  | 5958  | 0.5911          | 0.9284 |
| 0.0084        | 19.0  | 6289  | 0.5900          | 0.9303 |
| 0.008         | 20.0  | 6620  | 0.5923          | 0.9283 |
| 0.008         | 21.0  | 6951  | 0.6186          | 0.9305 |
| 0.0068        | 22.0  | 7282  | 0.6076          | 0.9292 |
| 0.0064        | 23.0  | 7613  | 0.5782          | 0.9303 |
| 0.0064        | 24.0  | 7944  | 0.6077          | 0.9320 |
| 0.0048        | 25.0  | 8275  | 0.6446          | 0.9282 |
| 0.0046        | 26.0  | 8606  | 0.6417          | 0.9315 |
| 0.0046        | 27.0  | 8937  | 0.6656          | 0.9283 |
| 0.0053        | 28.0  | 9268  | 0.6541          | 0.9288 |
| 0.0043        | 29.0  | 9599  | 0.6703          | 0.9277 |
| 0.0043        | 30.0  | 9930  | 0.6871          | 0.9252 |
| 0.0041        | 31.0  | 10261 | 0.6735          | 0.9286 |
| 0.0034        | 32.0  | 10592 | 0.6651          | 0.9306 |
| 0.0034        | 33.0  | 10923 | 0.6799          | 0.9305 |
| 0.0032        | 34.0  | 11254 | 0.6753          | 0.9297 |
| 0.0031        | 35.0  | 11585 | 0.6855          | 0.9310 |
| 0.0031        | 36.0  | 11916 | 0.6885          | 0.9306 |
| 0.003         | 37.0  | 12247 | 0.6960          | 0.9293 |
| 0.0026        | 38.0  | 12578 | 0.6950          | 0.9292 |
| 0.0026        | 39.0  | 12909 | 0.6964          | 0.9297 |
| 0.0033        | 40.0  | 13240 | 0.6954          | 0.9290 |
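
The F1 column in the table is the kind of value produced by a compute_metrics callback passed to the Trainer. A minimal sketch, assuming weighted F1 via scikit-learn (the actual averaging scheme used for this model is not documented):

```python
import numpy as np
from sklearn.metrics import f1_score

# Assumption: weighted-average F1 over the evaluation set; the card does not
# specify which averaging scheme was used.
def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"f1": f1_score(labels, preds, average="weighted")}
```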

Framework versions

  • Transformers 4.48.0.dev0
  • Pytorch 2.1.0+cu121
  • Datasets 3.1.0
  • Tokenizers 0.21.0

Model tree for hts98/hos_sentiment_bert

  • Fine-tuned from google-bert/bert-base-uncased