---
base_model: finiteautomata/bertweet-base-sentiment-analysis
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
model-index:
  - name: bertweet-finetuned_twitch-sentiment-analysis
    results: []
---

# bertweet-finetuned_twitch-sentiment-analysis

This model is a fine-tuned version of [finiteautomata/bertweet-base-sentiment-analysis](https://huggingface.co/finiteautomata/bertweet-base-sentiment-analysis) on an unspecified dataset (the dataset name was not recorded when this card was generated). It achieves the following results on the evaluation set; a minimal usage sketch follows the metrics:

- Loss: 2.3828
- Accuracy: 0.6513
- F1: 0.6513
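
Below is a minimal inference sketch. The hub repo id is an assumption inferred from the card title, and the label set is assumed to follow the base model's NEG/NEU/POS convention; neither is confirmed by this card.

```python
from transformers import pipeline

# NOTE: the repo id below is inferred from the card title and is an
# assumption; point it at wherever this checkpoint is actually hosted.
classifier = pipeline(
    "text-classification",
    model="soravoid/bertweet-finetuned_twitch-sentiment-analysis",
)

# BERTweet expects short, tweet-like text; Twitch chat messages are
# similar in length and register.
print(classifier("that play was actually insane LUL"))
# Expected shape: [{'label': ..., 'score': ...}] -- the labels are assumed
# to match the base model's NEG/NEU/POS convention.
```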

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reconstruction as `TrainingArguments` follows the list):

- learning_rate: 2e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
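
The list above maps one-to-one onto `transformers.TrainingArguments`. A minimal sketch of that mapping, assuming the pinned framework versions below; `output_dir` and the per-epoch evaluation cadence are assumptions not recorded on the card:

```python
from transformers import TrainingArguments

# Reconstruction of the reported hyperparameters. output_dir and
# evaluation_strategy are assumptions; everything else is from the card.
training_args = TrainingArguments(
    output_dir="bertweet-finetuned_twitch-sentiment-analysis",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,              # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # the results table reports per-epoch eval
)
```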

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--:|
| No log | 1.0 | 79 | 0.9173 | 0.5424 | 0.5424 |
| 0.9476 | 2.0 | 158 | 0.9454 | 0.5701 | 0.5701 |
| 0.8032 | 3.0 | 237 | 0.8781 | 0.6107 | 0.6107 |
| 0.7289 | 4.0 | 316 | 0.9143 | 0.6218 | 0.6218 |
| 0.7289 | 5.0 | 395 | 0.8310 | 0.6513 | 0.6513 |
| 0.5873 | 6.0 | 474 | 0.9353 | 0.6624 | 0.6624 |
| 0.4568 | 7.0 | 553 | 0.9365 | 0.6734 | 0.6734 |
| 0.3544 | 8.0 | 632 | 1.0126 | 0.6494 | 0.6494 |
| 0.3161 | 9.0 | 711 | 1.0378 | 0.6494 | 0.6494 |
| 0.3161 | 10.0 | 790 | 1.2249 | 0.6568 | 0.6568 |
| 0.2757 | 11.0 | 869 | 1.1352 | 0.6808 | 0.6808 |
| 0.2619 | 12.0 | 948 | 1.2467 | 0.6697 | 0.6697 |
| 0.2292 | 13.0 | 1027 | 1.3262 | 0.6716 | 0.6716 |
| 0.2115 | 14.0 | 1106 | 1.3367 | 0.6697 | 0.6697 |
| 0.2115 | 15.0 | 1185 | 1.3757 | 0.6882 | 0.6882 |
| 0.1848 | 16.0 | 1264 | 1.3650 | 0.6697 | 0.6697 |
| 0.1916 | 17.0 | 1343 | 1.4940 | 0.6587 | 0.6587 |
| 0.1734 | 18.0 | 1422 | 1.5929 | 0.6808 | 0.6808 |
| 0.1715 | 19.0 | 1501 | 1.5662 | 0.6734 | 0.6734 |
| 0.1715 | 20.0 | 1580 | 1.6073 | 0.6845 | 0.6845 |
| 0.1711 | 21.0 | 1659 | 1.5038 | 0.6808 | 0.6808 |
| 0.1735 | 22.0 | 1738 | 1.8104 | 0.6587 | 0.6587 |
| 0.142 | 23.0 | 1817 | 1.4715 | 0.6900 | 0.6900 |
| 0.142 | 24.0 | 1896 | 1.7028 | 0.6863 | 0.6863 |
| 0.1504 | 25.0 | 1975 | 1.5413 | 0.6900 | 0.6900 |
| 0.1536 | 26.0 | 2054 | 1.7148 | 0.6624 | 0.6624 |
| 0.1405 | 27.0 | 2133 | 1.5510 | 0.6624 | 0.6624 |
| 0.1296 | 28.0 | 2212 | 1.6857 | 0.6863 | 0.6863 |
| 0.1296 | 29.0 | 2291 | 1.6228 | 0.6679 | 0.6679 |
| 0.1247 | 30.0 | 2370 | 1.7248 | 0.6716 | 0.6716 |
| 0.1181 | 31.0 | 2449 | 1.7833 | 0.6716 | 0.6716 |
| 0.1342 | 32.0 | 2528 | 1.9463 | 0.6661 | 0.6661 |
| 0.1412 | 33.0 | 2607 | 1.9416 | 0.6734 | 0.6734 |
| 0.1412 | 34.0 | 2686 | 1.7277 | 0.6679 | 0.6679 |
| 0.1114 | 35.0 | 2765 | 1.7833 | 0.6734 | 0.6734 |
| 0.1139 | 36.0 | 2844 | 1.8031 | 0.6753 | 0.6753 |
| 0.1143 | 37.0 | 2923 | 1.7150 | 0.6716 | 0.6716 |
| 0.1031 | 38.0 | 3002 | 1.9060 | 0.6827 | 0.6827 |
| 0.1031 | 39.0 | 3081 | 1.8854 | 0.6587 | 0.6587 |
| 0.1162 | 40.0 | 3160 | 1.8868 | 0.6753 | 0.6753 |
| 0.1115 | 41.0 | 3239 | 1.7967 | 0.6808 | 0.6808 |
| 0.1118 | 42.0 | 3318 | 1.9692 | 0.6661 | 0.6661 |
| 0.1118 | 43.0 | 3397 | 1.9876 | 0.6661 | 0.6661 |
| 0.1017 | 44.0 | 3476 | 1.9332 | 0.6642 | 0.6642 |
| 0.1172 | 45.0 | 3555 | 1.8807 | 0.6679 | 0.6679 |
| 0.1128 | 46.0 | 3634 | 1.9357 | 0.7011 | 0.7011 |
| 0.1196 | 47.0 | 3713 | 2.0208 | 0.6679 | 0.6679 |
| 0.1196 | 48.0 | 3792 | 1.9668 | 0.6679 | 0.6679 |
| 0.0955 | 49.0 | 3871 | 2.0051 | 0.6661 | 0.6661 |
| 0.0959 | 50.0 | 3950 | 1.9267 | 0.6661 | 0.6661 |
| 0.1144 | 51.0 | 4029 | 2.0940 | 0.6716 | 0.6716 |
| 0.107 | 52.0 | 4108 | 2.1097 | 0.6697 | 0.6697 |
| 0.107 | 53.0 | 4187 | 2.0383 | 0.6624 | 0.6624 |
| 0.1176 | 54.0 | 4266 | 1.9996 | 0.6587 | 0.6587 |
| 0.112 | 55.0 | 4345 | 2.0815 | 0.6716 | 0.6716 |
| 0.1033 | 56.0 | 4424 | 1.8365 | 0.6661 | 0.6661 |
| 0.116 | 57.0 | 4503 | 2.0785 | 0.6679 | 0.6679 |
| 0.116 | 58.0 | 4582 | 2.0580 | 0.6624 | 0.6624 |
| 0.1048 | 59.0 | 4661 | 2.0619 | 0.6863 | 0.6863 |
| 0.0907 | 60.0 | 4740 | 2.0260 | 0.6753 | 0.6753 |
| 0.1021 | 61.0 | 4819 | 2.0572 | 0.6753 | 0.6753 |
| 0.1021 | 62.0 | 4898 | 1.9949 | 0.6753 | 0.6753 |
| 0.0921 | 63.0 | 4977 | 2.0043 | 0.6808 | 0.6808 |
| 0.099 | 64.0 | 5056 | 2.1510 | 0.6697 | 0.6697 |
| 0.0792 | 65.0 | 5135 | 2.1658 | 0.6642 | 0.6642 |
| 0.1056 | 66.0 | 5214 | 2.0118 | 0.6734 | 0.6734 |
| 0.1056 | 67.0 | 5293 | 2.1683 | 0.6661 | 0.6661 |
| 0.0994 | 68.0 | 5372 | 2.1810 | 0.6734 | 0.6734 |
| 0.1054 | 69.0 | 5451 | 2.0225 | 0.6900 | 0.6900 |
| 0.0975 | 70.0 | 5530 | 2.1230 | 0.6679 | 0.6679 |
| 0.0885 | 71.0 | 5609 | 2.0770 | 0.6808 | 0.6808 |
| 0.0885 | 72.0 | 5688 | 2.0654 | 0.6771 | 0.6771 |
| 0.0939 | 73.0 | 5767 | 2.1239 | 0.6624 | 0.6624 |
| 0.1028 | 74.0 | 5846 | 2.1897 | 0.6771 | 0.6771 |
| 0.0851 | 75.0 | 5925 | 2.0848 | 0.6790 | 0.6790 |
| 0.0783 | 76.0 | 6004 | 2.1199 | 0.6734 | 0.6734 |
| 0.0783 | 77.0 | 6083 | 2.2011 | 0.6734 | 0.6734 |
| 0.0874 | 78.0 | 6162 | 2.1734 | 0.6679 | 0.6679 |
| 0.0878 | 79.0 | 6241 | 2.1986 | 0.6624 | 0.6624 |
| 0.0939 | 80.0 | 6320 | 2.2401 | 0.6642 | 0.6642 |
| 0.0939 | 81.0 | 6399 | 2.3477 | 0.6605 | 0.6605 |
| 0.0835 | 82.0 | 6478 | 2.3740 | 0.6605 | 0.6605 |
| 0.0887 | 83.0 | 6557 | 2.3200 | 0.6661 | 0.6661 |
| 0.0943 | 84.0 | 6636 | 2.3248 | 0.6642 | 0.6642 |
| 0.0875 | 85.0 | 6715 | 2.3079 | 0.6605 | 0.6605 |
| 0.0875 | 86.0 | 6794 | 2.3209 | 0.6568 | 0.6568 |
| 0.0822 | 87.0 | 6873 | 2.3303 | 0.6587 | 0.6587 |
| 0.0846 | 88.0 | 6952 | 2.3620 | 0.6531 | 0.6531 |
| 0.0909 | 89.0 | 7031 | 2.3498 | 0.6587 | 0.6587 |
| 0.0871 | 90.0 | 7110 | 2.3323 | 0.6513 | 0.6513 |
| 0.0871 | 91.0 | 7189 | 2.3494 | 0.6513 | 0.6513 |
| 0.0796 | 92.0 | 7268 | 2.3677 | 0.6513 | 0.6513 |
| 0.0797 | 93.0 | 7347 | 2.3887 | 0.6513 | 0.6513 |
| 0.0959 | 94.0 | 7426 | 2.3747 | 0.6513 | 0.6513 |
| 0.0861 | 95.0 | 7505 | 2.3896 | 0.6550 | 0.6550 |
| 0.0861 | 96.0 | 7584 | 2.3786 | 0.6531 | 0.6531 |
| 0.089 | 97.0 | 7663 | 2.3692 | 0.6531 | 0.6531 |
| 0.0764 | 98.0 | 7742 | 2.3789 | 0.6494 | 0.6494 |
| 0.0874 | 99.0 | 7821 | 2.3833 | 0.6513 | 0.6513 |
| 0.0852 | 100.0 | 7900 | 2.3828 | 0.6513 | 0.6513 |
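
Note that validation loss bottoms out at epoch 5 (0.8310) and climbs steadily afterwards while training loss keeps falling, so the final epoch-100 checkpoint that the headline metrics describe is likely overfit. A hypothetical re-run sketch, not part of the original setup, that would retain the best checkpoint instead:

```python
from transformers import EarlyStoppingCallback, TrainingArguments

# Hypothetical variation, not what produced the table above: checkpoint on
# validation loss so the epoch-5 optimum is kept rather than the final
# overfit weights.
args = TrainingArguments(
    output_dir="bertweet-finetuned_twitch-sentiment-analysis",
    evaluation_strategy="epoch",
    save_strategy="epoch",          # must match evaluation_strategy
    load_best_model_at_end=True,
    metric_for_best_model="eval_loss",
    greater_is_better=False,
    num_train_epochs=100,
)
# Passed to Trainer(..., args=args,
#                   callbacks=[EarlyStoppingCallback(early_stopping_patience=5)])
# this stops training once validation loss fails to improve for five epochs.
```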

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.1+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0