---
license: mit
base_model: roberta-base
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - f1
model-index:
  - name: results
    results: []
---

# results

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.6970
- Accuracy: 0.7288
- F1: 0.7229
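
As a usage illustration only (the card itself gives no example), the snippet below shows one way to load a sequence-classification checkpoint like this one with the `transformers` Auto classes. The model id is a placeholder, and the printed label names depend on the `id2label` mapping saved with the checkpoint.

```python
# A minimal inference sketch (not the authors' script).
# The model id below is a placeholder; substitute the actual repository path.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "path/to/this-model"  # placeholder, not a real repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "Global mean surface temperature has risen over the past century."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()
# Label names come from the checkpoint's training config.
print(model.config.id2label[pred])
```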

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 16
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
- mixed_precision_training: Native AMP
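
For orientation, these settings map one-to-one onto fields of `transformers.TrainingArguments`. The sketch below is a reconstruction under stated assumptions, not the authors' script: `output_dir` and the evaluation cadence (every 10 steps, inferred from the results table below) are guesses.

```python
# A minimal TrainingArguments sketch matching the hyperparameters listed above.
# Adam betas=(0.9, 0.999) and epsilon=1e-08 are the Trainer defaults, so they
# need no explicit arguments here.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="results",            # assumed; matches the model-index name
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=10,
    fp16=True,                       # Native AMP mixed-precision training
    evaluation_strategy="steps",     # assumed from the 10-step eval log below
    eval_steps=10,
)
```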

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| 1.108 | 0.02 | 10 | 1.1080 | 0.2174 | 0.1291 |
| 1.1078 | 0.05 | 20 | 1.1070 | 0.2214 | 0.1295 |
| 1.1054 | 0.07 | 30 | 1.1045 | 0.2409 | 0.1356 |
| 1.1127 | 0.1 | 40 | 1.0994 | 0.3112 | 0.2653 |
| 1.0924 | 0.12 | 50 | 1.0923 | 0.5664 | 0.5046 |
| 1.0844 | 0.15 | 60 | 1.0841 | 0.6120 | 0.4647 |
| 1.074 | 0.17 | 70 | 1.0756 | 0.6120 | 0.4647 |
| 1.0725 | 0.19 | 80 | 1.0660 | 0.6120 | 0.4647 |
| 1.0546 | 0.22 | 90 | 1.0541 | 0.6120 | 0.4647 |
| 1.0407 | 0.24 | 100 | 1.0401 | 0.6120 | 0.4647 |
| 1.0107 | 0.27 | 110 | 1.0227 | 0.6120 | 0.4647 |
| 1.0217 | 0.29 | 120 | 1.0030 | 0.6120 | 0.4647 |
| 0.9785 | 0.32 | 130 | 0.9774 | 0.6120 | 0.4647 |
| 1.0076 | 0.34 | 140 | 0.9498 | 0.6120 | 0.4647 |
| 0.9475 | 0.36 | 150 | 0.9313 | 0.6120 | 0.4647 |
| 0.8933 | 0.39 | 160 | 0.9104 | 0.6120 | 0.4647 |
| 1.0152 | 0.41 | 170 | 0.9052 | 0.6120 | 0.4647 |
| 1.0132 | 0.44 | 180 | 0.9086 | 0.6120 | 0.4647 |
| 0.9295 | 0.46 | 190 | 0.9178 | 0.6120 | 0.4647 |
| 0.9264 | 0.49 | 200 | 0.9104 | 0.6120 | 0.4647 |
| 0.9901 | 0.51 | 210 | 0.9087 | 0.6120 | 0.4647 |
| 0.9287 | 0.53 | 220 | 0.9140 | 0.6120 | 0.4647 |
| 0.9729 | 0.56 | 230 | 0.9108 | 0.6120 | 0.4647 |
| 1.0134 | 0.58 | 240 | 0.9184 | 0.6120 | 0.4647 |
| 0.9293 | 0.61 | 250 | 0.9016 | 0.6120 | 0.4647 |
| 0.9546 | 0.63 | 260 | 0.8928 | 0.6120 | 0.4647 |
| 0.9028 | 0.66 | 270 | 0.8910 | 0.6120 | 0.4647 |
| 0.8572 | 0.68 | 280 | 0.8872 | 0.6120 | 0.4647 |
| 0.9085 | 0.7 | 290 | 0.8813 | 0.6120 | 0.4647 |
| 0.9711 | 0.73 | 300 | 0.8845 | 0.6120 | 0.4647 |
| 0.8595 | 0.75 | 310 | 0.8768 | 0.6120 | 0.4647 |
| 0.8392 | 0.78 | 320 | 0.8635 | 0.6120 | 0.4647 |
| 0.8645 | 0.8 | 330 | 0.8700 | 0.6120 | 0.4647 |
| 0.886 | 0.83 | 340 | 0.8746 | 0.6120 | 0.4647 |
| 0.9011 | 0.85 | 350 | 0.8624 | 0.6120 | 0.4647 |
| 0.866 | 0.87 | 360 | 0.8375 | 0.6120 | 0.4647 |
| 0.9093 | 0.9 | 370 | 0.8616 | 0.6120 | 0.4647 |
| 0.8792 | 0.92 | 380 | 0.8254 | 0.6120 | 0.4647 |
| 0.7503 | 0.95 | 390 | 0.8279 | 0.6120 | 0.4647 |
| 0.8007 | 0.97 | 400 | 0.8319 | 0.6120 | 0.4647 |
| 0.9182 | 1.0 | 410 | 0.8737 | 0.6120 | 0.4647 |
| 0.89 | 1.02 | 420 | 0.8689 | 0.6120 | 0.4647 |
| 0.8556 | 1.04 | 430 | 0.8321 | 0.6185 | 0.4917 |
| 0.8988 | 1.07 | 440 | 0.8146 | 0.6263 | 0.4981 |
| 0.8161 | 1.09 | 450 | 0.8289 | 0.6159 | 0.4735 |
| 0.8428 | 1.12 | 460 | 0.8441 | 0.6237 | 0.4908 |
| 0.8503 | 1.14 | 470 | 0.8284 | 0.6562 | 0.6118 |
| 0.7648 | 1.17 | 480 | 0.8277 | 0.6224 | 0.5989 |
| 0.8573 | 1.19 | 490 | 0.8402 | 0.6328 | 0.5723 |
| 0.7526 | 1.21 | 500 | 0.8147 | 0.6367 | 0.6037 |
| 0.8221 | 1.24 | 510 | 0.8205 | 0.6276 | 0.5986 |
| 0.83 | 1.26 | 520 | 0.7885 | 0.6471 | 0.5935 |
| 0.7811 | 1.29 | 530 | 0.7936 | 0.6497 | 0.6471 |
| 0.7587 | 1.31 | 540 | 0.7992 | 0.6510 | 0.6003 |
| 0.7823 | 1.33 | 550 | 0.7637 | 0.6589 | 0.6498 |
| 0.806 | 1.36 | 560 | 0.7986 | 0.6510 | 0.5994 |
| 0.6892 | 1.38 | 570 | 0.7657 | 0.6576 | 0.6338 |
| 0.7004 | 1.41 | 580 | 0.7759 | 0.6628 | 0.6604 |
| 0.76 | 1.43 | 590 | 0.7915 | 0.6497 | 0.6319 |
| 0.7296 | 1.46 | 600 | 0.7696 | 0.6536 | 0.6543 |
| 0.7777 | 1.48 | 610 | 0.7408 | 0.6615 | 0.6516 |
| 0.689 | 1.5 | 620 | 0.7559 | 0.6732 | 0.6359 |
| 0.7462 | 1.53 | 630 | 0.7471 | 0.6641 | 0.6622 |
| 0.7586 | 1.55 | 640 | 0.7719 | 0.6602 | 0.6484 |
| 0.7149 | 1.58 | 650 | 0.7450 | 0.6615 | 0.6556 |
| 0.7634 | 1.6 | 660 | 0.7440 | 0.6615 | 0.6499 |
| 0.6967 | 1.63 | 670 | 0.7679 | 0.6615 | 0.6295 |
| 0.8081 | 1.65 | 680 | 0.7868 | 0.6497 | 0.6525 |
| 0.7743 | 1.67 | 690 | 0.7756 | 0.6471 | 0.6513 |
| 0.6511 | 1.7 | 700 | 0.7339 | 0.6966 | 0.6700 |
| 0.7563 | 1.72 | 710 | 0.8288 | 0.6107 | 0.6282 |
| 0.7533 | 1.75 | 720 | 0.7225 | 0.6784 | 0.6716 |
| 0.6474 | 1.77 | 730 | 0.7119 | 0.7070 | 0.6915 |
| 0.6677 | 1.8 | 740 | 0.7168 | 0.6992 | 0.6879 |
| 0.6215 | 1.82 | 750 | 0.7381 | 0.6823 | 0.6725 |
| 0.7862 | 1.84 | 760 | 0.8190 | 0.6380 | 0.6555 |
| 0.661 | 1.87 | 770 | 0.7201 | 0.6953 | 0.6803 |
| 0.6256 | 1.89 | 780 | 0.7576 | 0.6732 | 0.6558 |
| 0.7411 | 1.92 | 790 | 0.8308 | 0.6263 | 0.6354 |
| 0.5917 | 1.94 | 800 | 0.7480 | 0.6875 | 0.6627 |
| 0.7315 | 1.97 | 810 | 0.7350 | 0.6862 | 0.6777 |
| 0.7161 | 1.99 | 820 | 0.7271 | 0.6862 | 0.6789 |
| 0.6705 | 2.01 | 830 | 0.7650 | 0.6888 | 0.6583 |
| 0.6363 | 2.04 | 840 | 0.7582 | 0.6602 | 0.6668 |
| 0.5478 | 2.06 | 850 | 0.7336 | 0.6875 | 0.6760 |
| 0.5762 | 2.09 | 860 | 0.7453 | 0.6797 | 0.6756 |
| 0.5043 | 2.11 | 870 | 0.7730 | 0.6706 | 0.6751 |
| 0.6707 | 2.14 | 880 | 0.7607 | 0.6797 | 0.6795 |
| 0.6797 | 2.16 | 890 | 0.7392 | 0.6966 | 0.6903 |
| 0.5108 | 2.18 | 900 | 0.7410 | 0.6992 | 0.6777 |
| 0.6752 | 2.21 | 910 | 0.7795 | 0.6641 | 0.6701 |
| 0.5653 | 2.23 | 920 | 0.7427 | 0.6927 | 0.6897 |
| 0.4893 | 2.26 | 930 | 0.7870 | 0.6719 | 0.6800 |
| 0.6131 | 2.28 | 940 | 0.7231 | 0.6992 | 0.6908 |
| 0.5764 | 2.31 | 950 | 0.7240 | 0.6784 | 0.6764 |
| 0.5644 | 2.33 | 960 | 0.7325 | 0.6758 | 0.6808 |
| 0.5864 | 2.35 | 970 | 0.7196 | 0.7083 | 0.7077 |
| 0.5273 | 2.38 | 980 | 0.7491 | 0.6979 | 0.7000 |
| 0.5442 | 2.4 | 990 | 0.7273 | 0.6979 | 0.6962 |
| 0.5273 | 2.43 | 1000 | 0.7619 | 0.6940 | 0.6971 |
| 0.5559 | 2.45 | 1010 | 0.7602 | 0.6927 | 0.6759 |
| 0.5739 | 2.48 | 1020 | 0.8416 | 0.6510 | 0.6620 |
| 0.6714 | 2.5 | 1030 | 0.7206 | 0.6901 | 0.6833 |
| 0.4798 | 2.52 | 1040 | 0.7417 | 0.6966 | 0.6967 |
| 0.5155 | 2.55 | 1050 | 0.7524 | 0.6836 | 0.6756 |
| 0.665 | 2.57 | 1060 | 0.7805 | 0.6836 | 0.6851 |
| 0.5047 | 2.6 | 1070 | 0.7259 | 0.7005 | 0.6911 |
| 0.4928 | 2.62 | 1080 | 0.7296 | 0.7070 | 0.6989 |
| 0.6354 | 2.65 | 1090 | 0.7149 | 0.7057 | 0.6942 |
| 0.5179 | 2.67 | 1100 | 0.7392 | 0.7005 | 0.7025 |
| 0.565 | 2.69 | 1110 | 0.9225 | 0.6211 | 0.6397 |
| 0.568 | 2.72 | 1120 | 0.7576 | 0.6927 | 0.6620 |
| 0.6313 | 2.74 | 1130 | 0.7672 | 0.6823 | 0.6870 |
| 0.5991 | 2.77 | 1140 | 0.7014 | 0.6953 | 0.6949 |
| 0.5064 | 2.79 | 1150 | 0.6919 | 0.7201 | 0.7108 |
| 0.5132 | 2.82 | 1160 | 0.7176 | 0.7109 | 0.7122 |
| 0.4623 | 2.84 | 1170 | 0.7508 | 0.7083 | 0.7116 |
| 0.5912 | 2.86 | 1180 | 0.6912 | 0.7188 | 0.7097 |
| 0.6299 | 2.89 | 1190 | 0.6937 | 0.7214 | 0.7108 |
| 0.526 | 2.91 | 1200 | 0.8388 | 0.6680 | 0.6729 |
| 0.6121 | 2.94 | 1210 | 0.7092 | 0.7227 | 0.7078 |
| 0.505 | 2.96 | 1220 | 0.7108 | 0.7057 | 0.7069 |
| 0.5917 | 2.99 | 1230 | 0.7166 | 0.6992 | 0.6991 |
| 0.4392 | 3.01 | 1240 | 0.7017 | 0.7135 | 0.7125 |
| 0.3661 | 3.03 | 1250 | 0.7366 | 0.7148 | 0.7077 |
| 0.4179 | 3.06 | 1260 | 0.7762 | 0.7135 | 0.7123 |
| 0.5012 | 3.08 | 1270 | 0.7817 | 0.6901 | 0.6943 |
| 0.455 | 3.11 | 1280 | 0.7387 | 0.7031 | 0.7018 |
| 0.45 | 3.13 | 1290 | 0.7666 | 0.6849 | 0.6895 |
| 0.3803 | 3.16 | 1300 | 0.7289 | 0.7057 | 0.7055 |
| 0.3249 | 3.18 | 1310 | 0.7702 | 0.7057 | 0.7057 |
| 0.4053 | 3.2 | 1320 | 0.8736 | 0.6693 | 0.6762 |
| 0.6543 | 3.23 | 1330 | 0.7545 | 0.7083 | 0.7046 |
| 0.5145 | 3.25 | 1340 | 0.7623 | 0.7044 | 0.7065 |
| 0.4317 | 3.28 | 1350 | 0.7426 | 0.7096 | 0.7085 |
| 0.3173 | 3.3 | 1360 | 0.7538 | 0.7201 | 0.7088 |
| 0.3904 | 3.33 | 1370 | 0.7851 | 0.6966 | 0.7013 |
| 0.4739 | 3.35 | 1380 | 0.7529 | 0.7096 | 0.7090 |
| 0.3597 | 3.37 | 1390 | 0.7475 | 0.7135 | 0.7049 |
| 0.5589 | 3.4 | 1400 | 0.7390 | 0.7057 | 0.7068 |
| 0.4127 | 3.42 | 1410 | 0.7603 | 0.6992 | 0.7039 |
| 0.4193 | 3.45 | 1420 | 0.7565 | 0.7031 | 0.6982 |
| 0.4774 | 3.47 | 1430 | 0.7831 | 0.6966 | 0.6999 |
| 0.5156 | 3.5 | 1440 | 0.8372 | 0.6875 | 0.6948 |
| 0.4646 | 3.52 | 1450 | 0.7770 | 0.7083 | 0.7079 |
| 0.4435 | 3.54 | 1460 | 0.8211 | 0.6914 | 0.6981 |
| 0.4664 | 3.57 | 1470 | 0.7730 | 0.7109 | 0.7116 |
| 0.4468 | 3.59 | 1480 | 0.7884 | 0.6966 | 0.6972 |
| 0.4693 | 3.62 | 1490 | 0.7881 | 0.7018 | 0.7049 |
| 0.4677 | 3.64 | 1500 | 0.7521 | 0.7018 | 0.6935 |
| 0.3911 | 3.67 | 1510 | 0.8343 | 0.6693 | 0.6750 |
| 0.4981 | 3.69 | 1520 | 0.7461 | 0.7057 | 0.7003 |
| 0.432 | 3.71 | 1530 | 0.7555 | 0.7227 | 0.7085 |
| 0.5283 | 3.74 | 1540 | 0.8265 | 0.6497 | 0.6596 |
| 0.4641 | 3.76 | 1550 | 0.7541 | 0.7005 | 0.6920 |
| 0.42 | 3.79 | 1560 | 0.7664 | 0.6979 | 0.6916 |
| 0.6015 | 3.81 | 1570 | 0.8471 | 0.6484 | 0.6541 |
| 0.5301 | 3.83 | 1580 | 0.7240 | 0.6979 | 0.6946 |
| 0.4583 | 3.86 | 1590 | 0.7755 | 0.6888 | 0.6921 |
| 0.5194 | 3.88 | 1600 | 0.7334 | 0.7122 | 0.7088 |
| 0.3624 | 3.91 | 1610 | 0.7659 | 0.6940 | 0.6951 |
| 0.543 | 3.93 | 1620 | 0.7718 | 0.6992 | 0.7027 |
| 0.3838 | 3.96 | 1630 | 0.7798 | 0.6940 | 0.6994 |
| 0.4389 | 3.98 | 1640 | 0.7479 | 0.7201 | 0.7159 |
| 0.3009 | 4.0 | 1650 | 0.7924 | 0.7031 | 0.7035 |
| 0.3812 | 4.03 | 1660 | 0.8021 | 0.7201 | 0.7186 |
| 0.3271 | 4.05 | 1670 | 0.8095 | 0.7188 | 0.7180 |
| 0.2551 | 4.08 | 1680 | 0.8355 | 0.7083 | 0.7107 |
| 0.3143 | 4.1 | 1690 | 0.8294 | 0.7096 | 0.7109 |
| 0.4337 | 4.13 | 1700 | 0.8897 | 0.6823 | 0.6873 |
| 0.5192 | 4.15 | 1710 | 0.8754 | 0.6758 | 0.6819 |
| 0.278 | 4.17 | 1720 | 0.8021 | 0.7096 | 0.7061 |
| 0.2782 | 4.2 | 1730 | 0.8350 | 0.6992 | 0.7031 |
| 0.2952 | 4.22 | 1740 | 0.8248 | 0.6966 | 0.6998 |
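
For reference, accuracy and F1 columns in a log like the one above are typically produced by a `compute_metrics` callback passed to the `Trainer`. A minimal sketch follows; the `weighted` averaging mode for F1 is an assumption, since the card does not state how F1 was aggregated.

```python
# A minimal compute_metrics sketch for accuracy and F1 (illustrative only;
# "weighted" averaging is an assumption, not stated in this card).
import numpy as np
from sklearn.metrics import accuracy_score, f1_score

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)  # predicted class per example
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1_score(labels, preds, average="weighted"),
    }
```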

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.15.0
- Tokenizers 0.15.0
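
When reproducing the environment, a quick sanity check against these pins might look like the sketch below (illustrative only; it simply compares installed versions to the ones listed above).

```python
# Compare installed package versions against the pins listed in this card.
import datasets
import tokenizers
import torch
import transformers

expected = {
    "transformers": "4.35.2",
    "torch": "2.1.0+cu121",
    "datasets": "2.15.0",
    "tokenizers": "0.15.0",
}
installed = {
    "transformers": transformers.__version__,
    "torch": torch.__version__,
    "datasets": datasets.__version__,
    "tokenizers": tokenizers.__version__,
}
for name, want in expected.items():
    status = "ok" if installed[name] == want else "MISMATCH"
    print(f"{name}: expected {want}, found {installed[name]} [{status}]")
```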