ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k1_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the fine-tuning dataset is not named in this card. It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 0.6504
  • QWK (quadratic weighted kappa): 0.5578
  • MSE: 0.6504
  • RMSE: 0.8065
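
Because the evaluation loss equals the MSE, the model was most likely trained as a single-output regressor over organization scores. A minimal inference sketch under that assumption (the num_labels=1 head and the score interpretation are inferred, not documented in this card):

```python
# Minimal inference sketch, assuming a single-logit regression head (num_labels=1).
# The assumption rests on the reported eval Loss equalling the MSE; the card
# itself does not document the head type or the label scale.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k1_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "..."  # replace with the Arabic response to be scored for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 1) under the regression assumption
print(logits.squeeze().item())  # predicted organization score
```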

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the Trainer sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
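
The listed settings map directly onto transformers.TrainingArguments. The sketch below only mirrors the hyperparameters above; the model and dataset wiring are placeholders, since the card does not document them:

```python
# Sketch only: mirrors the hyperparameters listed above using the Hugging Face
# Trainer. `model`, `train_ds`, and `eval_ds` are placeholders; the card does
# not say how the data or the model head were set up.
from transformers import TrainingArguments, Trainer

args = TrainingArguments(
    output_dir="arabert-task5-organization",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # The Trainer's default AdamW already uses the listed betas and epsilon;
    # they are spelled out here to match the card:
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
# trainer = Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds)
# trainer.train()
```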

Training results

("No log" in the Training Loss column means that no training loss had been logged yet at that evaluation step; the only logged value, 0.1647, appears at the final step, 500.)

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
------------- | ----- | ---- | --------------- | ------- | ------ | ------
No log | 0.4 | 2 | 4.0923 | 0.0 | 4.0923 | 2.0229
No log | 0.8 | 4 | 2.9759 | -0.0242 | 2.9759 | 1.7251
No log | 1.2 | 6 | 1.4468 | 0.0 | 1.4468 | 1.2028
No log | 1.6 | 8 | 1.0334 | 0.2857 | 1.0334 | 1.0166
No log | 2.0 | 10 | 0.9606 | 0.2544 | 0.9606 | 0.9801
No log | 2.4 | 12 | 1.0353 | 0.1962 | 1.0353 | 1.0175
No log | 2.8 | 14 | 1.1293 | 0.1361 | 1.1293 | 1.0627
No log | 3.2 | 16 | 1.0317 | 0.1997 | 1.0317 | 1.0157
No log | 3.6 | 18 | 0.8784 | 0.3162 | 0.8784 | 0.9373
No log | 4.0 | 20 | 0.7874 | 0.3821 | 0.7874 | 0.8874
No log | 4.4 | 22 | 0.7384 | 0.4660 | 0.7384 | 0.8593
No log | 4.8 | 24 | 0.9148 | 0.3944 | 0.9148 | 0.9565
No log | 5.2 | 26 | 0.9277 | 0.4560 | 0.9277 | 0.9632
No log | 5.6 | 28 | 0.6802 | 0.5725 | 0.6802 | 0.8247
No log | 6.0 | 30 | 0.6621 | 0.6197 | 0.6621 | 0.8137
No log | 6.4 | 32 | 0.6916 | 0.6235 | 0.6916 | 0.8316
No log | 6.8 | 34 | 0.7130 | 0.6120 | 0.7129 | 0.8444
No log | 7.2 | 36 | 0.7368 | 0.6241 | 0.7368 | 0.8583
No log | 7.6 | 38 | 0.8262 | 0.6089 | 0.8262 | 0.9089
No log | 8.0 | 40 | 0.9690 | 0.4894 | 0.9690 | 0.9844
No log | 8.4 | 42 | 0.8814 | 0.5727 | 0.8814 | 0.9388
No log | 8.8 | 44 | 0.8681 | 0.5026 | 0.8681 | 0.9317
No log | 9.2 | 46 | 0.7804 | 0.6019 | 0.7804 | 0.8834
No log | 9.6 | 48 | 0.7676 | 0.6242 | 0.7676 | 0.8761
No log | 10.0 | 50 | 0.6799 | 0.5898 | 0.6799 | 0.8245
No log | 10.4 | 52 | 0.6863 | 0.6094 | 0.6863 | 0.8284
No log | 10.8 | 54 | 0.7178 | 0.5815 | 0.7178 | 0.8472
No log | 11.2 | 56 | 0.8726 | 0.5965 | 0.8726 | 0.9341
No log | 11.6 | 58 | 0.8296 | 0.6225 | 0.8296 | 0.9108
No log | 12.0 | 60 | 0.9171 | 0.5896 | 0.9171 | 0.9576
No log | 12.4 | 62 | 0.7419 | 0.6239 | 0.7419 | 0.8614
No log | 12.8 | 64 | 0.6620 | 0.6197 | 0.6620 | 0.8136
No log | 13.2 | 66 | 0.6618 | 0.6160 | 0.6618 | 0.8135
No log | 13.6 | 68 | 0.8943 | 0.5948 | 0.8943 | 0.9457
No log | 14.0 | 70 | 0.8615 | 0.5896 | 0.8615 | 0.9282
No log | 14.4 | 72 | 0.6340 | 0.6042 | 0.6340 | 0.7962
No log | 14.8 | 74 | 0.6028 | 0.6695 | 0.6028 | 0.7764
No log | 15.2 | 76 | 0.6113 | 0.6695 | 0.6113 | 0.7818
No log | 15.6 | 78 | 0.6103 | 0.6022 | 0.6103 | 0.7812
No log | 16.0 | 80 | 0.7449 | 0.5988 | 0.7449 | 0.8631
No log | 16.4 | 82 | 0.7361 | 0.5988 | 0.7361 | 0.8580
No log | 16.8 | 84 | 0.6563 | 0.5945 | 0.6563 | 0.8101
No log | 17.2 | 86 | 0.6119 | 0.6160 | 0.6119 | 0.7822
No log | 17.6 | 88 | 0.5946 | 0.6244 | 0.5946 | 0.7711
No log | 18.0 | 90 | 0.6269 | 0.6598 | 0.6269 | 0.7917
No log | 18.4 | 92 | 0.6063 | 0.6482 | 0.6063 | 0.7787
No log | 18.8 | 94 | 0.5969 | 0.6383 | 0.5969 | 0.7726
No log | 19.2 | 96 | 0.5764 | 0.6244 | 0.5764 | 0.7592
No log | 19.6 | 98 | 0.6075 | 0.6005 | 0.6075 | 0.7794
No log | 20.0 | 100 | 0.7462 | 0.6136 | 0.7462 | 0.8639
No log | 20.4 | 102 | 0.7146 | 0.6126 | 0.7146 | 0.8453
No log | 20.8 | 104 | 0.5959 | 0.6022 | 0.5959 | 0.7719
No log | 21.2 | 106 | 0.6028 | 0.6872 | 0.6028 | 0.7764
No log | 21.6 | 108 | 0.6379 | 0.7155 | 0.6379 | 0.7987
No log | 22.0 | 110 | 0.6157 | 0.6215 | 0.6157 | 0.7847
No log | 22.4 | 112 | 0.7477 | 0.6067 | 0.7477 | 0.8647
No log | 22.8 | 114 | 0.9173 | 0.6157 | 0.9173 | 0.9578
No log | 23.2 | 116 | 0.8441 | 0.6002 | 0.8441 | 0.9187
No log | 23.6 | 118 | 0.6946 | 0.6051 | 0.6946 | 0.8334
No log | 24.0 | 120 | 0.6811 | 0.6471 | 0.6811 | 0.8253
No log | 24.4 | 122 | 0.6469 | 0.6570 | 0.6469 | 0.8043
No log | 24.8 | 124 | 0.6075 | 0.6051 | 0.6075 | 0.7794
No log | 25.2 | 126 | 0.6577 | 0.5536 | 0.6577 | 0.8110
No log | 25.6 | 128 | 0.6863 | 0.6461 | 0.6863 | 0.8284
No log | 26.0 | 130 | 0.6679 | 0.5578 | 0.6679 | 0.8172
No log | 26.4 | 132 | 0.6490 | 0.6129 | 0.6490 | 0.8056
No log | 26.8 | 134 | 0.6686 | 0.6129 | 0.6686 | 0.8177
No log | 27.2 | 136 | 0.6978 | 0.5657 | 0.6978 | 0.8353
No log | 27.6 | 138 | 0.8516 | 0.6038 | 0.8516 | 0.9228
No log | 28.0 | 140 | 0.9436 | 0.5712 | 0.9436 | 0.9714
No log | 28.4 | 142 | 0.8213 | 0.6022 | 0.8213 | 0.9063
No log | 28.8 | 144 | 0.6700 | 0.5470 | 0.6700 | 0.8186
No log | 29.2 | 146 | 0.6646 | 0.6479 | 0.6646 | 0.8152
No log | 29.6 | 148 | 0.6617 | 0.6900 | 0.6617 | 0.8134
No log | 30.0 | 150 | 0.6549 | 0.6112 | 0.6549 | 0.8092
No log | 30.4 | 152 | 0.7518 | 0.6026 | 0.7518 | 0.8671
No log | 30.8 | 154 | 0.7626 | 0.6193 | 0.7626 | 0.8733
No log | 31.2 | 156 | 0.6756 | 0.5466 | 0.6756 | 0.8219
No log | 31.6 | 158 | 0.6638 | 0.5859 | 0.6638 | 0.8147
No log | 32.0 | 160 | 0.7096 | 0.6005 | 0.7096 | 0.8424
No log | 32.4 | 162 | 0.7150 | 0.5462 | 0.7150 | 0.8456
No log | 32.8 | 164 | 0.6981 | 0.5470 | 0.6981 | 0.8356
No log | 33.2 | 166 | 0.6615 | 0.5955 | 0.6615 | 0.8133
No log | 33.6 | 168 | 0.6568 | 0.5955 | 0.6568 | 0.8105
No log | 34.0 | 170 | 0.6557 | 0.5567 | 0.6557 | 0.8098
No log | 34.4 | 172 | 0.6679 | 0.5470 | 0.6679 | 0.8173
No log | 34.8 | 174 | 0.6898 | 0.5275 | 0.6898 | 0.8306
No log | 35.2 | 176 | 0.6970 | 0.5470 | 0.6970 | 0.8349
No log | 35.6 | 178 | 0.7395 | 0.5593 | 0.7395 | 0.8600
No log | 36.0 | 180 | 0.7886 | 0.5862 | 0.7886 | 0.8880
No log | 36.4 | 182 | 0.7544 | 0.5698 | 0.7544 | 0.8686
No log | 36.8 | 184 | 0.6773 | 0.5859 | 0.6773 | 0.8230
No log | 37.2 | 186 | 0.6379 | 0.5955 | 0.6379 | 0.7987
No log | 37.6 | 188 | 0.6222 | 0.5895 | 0.6222 | 0.7888
No log | 38.0 | 190 | 0.6045 | 0.6051 | 0.6045 | 0.7775
No log | 38.4 | 192 | 0.5981 | 0.6041 | 0.5981 | 0.7734
No log | 38.8 | 194 | 0.6066 | 0.6138 | 0.6066 | 0.7789
No log | 39.2 | 196 | 0.6333 | 0.5955 | 0.6333 | 0.7958
No log | 39.6 | 198 | 0.6592 | 0.5859 | 0.6592 | 0.8119
No log | 40.0 | 200 | 0.7012 | 0.5835 | 0.7012 | 0.8374
No log | 40.4 | 202 | 0.6785 | 0.5686 | 0.6785 | 0.8237
No log | 40.8 | 204 | 0.6495 | 0.5859 | 0.6495 | 0.8059
No log | 41.2 | 206 | 0.6380 | 0.6341 | 0.6380 | 0.7987
No log | 41.6 | 208 | 0.6314 | 0.6041 | 0.6314 | 0.7946
No log | 42.0 | 210 | 0.6384 | 0.5859 | 0.6384 | 0.7990
No log | 42.4 | 212 | 0.6760 | 0.5678 | 0.6760 | 0.8222
No log | 42.8 | 214 | 0.7192 | 0.6575 | 0.7192 | 0.8481
No log | 43.2 | 216 | 0.7171 | 0.6041 | 0.7171 | 0.8468
No log | 43.6 | 218 | 0.6859 | 0.6303 | 0.6859 | 0.8282
No log | 44.0 | 220 | 0.6667 | 0.5927 | 0.6667 | 0.8165
No log | 44.4 | 222 | 0.6281 | 0.5955 | 0.6281 | 0.7925
No log | 44.8 | 224 | 0.6088 | 0.5955 | 0.6088 | 0.7803
No log | 45.2 | 226 | 0.6130 | 0.6186 | 0.6130 | 0.7830
No log | 45.6 | 228 | 0.6484 | 0.6154 | 0.6484 | 0.8052
No log | 46.0 | 230 | 0.6507 | 0.6154 | 0.6507 | 0.8067
No log | 46.4 | 232 | 0.6340 | 0.5784 | 0.6340 | 0.7962
No log | 46.8 | 234 | 0.6286 | 0.5955 | 0.6286 | 0.7928
No log | 47.2 | 236 | 0.6360 | 0.6051 | 0.6360 | 0.7975
No log | 47.6 | 238 | 0.6469 | 0.5567 | 0.6469 | 0.8043
No log | 48.0 | 240 | 0.6573 | 0.5915 | 0.6573 | 0.8107
No log | 48.4 | 242 | 0.6990 | 0.5676 | 0.6990 | 0.8360
No log | 48.8 | 244 | 0.7242 | 0.5688 | 0.7242 | 0.8510
No log | 49.2 | 246 | 0.7101 | 0.6014 | 0.7101 | 0.8427
No log | 49.6 | 248 | 0.6501 | 0.5805 | 0.6501 | 0.8063
No log | 50.0 | 250 | 0.6156 | 0.5945 | 0.6156 | 0.7846
No log | 50.4 | 252 | 0.6187 | 0.6041 | 0.6187 | 0.7866
No log | 50.8 | 254 | 0.6141 | 0.6041 | 0.6141 | 0.7836
No log | 51.2 | 256 | 0.6229 | 0.6051 | 0.6229 | 0.7892
No log | 51.6 | 258 | 0.6646 | 0.5773 | 0.6646 | 0.8152
No log | 52.0 | 260 | 0.6961 | 0.6045 | 0.6961 | 0.8343
No log | 52.4 | 262 | 0.6643 | 0.6244 | 0.6643 | 0.8151
No log | 52.8 | 264 | 0.6314 | 0.5470 | 0.6314 | 0.7946
No log | 53.2 | 266 | 0.6270 | 0.5567 | 0.6270 | 0.7919
No log | 53.6 | 268 | 0.6288 | 0.5955 | 0.6288 | 0.7930
No log | 54.0 | 270 | 0.6265 | 0.5955 | 0.6265 | 0.7915
No log | 54.4 | 272 | 0.6214 | 0.5955 | 0.6214 | 0.7883
No log | 54.8 | 274 | 0.6212 | 0.5955 | 0.6212 | 0.7881
No log | 55.2 | 276 | 0.6176 | 0.5955 | 0.6176 | 0.7859
No log | 55.6 | 278 | 0.6036 | 0.5955 | 0.6036 | 0.7769
No log | 56.0 | 280 | 0.5958 | 0.5955 | 0.5958 | 0.7719
No log | 56.4 | 282 | 0.6000 | 0.5849 | 0.6000 | 0.7746
No log | 56.8 | 284 | 0.6204 | 0.5921 | 0.6204 | 0.7876
No log | 57.2 | 286 | 0.6546 | 0.5795 | 0.6546 | 0.8091
No log | 57.6 | 288 | 0.6758 | 0.5990 | 0.6758 | 0.8221
No log | 58.0 | 290 | 0.6858 | 0.6014 | 0.6858 | 0.8281
No log | 58.4 | 292 | 0.6777 | 0.6014 | 0.6777 | 0.8232
No log | 58.8 | 294 | 0.6617 | 0.6014 | 0.6617 | 0.8134
No log | 59.2 | 296 | 0.6568 | 0.5783 | 0.6568 | 0.8104
No log | 59.6 | 298 | 0.6333 | 0.5794 | 0.6333 | 0.7958
No log | 60.0 | 300 | 0.6323 | 0.5794 | 0.6323 | 0.7952
No log | 60.4 | 302 | 0.6313 | 0.5466 | 0.6313 | 0.7945
No log | 60.8 | 304 | 0.6308 | 0.5941 | 0.6308 | 0.7942
No log | 61.2 | 306 | 0.6321 | 0.5941 | 0.6321 | 0.7950
No log | 61.6 | 308 | 0.6371 | 0.5941 | 0.6371 | 0.7982
No log | 62.0 | 310 | 0.6300 | 0.5955 | 0.6300 | 0.7937
No log | 62.4 | 312 | 0.6272 | 0.5955 | 0.6272 | 0.7920
No log | 62.8 | 314 | 0.6392 | 0.5667 | 0.6392 | 0.7995
No log | 63.2 | 316 | 0.6623 | 0.5688 | 0.6623 | 0.8138
No log | 63.6 | 318 | 0.6765 | 0.5876 | 0.6765 | 0.8225
No log | 64.0 | 320 | 0.6770 | 0.5688 | 0.6770 | 0.8228
No log | 64.4 | 322 | 0.6622 | 0.5169 | 0.6622 | 0.8138
No log | 64.8 | 324 | 0.6528 | 0.5955 | 0.6528 | 0.8080
No log | 65.2 | 326 | 0.6450 | 0.5955 | 0.6450 | 0.8031
No log | 65.6 | 328 | 0.6351 | 0.5955 | 0.6351 | 0.7969
No log | 66.0 | 330 | 0.6252 | 0.5955 | 0.6252 | 0.7907
No log | 66.4 | 332 | 0.6165 | 0.5567 | 0.6165 | 0.7852
No log | 66.8 | 334 | 0.6194 | 0.5817 | 0.6194 | 0.7870
No log | 67.2 | 336 | 0.6214 | 0.5567 | 0.6214 | 0.7883
No log | 67.6 | 338 | 0.6262 | 0.5567 | 0.6262 | 0.7913
No log | 68.0 | 340 | 0.6388 | 0.5567 | 0.6388 | 0.7993
No log | 68.4 | 342 | 0.6555 | 0.5167 | 0.6555 | 0.8096
No log | 68.8 | 344 | 0.6685 | 0.5169 | 0.6685 | 0.8176
No log | 69.2 | 346 | 0.6898 | 0.5576 | 0.6898 | 0.8305
No log | 69.6 | 348 | 0.6894 | 0.5576 | 0.6894 | 0.8303
No log | 70.0 | 350 | 0.6711 | 0.5369 | 0.6711 | 0.8192
No log | 70.4 | 352 | 0.6578 | 0.5369 | 0.6578 | 0.8110
No log | 70.8 | 354 | 0.6405 | 0.5859 | 0.6405 | 0.8003
No log | 71.2 | 356 | 0.6284 | 0.5859 | 0.6284 | 0.7927
No log | 71.6 | 358 | 0.6273 | 0.5470 | 0.6273 | 0.7920
No log | 72.0 | 360 | 0.6270 | 0.5470 | 0.6270 | 0.7918
No log | 72.4 | 362 | 0.6297 | 0.5470 | 0.6297 | 0.7935
No log | 72.8 | 364 | 0.6329 | 0.5470 | 0.6329 | 0.7956
No log | 73.2 | 366 | 0.6257 | 0.5470 | 0.6257 | 0.7910
No log | 73.6 | 368 | 0.6235 | 0.5470 | 0.6235 | 0.7896
No log | 74.0 | 370 | 0.6200 | 0.5567 | 0.6200 | 0.7874
No log | 74.4 | 372 | 0.6185 | 0.5567 | 0.6185 | 0.7864
No log | 74.8 | 374 | 0.6214 | 0.5567 | 0.6214 | 0.7883
No log | 75.2 | 376 | 0.6262 | 0.5470 | 0.6262 | 0.7913
No log | 75.6 | 378 | 0.6323 | 0.5470 | 0.6323 | 0.7951
No log | 76.0 | 380 | 0.6410 | 0.5470 | 0.6410 | 0.8006
No log | 76.4 | 382 | 0.6577 | 0.5589 | 0.6577 | 0.8110
No log | 76.8 | 384 | 0.6857 | 0.5936 | 0.6857 | 0.8281
No log | 77.2 | 386 | 0.7132 | 0.6126 | 0.7132 | 0.8445
No log | 77.6 | 388 | 0.7144 | 0.5864 | 0.7144 | 0.8452
No log | 78.0 | 390 | 0.6959 | 0.5773 | 0.6959 | 0.8342
No log | 78.4 | 392 | 0.6804 | 0.6063 | 0.6804 | 0.8248
No log | 78.8 | 394 | 0.6629 | 0.5572 | 0.6629 | 0.8142
No log | 79.2 | 396 | 0.6452 | 0.5572 | 0.6452 | 0.8033
No log | 79.6 | 398 | 0.6359 | 0.5572 | 0.6359 | 0.7974
No log | 80.0 | 400 | 0.6384 | 0.5706 | 0.6384 | 0.7990
No log | 80.4 | 402 | 0.6424 | 0.5940 | 0.6424 | 0.8015
No log | 80.8 | 404 | 0.6404 | 0.5706 | 0.6404 | 0.8003
No log | 81.2 | 406 | 0.6365 | 0.5706 | 0.6365 | 0.7978
No log | 81.6 | 408 | 0.6285 | 0.5578 | 0.6285 | 0.7927
No log | 82.0 | 410 | 0.6242 | 0.5470 | 0.6242 | 0.7901
No log | 82.4 | 412 | 0.6219 | 0.5955 | 0.6219 | 0.7886
No log | 82.8 | 414 | 0.6239 | 0.5955 | 0.6239 | 0.7899
No log | 83.2 | 416 | 0.6261 | 0.5955 | 0.6261 | 0.7912
No log | 83.6 | 418 | 0.6286 | 0.5859 | 0.6286 | 0.7928
No log | 84.0 | 420 | 0.6329 | 0.5859 | 0.6329 | 0.7955
No log | 84.4 | 422 | 0.6396 | 0.5578 | 0.6396 | 0.7997
No log | 84.8 | 424 | 0.6456 | 0.5686 | 0.6456 | 0.8035
No log | 85.2 | 426 | 0.6475 | 0.5686 | 0.6475 | 0.8047
No log | 85.6 | 428 | 0.6521 | 0.5686 | 0.6521 | 0.8075
No log | 86.0 | 430 | 0.6541 | 0.5686 | 0.6541 | 0.8088
No log | 86.4 | 432 | 0.6562 | 0.5686 | 0.6562 | 0.8101
No log | 86.8 | 434 | 0.6592 | 0.5686 | 0.6592 | 0.8119
No log | 87.2 | 436 | 0.6627 | 0.5678 | 0.6627 | 0.8141
No log | 87.6 | 438 | 0.6638 | 0.5678 | 0.6638 | 0.8148
No log | 88.0 | 440 | 0.6627 | 0.5578 | 0.6627 | 0.8141
No log | 88.4 | 442 | 0.6576 | 0.5578 | 0.6576 | 0.8109
No log | 88.8 | 444 | 0.6545 | 0.5578 | 0.6545 | 0.8090
No log | 89.2 | 446 | 0.6570 | 0.5578 | 0.6570 | 0.8106
No log | 89.6 | 448 | 0.6617 | 0.5578 | 0.6617 | 0.8135
No log | 90.0 | 450 | 0.6626 | 0.5572 | 0.6626 | 0.8140
No log | 90.4 | 452 | 0.6679 | 0.5678 | 0.6679 | 0.8172
No log | 90.8 | 454 | 0.6741 | 0.5382 | 0.6741 | 0.8210
No log | 91.2 | 456 | 0.6763 | 0.5382 | 0.6763 | 0.8223
No log | 91.6 | 458 | 0.6752 | 0.5382 | 0.6752 | 0.8217
No log | 92.0 | 460 | 0.6706 | 0.5382 | 0.6706 | 0.8189
No log | 92.4 | 462 | 0.6635 | 0.5686 | 0.6635 | 0.8145
No log | 92.8 | 464 | 0.6607 | 0.5686 | 0.6607 | 0.8128
No log | 93.2 | 466 | 0.6605 | 0.5686 | 0.6605 | 0.8127
No log | 93.6 | 468 | 0.6594 | 0.5686 | 0.6594 | 0.8120
No log | 94.0 | 470 | 0.6567 | 0.5578 | 0.6567 | 0.8104
No log | 94.4 | 472 | 0.6556 | 0.5578 | 0.6556 | 0.8097
No log | 94.8 | 474 | 0.6537 | 0.5578 | 0.6537 | 0.8085
No log | 95.2 | 476 | 0.6513 | 0.5578 | 0.6513 | 0.8070
No log | 95.6 | 478 | 0.6488 | 0.5578 | 0.6488 | 0.8055
No log | 96.0 | 480 | 0.6459 | 0.5578 | 0.6459 | 0.8037
No log | 96.4 | 482 | 0.6438 | 0.5578 | 0.6438 | 0.8024
No log | 96.8 | 484 | 0.6430 | 0.5470 | 0.6430 | 0.8019
No log | 97.2 | 486 | 0.6439 | 0.5578 | 0.6439 | 0.8025
No log | 97.6 | 488 | 0.6454 | 0.5578 | 0.6454 | 0.8033
No log | 98.0 | 490 | 0.6468 | 0.5578 | 0.6468 | 0.8043
No log | 98.4 | 492 | 0.6477 | 0.5578 | 0.6477 | 0.8048
No log | 98.8 | 494 | 0.6487 | 0.5578 | 0.6487 | 0.8054
No log | 99.2 | 496 | 0.6495 | 0.5578 | 0.6495 | 0.8059
No log | 99.6 | 498 | 0.6501 | 0.5578 | 0.6501 | 0.8063
0.1647 | 100.0 | 500 | 0.6504 | 0.5578 | 0.6504 | 0.8065
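
For reference, the QWK column is the quadratic weighted kappa between predicted and gold scores, which rewards predictions that land near the true ordinal level. Below is a sketch of how the three metric columns can be computed with scikit-learn; rounding the continuous predictions to integer score levels before computing kappa is an assumption, not something the card states:

```python
# Sketch of the table's metric columns (QWK, MSE, RMSE).
# Rounding continuous predictions to integer score levels before kappa is an
# assumption about this card's evaluation, not something it documents.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(preds, labels):
    preds = np.asarray(preds, dtype=float)
    labels = np.asarray(labels, dtype=float)
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

print(eval_metrics([1.2, 2.8, 3.1], [1, 3, 3]))
```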

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
