MayBashendy committed
Commit 2ae78e9 · verified · 1 Parent(s): 2b50be4

End of training

Files changed (1): README.md (+61 −61)

README.md CHANGED
@@ -4,21 +4,21 @@ base_model: aubmindlab/bert-base-arabertv02
 tags:
 - generated_from_trainer
 model-index:
-- name: arabert_baseline_organization_task1_fold0
+- name: arabert_baseline_organization_task1_fold1
   results: []
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-# arabert_baseline_organization_task1_fold0
+# arabert_baseline_organization_task1_fold1
 
 This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.6571
-- Qwk: 0.7618
-- Mse: 0.6571
-- Rmse: 0.8106
+- Loss: 0.7406
+- Qwk: 0.7336
+- Mse: 0.7406
+- Rmse: 0.8606
 
 ## Model description
 
@@ -49,61 +49,61 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
 |:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
-| No log | 0.1818 | 2 | 4.7070 | -0.0581 | 4.7070 | 2.1696 |
-| No log | 0.3636 | 4 | 2.5400 | 0.0 | 2.5400 | 1.5937 |
-| No log | 0.5455 | 6 | 1.5937 | 0.1750 | 1.5937 | 1.2624 |
-| No log | 0.7273 | 8 | 1.0172 | 0.1582 | 1.0172 | 1.0086 |
-| No log | 0.9091 | 10 | 0.9129 | 0.4929 | 0.9129 | 0.9555 |
-| No log | 1.0909 | 12 | 0.8654 | 0.4763 | 0.8654 | 0.9302 |
-| No log | 1.2727 | 14 | 0.8549 | 0.5268 | 0.8549 | 0.9246 |
-| No log | 1.4545 | 16 | 0.7991 | 0.4934 | 0.7991 | 0.8939 |
-| No log | 1.6364 | 18 | 0.8510 | 0.5288 | 0.8510 | 0.9225 |
-| No log | 1.8182 | 20 | 0.8834 | 0.5288 | 0.8834 | 0.9399 |
-| No log | 2.0 | 22 | 0.9534 | 0.5304 | 0.9534 | 0.9764 |
-| No log | 2.1818 | 24 | 0.8760 | 0.5698 | 0.8760 | 0.9360 |
-| No log | 2.3636 | 26 | 0.7263 | 0.6182 | 0.7263 | 0.8522 |
-| No log | 2.5455 | 28 | 0.7632 | 0.5093 | 0.7632 | 0.8736 |
-| No log | 2.7273 | 30 | 0.7407 | 0.5845 | 0.7407 | 0.8606 |
-| No log | 2.9091 | 32 | 0.8857 | 0.5674 | 0.8857 | 0.9411 |
-| No log | 3.0909 | 34 | 0.9844 | 0.6301 | 0.9844 | 0.9922 |
-| No log | 3.2727 | 36 | 0.9963 | 0.6301 | 0.9963 | 0.9981 |
-| No log | 3.4545 | 38 | 0.8665 | 0.6209 | 0.8665 | 0.9308 |
-| No log | 3.6364 | 40 | 0.7832 | 0.6354 | 0.7832 | 0.8850 |
-| No log | 3.8182 | 42 | 0.8425 | 0.6715 | 0.8425 | 0.9179 |
-| No log | 4.0 | 44 | 0.8849 | 0.5591 | 0.8849 | 0.9407 |
-| No log | 4.1818 | 46 | 0.8290 | 0.5862 | 0.8290 | 0.9105 |
-| No log | 4.3636 | 48 | 0.7263 | 0.6866 | 0.7263 | 0.8523 |
-| No log | 4.5455 | 50 | 0.7922 | 0.7786 | 0.7922 | 0.8901 |
-| No log | 4.7273 | 52 | 0.9204 | 0.7037 | 0.9204 | 0.9594 |
-| No log | 4.9091 | 54 | 0.8995 | 0.6836 | 0.8995 | 0.9484 |
-| No log | 5.0909 | 56 | 0.7549 | 0.6968 | 0.7549 | 0.8688 |
-| No log | 5.2727 | 58 | 0.6694 | 0.6539 | 0.6694 | 0.8182 |
-| No log | 5.4545 | 60 | 0.6618 | 0.6102 | 0.6618 | 0.8135 |
-| No log | 5.6364 | 62 | 0.6895 | 0.7449 | 0.6895 | 0.8304 |
-| No log | 5.8182 | 64 | 0.7854 | 0.7181 | 0.7854 | 0.8863 |
-| No log | 6.0 | 66 | 0.8328 | 0.7181 | 0.8328 | 0.9126 |
-| No log | 6.1818 | 68 | 0.8134 | 0.6764 | 0.8134 | 0.9019 |
-| No log | 6.3636 | 70 | 0.8151 | 0.7363 | 0.8151 | 0.9028 |
-| No log | 6.5455 | 72 | 0.7853 | 0.7363 | 0.7853 | 0.8862 |
-| No log | 6.7273 | 74 | 0.7443 | 0.7786 | 0.7443 | 0.8627 |
-| No log | 6.9091 | 76 | 0.7261 | 0.7618 | 0.7261 | 0.8521 |
-| No log | 7.0909 | 78 | 0.7315 | 0.7008 | 0.7315 | 0.8553 |
-| No log | 7.2727 | 80 | 0.7472 | 0.7181 | 0.7472 | 0.8644 |
-| No log | 7.4545 | 82 | 0.7769 | 0.7181 | 0.7769 | 0.8814 |
-| No log | 7.6364 | 84 | 0.7755 | 0.6893 | 0.7755 | 0.8807 |
-| No log | 7.8182 | 86 | 0.7231 | 0.7008 | 0.7231 | 0.8504 |
-| No log | 8.0 | 88 | 0.6836 | 0.7008 | 0.6836 | 0.8268 |
-| No log | 8.1818 | 90 | 0.6640 | 0.7618 | 0.6640 | 0.8148 |
-| No log | 8.3636 | 92 | 0.6503 | 0.7618 | 0.6503 | 0.8064 |
-| No log | 8.5455 | 94 | 0.6490 | 0.7371 | 0.6490 | 0.8056 |
-| No log | 8.7273 | 96 | 0.6511 | 0.7618 | 0.6511 | 0.8069 |
-| No log | 8.9091 | 98 | 0.6544 | 0.7618 | 0.6544 | 0.8090 |
-| No log | 9.0909 | 100 | 0.6537 | 0.7618 | 0.6537 | 0.8085 |
-| No log | 9.2727 | 102 | 0.6535 | 0.7618 | 0.6535 | 0.8084 |
-| No log | 9.4545 | 104 | 0.6564 | 0.7618 | 0.6564 | 0.8102 |
-| No log | 9.6364 | 106 | 0.6574 | 0.7618 | 0.6574 | 0.8108 |
-| No log | 9.8182 | 108 | 0.6573 | 0.7618 | 0.6573 | 0.8107 |
-| No log | 10.0 | 110 | 0.6571 | 0.7618 | 0.6571 | 0.8106 |
+| No log | 0.1818 | 2 | 3.0376 | 0.0094 | 3.0376 | 1.7429 |
+| No log | 0.3636 | 4 | 1.6849 | -0.0143 | 1.6849 | 1.2980 |
+| No log | 0.5455 | 6 | 1.2236 | 0.0597 | 1.2236 | 1.1061 |
+| No log | 0.7273 | 8 | 1.2683 | -0.0302 | 1.2683 | 1.1262 |
+| No log | 0.9091 | 10 | 1.1217 | -0.0161 | 1.1217 | 1.0591 |
+| No log | 1.0909 | 12 | 0.9264 | -0.0467 | 0.9264 | 0.9625 |
+| No log | 1.2727 | 14 | 0.8294 | -0.0284 | 0.8294 | 0.9107 |
+| No log | 1.4545 | 16 | 0.8008 | 0.0427 | 0.8008 | 0.8949 |
+| No log | 1.6364 | 18 | 0.7784 | 0.0427 | 0.7784 | 0.8823 |
+| No log | 1.8182 | 20 | 0.7216 | 0.0 | 0.7216 | 0.8495 |
+| No log | 2.0 | 22 | 0.7271 | 0.2857 | 0.7271 | 0.8527 |
+| No log | 2.1818 | 24 | 0.7480 | 0.3090 | 0.7480 | 0.8649 |
+| No log | 2.3636 | 26 | 0.6967 | 0.3109 | 0.6967 | 0.8347 |
+| No log | 2.5455 | 28 | 0.6857 | 0.4 | 0.6857 | 0.8281 |
+| No log | 2.7273 | 30 | 0.6479 | 0.5926 | 0.6479 | 0.8049 |
+| No log | 2.9091 | 32 | 0.6868 | 0.5926 | 0.6868 | 0.8287 |
+| No log | 3.0909 | 34 | 0.6055 | 0.6316 | 0.6055 | 0.7781 |
+| No log | 3.2727 | 36 | 0.5330 | 0.5776 | 0.5330 | 0.7301 |
+| No log | 3.4545 | 38 | 0.4870 | 0.6345 | 0.4870 | 0.6979 |
+| No log | 3.6364 | 40 | 0.5030 | 0.6638 | 0.5030 | 0.7093 |
+| No log | 3.8182 | 42 | 0.4888 | 0.5767 | 0.4888 | 0.6991 |
+| No log | 4.0 | 44 | 0.4738 | 0.5767 | 0.4738 | 0.6884 |
+| No log | 4.1818 | 46 | 0.5713 | 0.6410 | 0.5713 | 0.7559 |
+| No log | 4.3636 | 48 | 0.7034 | 0.7482 | 0.7034 | 0.8387 |
+| No log | 4.5455 | 50 | 0.7505 | 0.7426 | 0.7505 | 0.8663 |
+| No log | 4.7273 | 52 | 0.7736 | 0.7426 | 0.7736 | 0.8796 |
+| No log | 4.9091 | 54 | 0.6137 | 0.7390 | 0.6137 | 0.7834 |
+| No log | 5.0909 | 56 | 0.6615 | 0.7390 | 0.6615 | 0.8133 |
+| No log | 5.2727 | 58 | 0.8090 | 0.7138 | 0.8090 | 0.8995 |
+| No log | 5.4545 | 60 | 0.7775 | 0.7287 | 0.7775 | 0.8818 |
+| No log | 5.6364 | 62 | 0.6471 | 0.7107 | 0.6471 | 0.8045 |
+| No log | 5.8182 | 64 | 0.5071 | 0.6547 | 0.5071 | 0.7121 |
+| No log | 6.0 | 66 | 0.4469 | 0.6500 | 0.4469 | 0.6685 |
+| No log | 6.1818 | 68 | 0.4640 | 0.6866 | 0.4640 | 0.6812 |
+| No log | 6.3636 | 70 | 0.6052 | 0.7266 | 0.6052 | 0.7779 |
+| No log | 6.5455 | 72 | 0.8728 | 0.6873 | 0.8728 | 0.9342 |
+| No log | 6.7273 | 74 | 0.9762 | 0.7219 | 0.9762 | 0.9880 |
+| No log | 6.9091 | 76 | 0.8461 | 0.6873 | 0.8461 | 0.9198 |
+| No log | 7.0909 | 78 | 0.6379 | 0.7181 | 0.6379 | 0.7987 |
+| No log | 7.2727 | 80 | 0.5480 | 0.6839 | 0.5480 | 0.7403 |
+| No log | 7.4545 | 82 | 0.5355 | 0.6839 | 0.5355 | 0.7318 |
+| No log | 7.6364 | 84 | 0.5699 | 0.7162 | 0.5699 | 0.7549 |
+| No log | 7.8182 | 86 | 0.6654 | 0.6915 | 0.6654 | 0.8157 |
+| No log | 8.0 | 88 | 0.8127 | 0.7 | 0.8127 | 0.9015 |
+| No log | 8.1818 | 90 | 0.8907 | 0.7 | 0.8907 | 0.9438 |
+| No log | 8.3636 | 92 | 0.8716 | 0.7 | 0.8716 | 0.9336 |
+| No log | 8.5455 | 94 | 0.8174 | 0.72 | 0.8174 | 0.9041 |
+| No log | 8.7273 | 96 | 0.7611 | 0.7336 | 0.7611 | 0.8724 |
+| No log | 8.9091 | 98 | 0.7738 | 0.7336 | 0.7738 | 0.8797 |
+| No log | 9.0909 | 100 | 0.7710 | 0.7336 | 0.7710 | 0.8781 |
+| No log | 9.2727 | 102 | 0.7508 | 0.7336 | 0.7508 | 0.8665 |
+| No log | 9.4545 | 104 | 0.7428 | 0.7336 | 0.7428 | 0.8619 |
+| No log | 9.6364 | 106 | 0.7452 | 0.7336 | 0.7452 | 0.8633 |
+| No log | 9.8182 | 108 | 0.7414 | 0.7336 | 0.7414 | 0.8611 |
+| No log | 10.0 | 110 | 0.7406 | 0.7336 | 0.7406 | 0.8606 |
 
 
 ### Framework versions
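
The card reports Loss/Mse, Rmse, and Qwk (Cohen's quadratic weighted kappa) on the evaluation set. As a minimal sketch of how such metrics can be reproduced with scikit-learn — using hypothetical `y_true`/`y_pred_raw` arrays, since the actual label scale of this scoring task is not stated in the card — note that QWK needs discrete labels, so continuous regression outputs are rounded first:

```python
# Sketch of the card's evaluation metrics (Mse, Rmse, Qwk).
# y_true / y_pred_raw are hypothetical; the real label range is not given here.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 3, 2, 1])                   # gold scores (integers)
y_pred_raw = np.array([0.2, 1.1, 1.8, 2.6, 2.2, 0.9])   # raw regression outputs

mse = mean_squared_error(y_true, y_pred_raw)
rmse = np.sqrt(mse)

# Quadratic weighted kappa requires discrete labels: round predictions first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred_raw).astype(int),
                        weights="quadratic")

print(f"Mse: {mse:.4f}  Rmse: {rmse:.4f}  Qwk: {qwk:.4f}")
```

With the trainer's setup, Rmse is simply the square root of Mse, which is why those two columns track each other exactly in the table above.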