End of training

README.md CHANGED
@@ -4,21 +4,21 @@ base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: arabert_baseline_organization_task1_fold1
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# arabert_baseline_organization_task1_fold1

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
It achieves the following results on the evaluation set:
- Loss: 0.7406
- Qwk: 0.7336
- Mse: 0.7406
- Rmse: 0.8606
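Qwk is the quadratic weighted kappa (Cohen's kappa with quadratic penalty weights), a standard agreement metric for ordinal scoring tasks. As a rough guide to what the number measures, here is a minimal pure-Python sketch; it is illustrative only, not the evaluation code the Trainer used:

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over ordinal labels 0..n_classes-1."""
    n = len(y_true)
    # Observed confusion matrix.
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1.0
    # Marginal histograms, used to build the chance-agreement (expected) matrix.
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(col) for col in zip(*observed)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty, 0 on the diagonal
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 4))  # 1.0
print(quadratic_weighted_kappa([0, 1], [1, 0], 2))              # -1.0
```

Perfect agreement gives 1.0, chance-level agreement about 0.0, and systematic disagreement is negative, which is why the early training steps in the results table below show slightly negative Qwk.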

## Model description

@@ -49,61 +49,61 @@ The following hyperparameters were used during training:

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.1818 | 2 | 3.0376 | 0.0094 | 3.0376 | 1.7429 |
| No log | 0.3636 | 4 | 1.6849 | -0.0143 | 1.6849 | 1.2980 |
| No log | 0.5455 | 6 | 1.2236 | 0.0597 | 1.2236 | 1.1061 |
| No log | 0.7273 | 8 | 1.2683 | -0.0302 | 1.2683 | 1.1262 |
| No log | 0.9091 | 10 | 1.1217 | -0.0161 | 1.1217 | 1.0591 |
| No log | 1.0909 | 12 | 0.9264 | -0.0467 | 0.9264 | 0.9625 |
| No log | 1.2727 | 14 | 0.8294 | -0.0284 | 0.8294 | 0.9107 |
| No log | 1.4545 | 16 | 0.8008 | 0.0427 | 0.8008 | 0.8949 |
| No log | 1.6364 | 18 | 0.7784 | 0.0427 | 0.7784 | 0.8823 |
| No log | 1.8182 | 20 | 0.7216 | 0.0 | 0.7216 | 0.8495 |
| No log | 2.0 | 22 | 0.7271 | 0.2857 | 0.7271 | 0.8527 |
| No log | 2.1818 | 24 | 0.7480 | 0.3090 | 0.7480 | 0.8649 |
| No log | 2.3636 | 26 | 0.6967 | 0.3109 | 0.6967 | 0.8347 |
| No log | 2.5455 | 28 | 0.6857 | 0.4 | 0.6857 | 0.8281 |
| No log | 2.7273 | 30 | 0.6479 | 0.5926 | 0.6479 | 0.8049 |
| No log | 2.9091 | 32 | 0.6868 | 0.5926 | 0.6868 | 0.8287 |
| No log | 3.0909 | 34 | 0.6055 | 0.6316 | 0.6055 | 0.7781 |
| No log | 3.2727 | 36 | 0.5330 | 0.5776 | 0.5330 | 0.7301 |
| No log | 3.4545 | 38 | 0.4870 | 0.6345 | 0.4870 | 0.6979 |
| No log | 3.6364 | 40 | 0.5030 | 0.6638 | 0.5030 | 0.7093 |
| No log | 3.8182 | 42 | 0.4888 | 0.5767 | 0.4888 | 0.6991 |
| No log | 4.0 | 44 | 0.4738 | 0.5767 | 0.4738 | 0.6884 |
| No log | 4.1818 | 46 | 0.5713 | 0.6410 | 0.5713 | 0.7559 |
| No log | 4.3636 | 48 | 0.7034 | 0.7482 | 0.7034 | 0.8387 |
| No log | 4.5455 | 50 | 0.7505 | 0.7426 | 0.7505 | 0.8663 |
| No log | 4.7273 | 52 | 0.7736 | 0.7426 | 0.7736 | 0.8796 |
| No log | 4.9091 | 54 | 0.6137 | 0.7390 | 0.6137 | 0.7834 |
| No log | 5.0909 | 56 | 0.6615 | 0.7390 | 0.6615 | 0.8133 |
| No log | 5.2727 | 58 | 0.8090 | 0.7138 | 0.8090 | 0.8995 |
| No log | 5.4545 | 60 | 0.7775 | 0.7287 | 0.7775 | 0.8818 |
| No log | 5.6364 | 62 | 0.6471 | 0.7107 | 0.6471 | 0.8045 |
| No log | 5.8182 | 64 | 0.5071 | 0.6547 | 0.5071 | 0.7121 |
| No log | 6.0 | 66 | 0.4469 | 0.6500 | 0.4469 | 0.6685 |
| No log | 6.1818 | 68 | 0.4640 | 0.6866 | 0.4640 | 0.6812 |
| No log | 6.3636 | 70 | 0.6052 | 0.7266 | 0.6052 | 0.7779 |
| No log | 6.5455 | 72 | 0.8728 | 0.6873 | 0.8728 | 0.9342 |
| No log | 6.7273 | 74 | 0.9762 | 0.7219 | 0.9762 | 0.9880 |
| No log | 6.9091 | 76 | 0.8461 | 0.6873 | 0.8461 | 0.9198 |
| No log | 7.0909 | 78 | 0.6379 | 0.7181 | 0.6379 | 0.7987 |
| No log | 7.2727 | 80 | 0.5480 | 0.6839 | 0.5480 | 0.7403 |
| No log | 7.4545 | 82 | 0.5355 | 0.6839 | 0.5355 | 0.7318 |
| No log | 7.6364 | 84 | 0.5699 | 0.7162 | 0.5699 | 0.7549 |
| No log | 7.8182 | 86 | 0.6654 | 0.6915 | 0.6654 | 0.8157 |
| No log | 8.0 | 88 | 0.8127 | 0.7 | 0.8127 | 0.9015 |
| No log | 8.1818 | 90 | 0.8907 | 0.7 | 0.8907 | 0.9438 |
| No log | 8.3636 | 92 | 0.8716 | 0.7 | 0.8716 | 0.9336 |
| No log | 8.5455 | 94 | 0.8174 | 0.72 | 0.8174 | 0.9041 |
| No log | 8.7273 | 96 | 0.7611 | 0.7336 | 0.7611 | 0.8724 |
| No log | 8.9091 | 98 | 0.7738 | 0.7336 | 0.7738 | 0.8797 |
| No log | 9.0909 | 100 | 0.7710 | 0.7336 | 0.7710 | 0.8781 |
| No log | 9.2727 | 102 | 0.7508 | 0.7336 | 0.7508 | 0.8665 |
| No log | 9.4545 | 104 | 0.7428 | 0.7336 | 0.7428 | 0.8619 |
| No log | 9.6364 | 106 | 0.7452 | 0.7336 | 0.7452 | 0.8633 |
| No log | 9.8182 | 108 | 0.7414 | 0.7336 | 0.7414 | 0.8611 |
| No log | 10.0 | 110 | 0.7406 | 0.7336 | 0.7406 | 0.8606 |
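In the training log, the Validation Loss and Mse columns agree and Rmse is the square root of Mse, which suggests an MSE regression objective (an inference from the numbers, not something the card states). A quick consistency check over a few rows copied from the table, including the best-Qwk and final checkpoints as I read them:

```python
import math

# Selected (epoch, validation_loss, qwk, mse, rmse) rows from the table above.
rows = [
    (0.1818, 3.0376, 0.0094, 3.0376, 1.7429),
    (4.3636, 0.7034, 0.7482, 0.7034, 0.8387),  # highest validation Qwk in the log
    (10.0, 0.7406, 0.7336, 0.7406, 0.8606),    # final epoch; matches the reported eval results
]

for epoch, loss, qwk, mse, rmse in rows:
    # Loss equals MSE row-by-row, and RMSE is its square root (up to rounding).
    assert math.isclose(loss, mse)
    assert math.isclose(rmse, math.sqrt(mse), abs_tol=5e-5)

best = max(rows, key=lambda r: r[2])
print(f"best Qwk {best[2]} at epoch {best[0]}")  # best Qwk 0.7482 at epoch 4.3636
```

Note that validation Qwk peaked around epoch 4.4 and then oscillated, so the final checkpoint is not necessarily the best one by that metric.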

### Framework versions