salbatarni
committed on
End of training
README.md CHANGED

---
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: arabert_cross_organization_task1_fold4
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# arabert_cross_organization_task1_fold4

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4588
- Qwk: 0.6885
- Mse: 0.4588
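
The card does not define these metric names; a plausible reading is that Qwk is the quadratic weighted kappa and Mse the mean squared error over the evaluation set (the identical Loss and Mse values suggest an MSE training objective, though the card does not state this). A minimal sketch of how such metrics could be computed, assuming integer-like target scores and rounding of continuous predictions:

```python
# Hedged sketch: "Qwk" is taken to be quadratic weighted kappa and "Mse" mean
# squared error. Rounding continuous predictions to integer scores before
# computing kappa is an assumption, not something stated in this card.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def qwk(y_true, y_pred):
    # Quadratic weighted kappa over discretized scores.
    return cohen_kappa_score(np.rint(y_true).astype(int),
                             np.rint(y_pred).astype(int),
                             weights="quadratic")

def mse(y_true, y_pred):
    # Mean squared error over the raw (continuous) predictions.
    return mean_squared_error(y_true, y_pred)
```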

## Model description

### Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| No log | 0.125 | 2 | 3.0429 | 0.0044 | 3.0429 |
| No log | 0.25 | 4 | 1.6578 | 0.1373 | 1.6578 |
| No log | 0.375 | 6 | 0.9445 | 0.3393 | 0.9445 |
| No log | 0.5 | 8 | 0.7446 | 0.4700 | 0.7446 |
| No log | 0.625 | 10 | 0.8109 | 0.4239 | 0.8109 |
| No log | 0.75 | 12 | 0.5652 | 0.6068 | 0.5652 |
| No log | 0.875 | 14 | 0.6877 | 0.6129 | 0.6877 |
| No log | 1.0 | 16 | 0.5401 | 0.6022 | 0.5401 |
| No log | 1.125 | 18 | 0.5571 | 0.5613 | 0.5571 |
| No log | 1.25 | 20 | 0.4854 | 0.6440 | 0.4854 |
| No log | 1.375 | 22 | 0.5443 | 0.7366 | 0.5443 |
| No log | 1.5 | 24 | 0.5077 | 0.7444 | 0.5077 |
| No log | 1.625 | 26 | 0.5015 | 0.6266 | 0.5015 |
| No log | 1.75 | 28 | 0.5012 | 0.6164 | 0.5012 |
| No log | 1.875 | 30 | 0.4504 | 0.7043 | 0.4504 |
| No log | 2.0 | 32 | 0.4864 | 0.7187 | 0.4864 |
| No log | 2.125 | 34 | 0.4305 | 0.7243 | 0.4305 |
| No log | 2.25 | 36 | 0.4572 | 0.6579 | 0.4572 |
| No log | 2.375 | 38 | 0.4545 | 0.7032 | 0.4545 |
| No log | 2.5 | 40 | 0.4159 | 0.7123 | 0.4159 |
| No log | 2.625 | 42 | 0.4122 | 0.7591 | 0.4122 |
| No log | 2.75 | 44 | 0.4424 | 0.7617 | 0.4424 |
| No log | 2.875 | 46 | 0.4110 | 0.7600 | 0.4110 |
| No log | 3.0 | 48 | 0.3993 | 0.7372 | 0.3993 |
| No log | 3.125 | 50 | 0.3990 | 0.7391 | 0.3990 |
| No log | 3.25 | 52 | 0.3923 | 0.7306 | 0.3923 |
| No log | 3.375 | 54 | 0.4375 | 0.7685 | 0.4375 |
| No log | 3.5 | 56 | 0.4628 | 0.7698 | 0.4628 |
| No log | 3.625 | 58 | 0.4089 | 0.7365 | 0.4089 |
| No log | 3.75 | 60 | 0.4113 | 0.7238 | 0.4113 |
| No log | 3.875 | 62 | 0.4117 | 0.7308 | 0.4117 |
| No log | 4.0 | 64 | 0.4183 | 0.7175 | 0.4183 |
| No log | 4.125 | 66 | 0.4326 | 0.7175 | 0.4326 |
| No log | 4.25 | 68 | 0.4439 | 0.7360 | 0.4439 |
| No log | 4.375 | 70 | 0.4530 | 0.7375 | 0.4530 |
| No log | 4.5 | 72 | 0.4458 | 0.7040 | 0.4458 |
| No log | 4.625 | 74 | 0.4431 | 0.7054 | 0.4431 |
| No log | 4.75 | 76 | 0.4403 | 0.6980 | 0.4403 |
| No log | 4.875 | 78 | 0.4350 | 0.7144 | 0.4350 |
| No log | 5.0 | 80 | 0.4311 | 0.7511 | 0.4311 |
| No log | 5.125 | 82 | 0.4257 | 0.7418 | 0.4257 |
| No log | 5.25 | 84 | 0.4298 | 0.7174 | 0.4298 |
| No log | 5.375 | 86 | 0.4420 | 0.6877 | 0.4420 |
| No log | 5.5 | 88 | 0.4344 | 0.7174 | 0.4344 |
| No log | 5.625 | 90 | 0.4324 | 0.7146 | 0.4324 |
| No log | 5.75 | 92 | 0.4363 | 0.7566 | 0.4363 |
| No log | 5.875 | 94 | 0.4499 | 0.7689 | 0.4499 |
| No log | 6.0 | 96 | 0.4217 | 0.7367 | 0.4217 |
| No log | 6.125 | 98 | 0.4252 | 0.7237 | 0.4252 |
| No log | 6.25 | 100 | 0.4235 | 0.7141 | 0.4235 |
| No log | 6.375 | 102 | 0.4211 | 0.7230 | 0.4211 |
| No log | 6.5 | 104 | 0.4285 | 0.7493 | 0.4285 |
| No log | 6.625 | 106 | 0.4367 | 0.7530 | 0.4367 |
| No log | 6.75 | 108 | 0.4214 | 0.7457 | 0.4214 |
| No log | 6.875 | 110 | 0.4380 | 0.6930 | 0.4380 |
| No log | 7.0 | 112 | 0.4555 | 0.6727 | 0.4555 |
| No log | 7.125 | 114 | 0.4358 | 0.6947 | 0.4358 |
| No log | 7.25 | 116 | 0.4270 | 0.7277 | 0.4270 |
| No log | 7.375 | 118 | 0.4349 | 0.7457 | 0.4349 |
| No log | 7.5 | 120 | 0.4430 | 0.7382 | 0.4430 |
| No log | 7.625 | 122 | 0.4539 | 0.7257 | 0.4539 |
| No log | 7.75 | 124 | 0.4623 | 0.7204 | 0.4623 |
| No log | 7.875 | 126 | 0.4640 | 0.7110 | 0.4640 |
| No log | 8.0 | 128 | 0.4644 | 0.7115 | 0.4644 |
| No log | 8.125 | 130 | 0.4639 | 0.7095 | 0.4639 |
| No log | 8.25 | 132 | 0.4612 | 0.7073 | 0.4612 |
| No log | 8.375 | 134 | 0.4652 | 0.6865 | 0.4652 |
| No log | 8.5 | 136 | 0.4689 | 0.6753 | 0.4689 |
| No log | 8.625 | 138 | 0.4608 | 0.6849 | 0.4608 |
| No log | 8.75 | 140 | 0.4553 | 0.6907 | 0.4553 |
| No log | 8.875 | 142 | 0.4538 | 0.6930 | 0.4538 |
| No log | 9.0 | 144 | 0.4537 | 0.7172 | 0.4537 |
| No log | 9.125 | 146 | 0.4564 | 0.7273 | 0.4564 |
| No log | 9.25 | 148 | 0.4582 | 0.7294 | 0.4582 |
| No log | 9.375 | 150 | 0.4572 | 0.7267 | 0.4572 |
| No log | 9.5 | 152 | 0.4559 | 0.7093 | 0.4559 |
| No log | 9.625 | 154 | 0.4566 | 0.7000 | 0.4566 |
| No log | 9.75 | 156 | 0.4582 | 0.6885 | 0.4582 |
| No log | 9.875 | 158 | 0.4588 | 0.6885 | 0.4588 |
| No log | 10.0 | 160 | 0.4588 | 0.6885 | 0.4588 |
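
The table reports a validation pass every 2 steps across 10 epochs (160 steps in total); the constant "No log" in the Training Loss column typically means the training loss was never logged at these intervals. Below is a hedged sketch of a `Trainer` configuration that would produce such an evaluation schedule. Only `eval_steps=2` and `num_train_epochs=10` are read off the table; every other value, name, and dataset is an illustrative assumption and not taken from this card.

```python
# Hedged sketch of a Trainer setup yielding an eval table like the one above.
# Only eval_steps=2 and num_train_epochs=10 come from the table; the model
# head, datasets, and remaining hyperparameters are assumptions.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error
from transformers import TrainingArguments, Trainer

def compute_metrics(eval_pred):
    preds, labels = eval_pred
    preds = preds.reshape(-1)  # assumes a single regression output per example
    return {
        "qwk": cohen_kappa_score(np.rint(labels).astype(int),
                                 np.rint(preds).astype(int),
                                 weights="quadratic"),
        "mse": mean_squared_error(labels, preds),
    }

args = TrainingArguments(
    output_dir="arabert_cross_organization_task1_fold4",
    eval_strategy="steps",   # named evaluation_strategy in older transformers
    eval_steps=2,            # matches the 2-step interval in the table
    num_train_epochs=10,     # the table ends at epoch 10.0 (step 160)
    # learning rate, batch size, optimizer, etc. belong to the card's
    # hyperparameter section and are not reproduced here
)

# trainer = Trainer(model=model, args=args, train_dataset=train_ds,
#                   eval_dataset=eval_ds, compute_metrics=compute_metrics)
# trainer.train()
```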
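
Since the descriptive sections of this card are still placeholders, here is a hedged usage sketch. The repository id is an assumption pieced together from the committer's username and the model name, and the checkpoint is assumed to expose a single-value sequence-classification (regression) head; neither is stated explicitly in this card.

```python
# Hedged usage sketch. The repo id and the single-value regression head are
# assumptions; adjust both to the actual checkpoint.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "salbatarni/arabert_cross_organization_task1_fold4"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "..."  # an Arabic passage to score
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```

AraBERT checkpoints are usually paired with the `arabert` preprocessing utilities (`ArabertPreprocessor`) before tokenization; whether that step was applied during this fine-tuning run is not recorded in the card.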
|
### Framework versions