salbatarni committed
Commit 168cdb5 · verified · 1 Parent(s): aec2594

End of training

Files changed (1)
  1. README.md +87 -82
README.md CHANGED
@@ -3,20 +3,20 @@ base_model: aubmindlab/bert-base-arabertv02
 tags:
 - generated_from_trainer
 model-index:
- - name: arabert_cross_relevance_task1_fold3
   results: []
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
- # arabert_cross_relevance_task1_fold3
 
 This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.2800
- - Qwk: 0.4057
- - Mse: 0.2800
 
 ## Model description
 
@@ -45,83 +45,88 @@ The following hyperparameters were used during training:
 
 ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
- |:-------------:|:------:|:----:|:---------------:|:------:|:------:|
- | No log | 0.1333 | 2 | 0.4841 | 0.2229 | 0.4841 |
- | No log | 0.2667 | 4 | 0.4548 | 0.1951 | 0.4548 |
- | No log | 0.4 | 6 | 0.3736 | 0.2470 | 0.3736 |
- | No log | 0.5333 | 8 | 0.3640 | 0.2129 | 0.3640 |
- | No log | 0.6667 | 10 | 0.3899 | 0.1839 | 0.3899 |
- | No log | 0.8 | 12 | 0.3649 | 0.1986 | 0.3649 |
- | No log | 0.9333 | 14 | 0.3192 | 0.3366 | 0.3192 |
- | No log | 1.0667 | 16 | 0.3134 | 0.3522 | 0.3134 |
- | No log | 1.2 | 18 | 0.2989 | 0.3399 | 0.2989 |
- | No log | 1.3333 | 20 | 0.3038 | 0.3248 | 0.3038 |
- | No log | 1.4667 | 22 | 0.3097 | 0.3239 | 0.3097 |
- | No log | 1.6 | 24 | 0.2998 | 0.3258 | 0.2998 |
- | No log | 1.7333 | 26 | 0.2810 | 0.3516 | 0.2810 |
- | No log | 1.8667 | 28 | 0.2789 | 0.3522 | 0.2789 |
- | No log | 2.0 | 30 | 0.2800 | 0.3509 | 0.2800 |
- | No log | 2.1333 | 32 | 0.2906 | 0.3375 | 0.2906 |
- | No log | 2.2667 | 34 | 0.3124 | 0.2959 | 0.3124 |
- | No log | 2.4 | 36 | 0.3211 | 0.3784 | 0.3211 |
- | No log | 2.5333 | 38 | 0.3001 | 0.3428 | 0.3001 |
- | No log | 2.6667 | 40 | 0.2852 | 0.3575 | 0.2852 |
- | No log | 2.8 | 42 | 0.2865 | 0.3529 | 0.2865 |
- | No log | 2.9333 | 44 | 0.2806 | 0.3509 | 0.2806 |
- | No log | 3.0667 | 46 | 0.2874 | 0.3391 | 0.2874 |
- | No log | 3.2 | 48 | 0.3047 | 0.3615 | 0.3047 |
- | No log | 3.3333 | 50 | 0.3111 | 0.3705 | 0.3111 |
- | No log | 3.4667 | 52 | 0.3016 | 0.3428 | 0.3016 |
- | No log | 3.6 | 54 | 0.2849 | 0.3391 | 0.2849 |
- | No log | 3.7333 | 56 | 0.2726 | 0.3502 | 0.2726 |
- | No log | 3.8667 | 58 | 0.2737 | 0.3515 | 0.2737 |
- | No log | 4.0 | 60 | 0.2771 | 0.3515 | 0.2771 |
- | No log | 4.1333 | 62 | 0.2716 | 0.3509 | 0.2716 |
- | No log | 4.2667 | 64 | 0.2745 | 0.3480 | 0.2745 |
- | No log | 4.4 | 66 | 0.2815 | 0.3431 | 0.2815 |
- | No log | 4.5333 | 68 | 0.2772 | 0.3599 | 0.2772 |
- | No log | 4.6667 | 70 | 0.2685 | 0.3637 | 0.2685 |
- | No log | 4.8 | 72 | 0.2627 | 0.3536 | 0.2627 |
- | No log | 4.9333 | 74 | 0.2621 | 0.3549 | 0.2621 |
- | No log | 5.0667 | 76 | 0.2604 | 0.3640 | 0.2604 |
- | No log | 5.2 | 78 | 0.2644 | 0.3909 | 0.2644 |
- | No log | 5.3333 | 80 | 0.2834 | 0.4063 | 0.2834 |
- | No log | 5.4667 | 82 | 0.2855 | 0.3857 | 0.2855 |
- | No log | 5.6 | 84 | 0.2777 | 0.3709 | 0.2777 |
- | No log | 5.7333 | 86 | 0.2703 | 0.3543 | 0.2703 |
- | No log | 5.8667 | 88 | 0.2731 | 0.3495 | 0.2731 |
- | No log | 6.0 | 90 | 0.2756 | 0.3509 | 0.2756 |
- | No log | 6.1333 | 92 | 0.2782 | 0.3538 | 0.2782 |
- | No log | 6.2667 | 94 | 0.2856 | 0.3720 | 0.2856 |
- | No log | 6.4 | 96 | 0.2950 | 0.4113 | 0.2950 |
- | No log | 6.5333 | 98 | 0.3066 | 0.4735 | 0.3066 |
- | No log | 6.6667 | 100 | 0.2982 | 0.4566 | 0.2982 |
- | No log | 6.8 | 102 | 0.2915 | 0.4196 | 0.2915 |
- | No log | 6.9333 | 104 | 0.2806 | 0.3842 | 0.2806 |
- | No log | 7.0667 | 106 | 0.2757 | 0.3868 | 0.2757 |
- | No log | 7.2 | 108 | 0.2790 | 0.3895 | 0.2790 |
- | No log | 7.3333 | 110 | 0.2776 | 0.3868 | 0.2776 |
- | No log | 7.4667 | 112 | 0.2706 | 0.3605 | 0.2706 |
- | No log | 7.6 | 114 | 0.2683 | 0.3518 | 0.2683 |
- | No log | 7.7333 | 116 | 0.2688 | 0.3518 | 0.2688 |
- | No log | 7.8667 | 118 | 0.2706 | 0.3617 | 0.2706 |
- | No log | 8.0 | 120 | 0.2740 | 0.3712 | 0.2740 |
- | No log | 8.1333 | 122 | 0.2770 | 0.3712 | 0.2770 |
- | No log | 8.2667 | 124 | 0.2813 | 0.3756 | 0.2813 |
- | No log | 8.4 | 126 | 0.2862 | 0.3730 | 0.2862 |
- | No log | 8.5333 | 128 | 0.2926 | 0.4157 | 0.2926 |
- | No log | 8.6667 | 130 | 0.2956 | 0.4209 | 0.2956 |
- | No log | 8.8 | 132 | 0.2928 | 0.4211 | 0.2928 |
- | No log | 8.9333 | 134 | 0.2870 | 0.4052 | 0.2870 |
- | No log | 9.0667 | 136 | 0.2834 | 0.3945 | 0.2834 |
- | No log | 9.2 | 138 | 0.2828 | 0.4002 | 0.2828 |
- | No log | 9.3333 | 140 | 0.2815 | 0.3952 | 0.2815 |
- | No log | 9.4667 | 142 | 0.2814 | 0.4057 | 0.2814 |
- | No log | 9.6 | 144 | 0.2807 | 0.4057 | 0.2807 |
- | No log | 9.7333 | 146 | 0.2801 | 0.4057 | 0.2801 |
- | No log | 9.8667 | 148 | 0.2799 | 0.4057 | 0.2799 |
- | No log | 10.0 | 150 | 0.2800 | 0.4057 | 0.2800 |
 
 
 ### Framework versions
 
 tags:
 - generated_from_trainer
 model-index:
+ - name: arabert_cross_relevance_task1_fold4
   results: []
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
+ # arabert_cross_relevance_task1_fold4
 
 This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on the None dataset.
 It achieves the following results on the evaluation set:
+ - Loss: 0.2462
+ - Qwk: 0.2970
+ - Mse: 0.2462
 
 ## Model description
 
 ### Training results
 
+ | Training Loss | Epoch | Step | Validation Loss | Qwk | Mse |
+ |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
+ | No log | 0.125 | 2 | 1.1439 | 0.0152 | 1.1439 |
+ | No log | 0.25 | 4 | 0.4542 | 0.1058 | 0.4542 |
+ | No log | 0.375 | 6 | 0.3468 | 0.2290 | 0.3468 |
+ | No log | 0.5 | 8 | 0.3648 | 0.1816 | 0.3648 |
+ | No log | 0.625 | 10 | 0.4917 | 0.2054 | 0.4917 |
+ | No log | 0.75 | 12 | 0.3361 | 0.2720 | 0.3361 |
+ | No log | 0.875 | 14 | 0.2581 | 0.2606 | 0.2581 |
+ | No log | 1.0 | 16 | 0.2596 | 0.3452 | 0.2596 |
+ | No log | 1.125 | 18 | 0.2652 | 0.3776 | 0.2652 |
+ | No log | 1.25 | 20 | 0.2495 | 0.3605 | 0.2495 |
+ | No log | 1.375 | 22 | 0.2883 | 0.2801 | 0.2883 |
+ | No log | 1.5 | 24 | 0.3460 | 0.2241 | 0.3460 |
+ | No log | 1.625 | 26 | 0.3144 | 0.2483 | 0.3144 |
+ | No log | 1.75 | 28 | 0.2568 | 0.3617 | 0.2568 |
+ | No log | 1.875 | 30 | 0.2645 | 0.3732 | 0.2645 |
+ | No log | 2.0 | 32 | 0.2745 | 0.3354 | 0.2745 |
+ | No log | 2.125 | 34 | 0.2642 | 0.3416 | 0.2642 |
+ | No log | 2.25 | 36 | 0.2371 | 0.3323 | 0.2371 |
+ | No log | 2.375 | 38 | 0.2288 | 0.3278 | 0.2288 |
+ | No log | 2.5 | 40 | 0.2336 | 0.3517 | 0.2336 |
+ | No log | 2.625 | 42 | 0.2327 | 0.3836 | 0.2327 |
+ | No log | 2.75 | 44 | 0.2341 | 0.4092 | 0.2341 |
+ | No log | 2.875 | 46 | 0.2410 | 0.3449 | 0.2410 |
+ | No log | 3.0 | 48 | 0.2695 | 0.3349 | 0.2695 |
+ | No log | 3.125 | 50 | 0.2860 | 0.2593 | 0.2860 |
+ | No log | 3.25 | 52 | 0.2584 | 0.2899 | 0.2584 |
+ | No log | 3.375 | 54 | 0.2408 | 0.3216 | 0.2408 |
+ | No log | 3.5 | 56 | 0.2232 | 0.3190 | 0.2232 |
+ | No log | 3.625 | 58 | 0.2179 | 0.3172 | 0.2179 |
+ | No log | 3.75 | 60 | 0.2229 | 0.3029 | 0.2229 |
+ | No log | 3.875 | 62 | 0.2274 | 0.2855 | 0.2274 |
+ | No log | 4.0 | 64 | 0.2344 | 0.2787 | 0.2344 |
+ | No log | 4.125 | 66 | 0.2449 | 0.2616 | 0.2449 |
+ | No log | 4.25 | 68 | 0.2478 | 0.2753 | 0.2478 |
+ | No log | 4.375 | 70 | 0.2462 | 0.3017 | 0.2462 |
+ | No log | 4.5 | 72 | 0.2513 | 0.3343 | 0.2513 |
+ | No log | 4.625 | 74 | 0.2528 | 0.3607 | 0.2528 |
+ | No log | 4.75 | 76 | 0.2439 | 0.3638 | 0.2439 |
+ | No log | 4.875 | 78 | 0.2300 | 0.3536 | 0.2300 |
+ | No log | 5.0 | 80 | 0.2244 | 0.3127 | 0.2244 |
+ | No log | 5.125 | 82 | 0.2243 | 0.3034 | 0.2243 |
+ | No log | 5.25 | 84 | 0.2290 | 0.2891 | 0.2290 |
+ | No log | 5.375 | 86 | 0.2240 | 0.3203 | 0.2240 |
+ | No log | 5.5 | 88 | 0.2237 | 0.3522 | 0.2237 |
+ | No log | 5.625 | 90 | 0.2245 | 0.3408 | 0.2245 |
+ | No log | 5.75 | 92 | 0.2258 | 0.3271 | 0.2258 |
+ | No log | 5.875 | 94 | 0.2291 | 0.3021 | 0.2291 |
+ | No log | 6.0 | 96 | 0.2407 | 0.2882 | 0.2407 |
+ | No log | 6.125 | 98 | 0.2527 | 0.2783 | 0.2527 |
+ | No log | 6.25 | 100 | 0.2560 | 0.2686 | 0.2560 |
+ | No log | 6.375 | 102 | 0.2524 | 0.2882 | 0.2524 |
+ | No log | 6.5 | 104 | 0.2478 | 0.3136 | 0.2478 |
+ | No log | 6.625 | 106 | 0.2439 | 0.3199 | 0.2439 |
+ | No log | 6.75 | 108 | 0.2489 | 0.3136 | 0.2489 |
+ | No log | 6.875 | 110 | 0.2507 | 0.3033 | 0.2507 |
+ | No log | 7.0 | 112 | 0.2439 | 0.3109 | 0.2439 |
+ | No log | 7.125 | 114 | 0.2358 | 0.3238 | 0.2358 |
+ | No log | 7.25 | 116 | 0.2350 | 0.3238 | 0.2350 |
+ | No log | 7.375 | 118 | 0.2450 | 0.3069 | 0.2450 |
+ | No log | 7.5 | 120 | 0.2670 | 0.2709 | 0.2670 |
+ | No log | 7.625 | 122 | 0.2757 | 0.2529 | 0.2757 |
+ | No log | 7.75 | 124 | 0.2626 | 0.2709 | 0.2626 |
+ | No log | 7.875 | 126 | 0.2540 | 0.2774 | 0.2540 |
+ | No log | 8.0 | 128 | 0.2430 | 0.2904 | 0.2430 |
+ | No log | 8.125 | 130 | 0.2346 | 0.2878 | 0.2346 |
+ | No log | 8.25 | 132 | 0.2324 | 0.2981 | 0.2324 |
+ | No log | 8.375 | 134 | 0.2325 | 0.2981 | 0.2325 |
+ | No log | 8.5 | 136 | 0.2349 | 0.2878 | 0.2349 |
+ | No log | 8.625 | 138 | 0.2374 | 0.3005 | 0.2374 |
+ | No log | 8.75 | 140 | 0.2415 | 0.2942 | 0.2415 |
+ | No log | 8.875 | 142 | 0.2456 | 0.2942 | 0.2456 |
+ | No log | 9.0 | 144 | 0.2483 | 0.2942 | 0.2483 |
+ | No log | 9.125 | 146 | 0.2484 | 0.2942 | 0.2484 |
+ | No log | 9.25 | 148 | 0.2480 | 0.2970 | 0.2480 |
+ | No log | 9.375 | 150 | 0.2488 | 0.2970 | 0.2488 |
+ | No log | 9.5 | 152 | 0.2480 | 0.2970 | 0.2480 |
+ | No log | 9.625 | 154 | 0.2465 | 0.2970 | 0.2465 |
+ | No log | 9.75 | 156 | 0.2462 | 0.2942 | 0.2462 |
+ | No log | 9.875 | 158 | 0.2463 | 0.2970 | 0.2463 |
+ | No log | 10.0 | 160 | 0.2462 | 0.2970 | 0.2462 |
 
 
 ### Framework versions
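
Both versions of the card report validation Loss, Qwk (quadratic weighted kappa), and Mse for the relevance-scoring head. The snippet below is a minimal sketch of how the fold-4 checkpoint could be loaded and scored with those same metrics; the repo id, the single-logit regression head, the integer gold scores, and the example inputs are all assumptions for illustration and are not stated in the card itself.

```python
# Hedged sketch: load the fold-4 checkpoint and compute MSE / quadratic weighted kappa.
# Assumptions (not confirmed by the model card): the repo id below, a single-logit
# regression head (num_labels=1), and integer relevance scores as gold labels.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification
from sklearn.metrics import cohen_kappa_score, mean_squared_error

repo_id = "salbatarni/arabert_cross_relevance_task1_fold4"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

# Hypothetical evaluation data: Arabic texts with integer relevance scores.
texts = ["نص عربي للتقييم", "نص آخر للتقييم"]
gold = [1, 0]

with torch.no_grad():
    batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    preds = model(**batch).logits.squeeze(-1)  # one continuous score per text

mse = mean_squared_error(gold, preds.tolist())
# Qwk compares discrete labels, so round the continuous predictions first.
qwk = cohen_kappa_score(gold, preds.round().long().tolist(), weights="quadratic")
print(f"MSE: {mse:.4f}  QWK: {qwk:.4f}")
```

Under these assumptions, the Trainer's MSE loss would correspond to the Mse column, and rounding the regression output before the kappa mirrors how Qwk is usually reported for banded scores.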