apwic committed on
Commit
c859704
1 Parent(s): 9209565

Model save

README.md ADDED
@@ -0,0 +1,172 @@
1
+ ---
2
+ license: mit
3
+ base_model: indolem/indobert-base-uncased
4
+ tags:
5
+ - generated_from_trainer
6
+ model-index:
7
+ - name: nerui-seq_bn-2
8
+ results: []
9
+ ---
10
+
11
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
12
+ should probably proofread and complete it, then remove this comment. -->
13
+
14
+ # nerui-seq_bn-2
15
+
16
+ This model is a fine-tuned version of [indolem/indobert-base-uncased](https://huggingface.co/indolem/indobert-base-uncased) on an unknown dataset.
17
+ It achieves the following results on the evaluation set (a short harmonic-mean check of the overall F1 follows the list):
18
+ - Loss: 0.0508
19
+ - Location Precision: 0.8835
20
+ - Location Recall: 0.9785
21
+ - Location F1: 0.9286
22
+ - Location Number: 93
23
+ - Organization Precision: 0.9268
24
+ - Organization Recall: 0.9157
25
+ - Organization F1: 0.9212
26
+ - Organization Number: 166
27
+ - Person Precision: 0.9789
28
+ - Person Recall: 0.9789
29
+ - Person F1: 0.9789
30
+ - Person Number: 142
31
+ - Overall Precision: 0.9340
32
+ - Overall Recall: 0.9526
33
+ - Overall F1: 0.9432
34
+ - Overall Accuracy: 0.9877
35
+
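As a quick illustration of how these span-level scores relate, each reported F1 is the harmonic mean of the corresponding precision and recall. A minimal check for the overall numbers, with the values copied from the list above:

```python
# Illustrative sanity check: overall F1 from the reported precision/recall.
precision, recall = 0.9340, 0.9526
f1 = 2 * precision * recall / (precision + recall)
print(round(f1, 4))  # 0.9432, matching the reported Overall F1
```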
36
+ ## Model description
37
+
38
+ More information needed
39
+
40
+ ## Intended uses & limitations
41
+
42
+ More information needed
43
+
44
+ ## Training and evaluation data
45
+
46
+ More information needed
47
+
48
+ ## Training procedure
49
+
50
+ ### Training hyperparameters
51
+
52
+ The following hyperparameters were used during training (a hedged `TrainingArguments` sketch follows the list):
53
+ - learning_rate: 5e-05
54
+ - train_batch_size: 16
55
+ - eval_batch_size: 64
56
+ - seed: 42
57
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
58
+ - lr_scheduler_type: linear
59
+ - num_epochs: 100.0
60
+
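A minimal sketch of how these settings could map onto `transformers.TrainingArguments`. The `output_dir` value is an assumption taken from the model name, and the Adam betas/epsilon listed above are the library defaults, so they are not passed explicitly; this is not the original training script.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="nerui-seq_bn-2",        # assumed; taken from the model name
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
)
```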
61
+ ### Training results
62
+
63
+ | Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
64
+ |:-------------:|:-----:|:----:|:---------------:|:------------------:|:---------------:|:-----------:|:---------------:|:----------------------:|:-------------------:|:---------------:|:-------------------:|:----------------:|:-------------:|:---------:|:-------------:|:-----------------:|:--------------:|:----------:|:----------------:|
65
+ | 0.8344 | 1.0 | 96 | 0.5347 | 0.0 | 0.0 | 0.0 | 93 | 0.1667 | 0.0060 | 0.0116 | 166 | 0.0 | 0.0 | 0.0 | 142 | 0.1111 | 0.0025 | 0.0049 | 0.8348 |
66
+ | 0.4476 | 2.0 | 192 | 0.3089 | 0.3061 | 0.1613 | 0.2113 | 93 | 0.3575 | 0.4759 | 0.4083 | 166 | 0.3249 | 0.5423 | 0.4063 | 142 | 0.3373 | 0.4264 | 0.3767 | 0.9001 |
67
+ | 0.2971 | 3.0 | 288 | 0.2109 | 0.3962 | 0.4516 | 0.4221 | 93 | 0.5707 | 0.7048 | 0.6307 | 166 | 0.6630 | 0.8451 | 0.7430 | 142 | 0.5671 | 0.6958 | 0.6249 | 0.9438 |
68
+ | 0.21 | 4.0 | 384 | 0.1358 | 0.5741 | 0.6667 | 0.6169 | 93 | 0.6935 | 0.7771 | 0.7330 | 166 | 0.8701 | 0.9437 | 0.9054 | 142 | 0.7254 | 0.8105 | 0.7656 | 0.9627 |
69
+ | 0.1444 | 5.0 | 480 | 0.0974 | 0.7064 | 0.8280 | 0.7624 | 93 | 0.7545 | 0.7590 | 0.7568 | 166 | 0.9054 | 0.9437 | 0.9241 | 142 | 0.7948 | 0.8404 | 0.8170 | 0.9701 |
70
+ | 0.1206 | 6.0 | 576 | 0.0884 | 0.7456 | 0.9140 | 0.8213 | 93 | 0.7989 | 0.8373 | 0.8176 | 166 | 0.9448 | 0.9648 | 0.9547 | 142 | 0.8337 | 0.9002 | 0.8657 | 0.9739 |
71
+ | 0.1044 | 7.0 | 672 | 0.0804 | 0.7748 | 0.9247 | 0.8431 | 93 | 0.8056 | 0.8735 | 0.8382 | 166 | 0.9388 | 0.9718 | 0.9550 | 142 | 0.8425 | 0.9202 | 0.8796 | 0.9761 |
72
+ | 0.0937 | 8.0 | 768 | 0.0711 | 0.7727 | 0.9140 | 0.8374 | 93 | 0.8352 | 0.8855 | 0.8596 | 166 | 0.9384 | 0.9648 | 0.9514 | 142 | 0.8542 | 0.9202 | 0.8860 | 0.9786 |
73
+ | 0.0906 | 9.0 | 864 | 0.0635 | 0.8365 | 0.9355 | 0.8832 | 93 | 0.8295 | 0.8795 | 0.8538 | 166 | 0.9388 | 0.9718 | 0.9550 | 142 | 0.8689 | 0.9252 | 0.8961 | 0.9805 |
74
+ | 0.086 | 10.0 | 960 | 0.0629 | 0.8190 | 0.9247 | 0.8687 | 93 | 0.8232 | 0.8976 | 0.8588 | 166 | 0.9257 | 0.9648 | 0.9448 | 142 | 0.8571 | 0.9277 | 0.8910 | 0.9802 |
75
+ | 0.0759 | 11.0 | 1056 | 0.0562 | 0.8969 | 0.9355 | 0.9158 | 93 | 0.8278 | 0.8976 | 0.8613 | 166 | 0.9320 | 0.9648 | 0.9481 | 142 | 0.8797 | 0.9302 | 0.9042 | 0.9808 |
76
+ | 0.071 | 12.0 | 1152 | 0.0537 | 0.8673 | 0.9140 | 0.8901 | 93 | 0.8361 | 0.9217 | 0.8768 | 166 | 0.9384 | 0.9648 | 0.9514 | 142 | 0.8782 | 0.9352 | 0.9058 | 0.9819 |
77
+ | 0.0669 | 13.0 | 1248 | 0.0504 | 0.88 | 0.9462 | 0.9119 | 93 | 0.8765 | 0.8976 | 0.8869 | 166 | 0.9384 | 0.9648 | 0.9514 | 142 | 0.8990 | 0.9327 | 0.9155 | 0.9841 |
78
+ | 0.0639 | 14.0 | 1344 | 0.0516 | 0.8462 | 0.9462 | 0.8934 | 93 | 0.8810 | 0.8916 | 0.8862 | 166 | 0.9517 | 0.9718 | 0.9617 | 142 | 0.8969 | 0.9327 | 0.9144 | 0.9833 |
79
+ | 0.061 | 15.0 | 1440 | 0.0463 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8678 | 0.9096 | 0.8882 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9017 | 0.9377 | 0.9193 | 0.9855 |
80
+ | 0.0594 | 16.0 | 1536 | 0.0477 | 0.8627 | 0.9462 | 0.9026 | 93 | 0.8678 | 0.9096 | 0.8882 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.8998 | 0.9401 | 0.9195 | 0.9852 |
81
+ | 0.0567 | 17.0 | 1632 | 0.0467 | 0.8627 | 0.9462 | 0.9026 | 93 | 0.8571 | 0.9036 | 0.8798 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.8952 | 0.9377 | 0.9160 | 0.9844 |
82
+ | 0.0526 | 18.0 | 1728 | 0.0412 | 0.9263 | 0.9462 | 0.9362 | 93 | 0.8786 | 0.9157 | 0.8968 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9197 | 0.9426 | 0.9310 | 0.9871 |
83
+ | 0.05 | 19.0 | 1824 | 0.0427 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.8935 | 0.9096 | 0.9015 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9197 | 0.9426 | 0.9310 | 0.9868 |
84
+ | 0.0474 | 20.0 | 1920 | 0.0438 | 0.8725 | 0.9570 | 0.9128 | 93 | 0.8701 | 0.9277 | 0.8980 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9028 | 0.9501 | 0.9259 | 0.9855 |
85
+ | 0.0472 | 21.0 | 2016 | 0.0415 | 0.9 | 0.9677 | 0.9326 | 93 | 0.8844 | 0.9217 | 0.9027 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9159 | 0.9501 | 0.9327 | 0.9874 |
86
+ | 0.0426 | 22.0 | 2112 | 0.0416 | 0.89 | 0.9570 | 0.9223 | 93 | 0.8686 | 0.9157 | 0.8915 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9067 | 0.9451 | 0.9255 | 0.9868 |
87
+ | 0.0422 | 23.0 | 2208 | 0.0421 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9036 | 0.9036 | 0.9036 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9868 |
88
+ | 0.0418 | 24.0 | 2304 | 0.0450 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9096 | 0.9096 | 0.9096 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9199 | 0.9451 | 0.9323 | 0.9849 |
89
+ | 0.038 | 25.0 | 2400 | 0.0422 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.8786 | 0.9157 | 0.8968 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9089 | 0.9451 | 0.9267 | 0.9852 |
90
+ | 0.037 | 26.0 | 2496 | 0.0401 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.8686 | 0.9157 | 0.8915 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9045 | 0.9451 | 0.9244 | 0.9857 |
91
+ | 0.0346 | 27.0 | 2592 | 0.0395 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.8935 | 0.9096 | 0.9015 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9153 | 0.9426 | 0.9287 | 0.9868 |
92
+ | 0.0363 | 28.0 | 2688 | 0.0427 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.8539 | 0.9157 | 0.8837 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9021 | 0.9426 | 0.9220 | 0.9846 |
93
+ | 0.0366 | 29.0 | 2784 | 0.0419 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9223 | 0.9476 | 0.9348 | 0.9874 |
94
+ | 0.0323 | 30.0 | 2880 | 0.0411 | 0.89 | 0.9570 | 0.9223 | 93 | 0.8953 | 0.9277 | 0.9112 | 166 | 0.9722 | 0.9859 | 0.9790 | 142 | 0.9207 | 0.9551 | 0.9376 | 0.9857 |
95
+ | 0.0325 | 31.0 | 2976 | 0.0397 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.8935 | 0.9096 | 0.9015 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9155 | 0.9451 | 0.9301 | 0.9871 |
96
+ | 0.0295 | 32.0 | 3072 | 0.0414 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.8779 | 0.9096 | 0.8935 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9108 | 0.9426 | 0.9265 | 0.9857 |
97
+ | 0.0288 | 33.0 | 3168 | 0.0425 | 0.8641 | 0.9570 | 0.9082 | 93 | 0.8922 | 0.8976 | 0.8949 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9104 | 0.9377 | 0.9238 | 0.9844 |
98
+ | 0.0288 | 34.0 | 3264 | 0.0404 | 0.89 | 0.9570 | 0.9223 | 93 | 0.8793 | 0.9217 | 0.9 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9159 | 0.9501 | 0.9327 | 0.9857 |
99
+ | 0.0264 | 35.0 | 3360 | 0.0397 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.8941 | 0.9157 | 0.9048 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9221 | 0.9451 | 0.9335 | 0.9868 |
100
+ | 0.028 | 36.0 | 3456 | 0.0431 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9198 | 0.8976 | 0.9085 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9263 | 0.9401 | 0.9332 | 0.9860 |
101
+ | 0.0245 | 37.0 | 3552 | 0.0393 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.8895 | 0.9217 | 0.9053 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9177 | 0.9451 | 0.9312 | 0.9868 |
102
+ | 0.0238 | 38.0 | 3648 | 0.0424 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9270 | 0.9501 | 0.9384 | 0.9866 |
103
+ | 0.0238 | 39.0 | 3744 | 0.0411 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9221 | 0.9451 | 0.9335 | 0.9871 |
104
+ | 0.0233 | 40.0 | 3840 | 0.0407 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9 | 0.9217 | 0.9107 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9223 | 0.9476 | 0.9348 | 0.9863 |
105
+ | 0.023 | 41.0 | 3936 | 0.0403 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.8736 | 0.9157 | 0.8941 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9133 | 0.9451 | 0.9289 | 0.9863 |
106
+ | 0.0217 | 42.0 | 4032 | 0.0416 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9036 | 0.9036 | 0.9036 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9218 | 0.9401 | 0.9309 | 0.9863 |
107
+ | 0.0213 | 43.0 | 4128 | 0.0427 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9198 | 0.8976 | 0.9085 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9284 | 0.9377 | 0.9330 | 0.9855 |
108
+ | 0.0216 | 44.0 | 4224 | 0.0420 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9291 | 0.9476 | 0.9383 | 0.9871 |
109
+ | 0.02 | 45.0 | 4320 | 0.0422 | 0.9010 | 0.9785 | 0.9381 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9314 | 0.9476 | 0.9394 | 0.9863 |
110
+ | 0.0187 | 46.0 | 4416 | 0.0419 | 0.9010 | 0.9785 | 0.9381 | 93 | 0.8982 | 0.9036 | 0.9009 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9244 | 0.9451 | 0.9346 | 0.9874 |
111
+ | 0.0195 | 47.0 | 4512 | 0.0416 | 0.8788 | 0.9355 | 0.9062 | 93 | 0.8837 | 0.9157 | 0.8994 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9153 | 0.9426 | 0.9287 | 0.9868 |
112
+ | 0.0183 | 48.0 | 4608 | 0.0421 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9268 | 0.9476 | 0.9371 | 0.9871 |
113
+ | 0.0182 | 49.0 | 4704 | 0.0412 | 0.8713 | 0.9462 | 0.9072 | 93 | 0.8935 | 0.9096 | 0.9015 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9175 | 0.9426 | 0.9299 | 0.9874 |
114
+ | 0.0154 | 50.0 | 4800 | 0.0424 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9338 | 0.9501 | 0.9419 | 0.9879 |
115
+ | 0.017 | 51.0 | 4896 | 0.0436 | 0.8725 | 0.9570 | 0.9128 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9287 | 0.9426 | 0.9356 | 0.9868 |
116
+ | 0.0165 | 52.0 | 4992 | 0.0437 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9249 | 0.9526 | 0.9386 | 0.9874 |
117
+ | 0.0161 | 53.0 | 5088 | 0.0460 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9314 | 0.9476 | 0.9394 | 0.9874 |
118
+ | 0.0149 | 54.0 | 5184 | 0.0453 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9112 | 0.9277 | 0.9194 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9296 | 0.9551 | 0.9422 | 0.9877 |
119
+ | 0.0149 | 55.0 | 5280 | 0.0462 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9860 |
120
+ | 0.0155 | 56.0 | 5376 | 0.0445 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9248 | 0.9501 | 0.9373 | 0.9877 |
121
+ | 0.0151 | 57.0 | 5472 | 0.0473 | 0.8922 | 0.9785 | 0.9333 | 93 | 0.8982 | 0.9036 | 0.9009 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9246 | 0.9476 | 0.9360 | 0.9868 |
122
+ | 0.0164 | 58.0 | 5568 | 0.0492 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9272 | 0.9526 | 0.9397 | 0.9871 |
123
+ | 0.0144 | 59.0 | 5664 | 0.0464 | 0.9109 | 0.9892 | 0.9485 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9387 | 0.9551 | 0.9468 | 0.9877 |
124
+ | 0.014 | 60.0 | 5760 | 0.0499 | 0.875 | 0.9785 | 0.9239 | 93 | 0.9367 | 0.8916 | 0.9136 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9356 | 0.9426 | 0.9391 | 0.9863 |
125
+ | 0.0137 | 61.0 | 5856 | 0.0452 | 0.9020 | 0.9892 | 0.9436 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9874 |
126
+ | 0.0132 | 62.0 | 5952 | 0.0468 | 0.9020 | 0.9892 | 0.9436 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9363 | 0.9526 | 0.9444 | 0.9874 |
127
+ | 0.0125 | 63.0 | 6048 | 0.0456 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9048 | 0.9157 | 0.9102 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9249 | 0.9526 | 0.9386 | 0.9877 |
128
+ | 0.0131 | 64.0 | 6144 | 0.0464 | 0.9020 | 0.9892 | 0.9436 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9407 | 0.9501 | 0.9454 | 0.9871 |
129
+ | 0.0119 | 65.0 | 6240 | 0.0464 | 0.9020 | 0.9892 | 0.9436 | 93 | 0.9259 | 0.9036 | 0.9146 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9384 | 0.9501 | 0.9442 | 0.9874 |
130
+ | 0.0135 | 66.0 | 6336 | 0.0454 | 0.8922 | 0.9785 | 0.9333 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9337 | 0.9476 | 0.9406 | 0.9877 |
131
+ | 0.0115 | 67.0 | 6432 | 0.0471 | 0.9010 | 0.9785 | 0.9381 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9315 | 0.9501 | 0.9407 | 0.9874 |
132
+ | 0.0124 | 68.0 | 6528 | 0.0468 | 0.8922 | 0.9785 | 0.9333 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9338 | 0.9501 | 0.9419 | 0.9871 |
133
+ | 0.0124 | 69.0 | 6624 | 0.0464 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9251 | 0.9551 | 0.9399 | 0.9877 |
134
+ | 0.0107 | 70.0 | 6720 | 0.0470 | 0.8932 | 0.9892 | 0.9388 | 93 | 0.9325 | 0.9157 | 0.9240 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9364 | 0.9551 | 0.9457 | 0.9879 |
135
+ | 0.011 | 71.0 | 6816 | 0.0476 | 0.8922 | 0.9785 | 0.9333 | 93 | 0.9226 | 0.9337 | 0.9281 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9322 | 0.9601 | 0.9459 | 0.9885 |
136
+ | 0.0102 | 72.0 | 6912 | 0.0485 | 0.8932 | 0.9892 | 0.9388 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9363 | 0.9526 | 0.9444 | 0.9877 |
137
+ | 0.0096 | 73.0 | 7008 | 0.0485 | 0.8932 | 0.9892 | 0.9388 | 93 | 0.9383 | 0.9157 | 0.9268 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9410 | 0.9551 | 0.9480 | 0.9877 |
138
+ | 0.0107 | 74.0 | 7104 | 0.0472 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9379 | 0.9096 | 0.9235 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9384 | 0.9501 | 0.9442 | 0.9874 |
139
+ | 0.0119 | 75.0 | 7200 | 0.0504 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9358 | 0.9451 | 0.9404 | 0.9871 |
140
+ | 0.0105 | 76.0 | 7296 | 0.0488 | 0.875 | 0.9785 | 0.9239 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9315 | 0.9501 | 0.9407 | 0.9874 |
141
+ | 0.0102 | 77.0 | 7392 | 0.0504 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9273 | 0.9217 | 0.9245 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9341 | 0.9551 | 0.9445 | 0.9877 |
142
+ | 0.0098 | 78.0 | 7488 | 0.0484 | 0.9020 | 0.9892 | 0.9436 | 93 | 0.9321 | 0.9096 | 0.9207 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9409 | 0.9526 | 0.9467 | 0.9885 |
143
+ | 0.0089 | 79.0 | 7584 | 0.0468 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9207 | 0.9096 | 0.9152 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9358 | 0.9451 | 0.9404 | 0.9882 |
144
+ | 0.0113 | 80.0 | 7680 | 0.0493 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9291 | 0.9476 | 0.9383 | 0.9877 |
145
+ | 0.0093 | 81.0 | 7776 | 0.0489 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9879 |
146
+ | 0.0094 | 82.0 | 7872 | 0.0519 | 0.8654 | 0.9677 | 0.9137 | 93 | 0.9136 | 0.8916 | 0.9024 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9240 | 0.9401 | 0.9320 | 0.9866 |
147
+ | 0.0104 | 83.0 | 7968 | 0.0508 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9379 | 0.9096 | 0.9235 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9360 | 0.9476 | 0.9418 | 0.9871 |
148
+ | 0.0098 | 84.0 | 8064 | 0.0511 | 0.875 | 0.9785 | 0.9239 | 93 | 0.9317 | 0.9036 | 0.9174 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9337 | 0.9476 | 0.9406 | 0.9877 |
149
+ | 0.0092 | 85.0 | 8160 | 0.0493 | 0.9020 | 0.9892 | 0.9436 | 93 | 0.9325 | 0.9157 | 0.9240 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9410 | 0.9551 | 0.9480 | 0.9888 |
150
+ | 0.0099 | 86.0 | 8256 | 0.0502 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9383 | 0.9157 | 0.9268 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9384 | 0.9501 | 0.9442 | 0.9879 |
151
+ | 0.0092 | 87.0 | 8352 | 0.0496 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9321 | 0.9096 | 0.9207 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9337 | 0.9476 | 0.9406 | 0.9874 |
152
+ | 0.0091 | 88.0 | 8448 | 0.0490 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9321 | 0.9096 | 0.9207 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9361 | 0.9501 | 0.9431 | 0.9877 |
153
+ | 0.0091 | 89.0 | 8544 | 0.0497 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9338 | 0.9501 | 0.9419 | 0.9874 |
154
+ | 0.0093 | 90.0 | 8640 | 0.0487 | 0.9020 | 0.9892 | 0.9436 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9386 | 0.9526 | 0.9455 | 0.9879 |
155
+ | 0.0096 | 91.0 | 8736 | 0.0500 | 0.8922 | 0.9785 | 0.9333 | 93 | 0.9321 | 0.9096 | 0.9207 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9384 | 0.9501 | 0.9442 | 0.9879 |
156
+ | 0.0088 | 92.0 | 8832 | 0.0506 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9259 | 0.9036 | 0.9146 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9874 |
157
+ | 0.0088 | 93.0 | 8928 | 0.0510 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9338 | 0.9501 | 0.9419 | 0.9874 |
158
+ | 0.0089 | 94.0 | 9024 | 0.0513 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9871 |
159
+ | 0.0087 | 95.0 | 9120 | 0.0509 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9259 | 0.9036 | 0.9146 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9874 |
160
+ | 0.0082 | 96.0 | 9216 | 0.0506 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9874 |
161
+ | 0.0093 | 97.0 | 9312 | 0.0512 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9877 |
162
+ | 0.0083 | 98.0 | 9408 | 0.0509 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9877 |
163
+ | 0.0075 | 99.0 | 9504 | 0.0507 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9877 |
164
+ | 0.0094 | 100.0 | 9600 | 0.0508 | 0.8835 | 0.9785 | 0.9286 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9877 |
165
+
166
+
167
+ ### Framework versions
168
+
169
+ - Transformers 4.39.3
170
+ - Pytorch 2.3.0+cu121
171
+ - Datasets 2.19.1
172
+ - Tokenizers 0.15.2
adapter-ner/adapter_config.json ADDED
@@ -0,0 +1,42 @@
1
+ {
2
+ "config": {
3
+ "adapter_residual_before_ln": false,
4
+ "cross_adapter": false,
5
+ "dropout": 0.0,
6
+ "factorized_phm_W": true,
7
+ "factorized_phm_rule": false,
8
+ "hypercomplex_nonlinearity": "glorot-uniform",
9
+ "init_weights": "bert",
10
+ "inv_adapter": null,
11
+ "inv_adapter_reduction_factor": null,
12
+ "is_parallel": false,
13
+ "learn_phm": true,
14
+ "leave_out": [],
15
+ "ln_after": false,
16
+ "ln_before": false,
17
+ "mh_adapter": false,
18
+ "non_linearity": "relu",
19
+ "original_ln_after": true,
20
+ "original_ln_before": true,
21
+ "output_adapter": true,
22
+ "phm_bias": true,
23
+ "phm_c_init": "normal",
24
+ "phm_dim": 4,
25
+ "phm_init_range": 0.0001,
26
+ "phm_layer": false,
27
+ "phm_rank": 1,
28
+ "reduction_factor": 16,
29
+ "residual_before_ln": true,
30
+ "scaling": 1.0,
31
+ "shared_W_phm": false,
32
+ "shared_phm_rule": true,
33
+ "use_gating": false
34
+ },
35
+ "config_id": "9076f36a74755ac4",
36
+ "hidden_size": 768,
37
+ "model_class": "BertForTokenClassification",
38
+ "model_name": "indolem/indobert-base-uncased",
39
+ "model_type": "bert",
40
+ "name": "adapter-ner",
41
+ "version": "0.2.0"
42
+ }
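For context, the fields above describe a sequential bottleneck (`seq_bn`) adapter: a single down-/up-projection inserted after the feed-forward block, with a reduction factor of 16, ReLU non-linearity, and BERT-style weight initialization. A minimal sketch of declaring an equivalent adapter with the Hugging Face `adapters` library (v0.2.x, matching the `version` field); this is illustrative, not the code that produced this config:

```python
from adapters import AutoAdapterModel, SeqBnConfig

# Illustrative only: a bottleneck adapter mirroring the key settings above.
model = AutoAdapterModel.from_pretrained("indolem/indobert-base-uncased")
config = SeqBnConfig(
    reduction_factor=16,
    non_linearity="relu",
    init_weights="bert",
)
model.add_adapter("adapter-ner", config=config)
```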
adapter-ner/head_config.json ADDED
@@ -0,0 +1,19 @@
1
+ {
2
+ "config": null,
3
+ "hidden_size": 768,
4
+ "label2id": {
5
+ "B-LOCATION": 0,
6
+ "B-ORGANIZATION": 1,
7
+ "B-PERSON": 2,
8
+ "I-LOCATION": 3,
9
+ "I-ORGANIZATION": 4,
10
+ "I-PERSON": 5,
11
+ "O": 6
12
+ },
13
+ "model_class": "BertForTokenClassification",
14
+ "model_name": "indolem/indobert-base-uncased",
15
+ "model_type": "bert",
16
+ "name": null,
17
+ "num_labels": 7,
18
+ "version": "0.2.0"
19
+ }
adapter-ner/pytorch_adapter.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:98b2ef7f4e80dff735199b5771d9a3f2f21c539b388873e59347ab03e64e190a
3
+ size 3595750
adapter-ner/pytorch_model_head.bin ADDED
@@ -0,0 +1,3 @@
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:6f52969ae5ebc0fb4c0ba5b1736db82d16477f711983acc0429e4bb418cda7c4
3
+ size 23066
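Taken together, the `adapter-ner/*` files above contain the adapter configuration, the bottleneck weights, and the token-classification head. Below is a minimal, hypothetical loading sketch, assuming the `adapter-ner/` directory has been downloaded next to the script and the `adapters` library is installed; the local path, the example sentence, and the exact head-loading behaviour are assumptions, not an official snippet from this repository.

```python
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer
import adapters

tokenizer = AutoTokenizer.from_pretrained("indolem/indobert-base-uncased")
model = AutoModelForTokenClassification.from_pretrained(
    "indolem/indobert-base-uncased", num_labels=7
)
adapters.init(model)  # retrofit adapter support onto the vanilla model

# Assumed local path; load_adapter reads adapter_config.json and the weights.
adapter_name = model.load_adapter("./adapter-ner")
model.set_active_adapters(adapter_name)

# Label mapping copied from head_config.json above.
id2label = {0: "B-LOCATION", 1: "B-ORGANIZATION", 2: "B-PERSON",
            3: "I-LOCATION", 4: "I-ORGANIZATION", 5: "I-PERSON", 6: "O"}

inputs = tokenizer("Joko Widodo lahir di Surakarta.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
predictions = logits.argmax(dim=-1)[0].tolist()
print([id2label[p] for p in predictions])  # one tag per wordpiece, incl. special tokens
```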
runs/Jun09_14-20-51_a358b85c7679/events.out.tfevents.1717942859.a358b85c7679.55968.0 CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:b02481fbf36b422507ffa3cfd1a0b62e3fdc50888ff48b281d775ecf468a2f8c
3
- size 146341
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:0896f1f6dcef563b19168dd355abb3ff26b37835b914e3c48614e20253e50025
3
+ size 148123