stefan-it committed on
Commit
46ed4f6
1 Parent(s): de927b2

Upload ./training.log with huggingface_hub

Files changed (1)
  1. training.log +508 -0
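
The commit message says the log was uploaded with huggingface_hub. For reference, a minimal sketch of such an upload; the repo id below is an assumption (it is not stated in the commit), while upload_file() itself is the standard huggingface_hub API:

    # Hypothetical reconstruction of the upload named in the commit message.
    from huggingface_hub import HfApi

    api = HfApi()
    api.upload_file(
        path_or_fileobj="./training.log",      # log file produced by the run
        path_in_repo="training.log",           # destination path in the repo
        repo_id="stefan-it/hmbench-ajmc-fr",   # assumed repo id
        repo_type="model",
        commit_message="Upload ./training.log with huggingface_hub",
    )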
training.log ADDED
@@ -0,0 +1,508 @@
+ 2023-10-23 19:54:00,819 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:54:00,820 Model: "SequenceTagger(
+   (embeddings): TransformerWordEmbeddings(
+     (model): BertModel(
+       (embeddings): BertEmbeddings(
+         (word_embeddings): Embedding(64001, 768)
+         (position_embeddings): Embedding(512, 768)
+         (token_type_embeddings): Embedding(2, 768)
+         (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+         (dropout): Dropout(p=0.1, inplace=False)
+       )
+       (encoder): BertEncoder(
+         (layer): ModuleList(
+           (0): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (1): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (2): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (3): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (4): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (5): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (6): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (7): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (8): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (9): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (10): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+           (11): BertLayer(
+             (attention): BertAttention(
+               (self): BertSelfAttention(
+                 (query): Linear(in_features=768, out_features=768, bias=True)
+                 (key): Linear(in_features=768, out_features=768, bias=True)
+                 (value): Linear(in_features=768, out_features=768, bias=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+               (output): BertSelfOutput(
+                 (dense): Linear(in_features=768, out_features=768, bias=True)
+                 (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+                 (dropout): Dropout(p=0.1, inplace=False)
+               )
+             )
+             (intermediate): BertIntermediate(
+               (dense): Linear(in_features=768, out_features=3072, bias=True)
+               (intermediate_act_fn): GELUActivation()
+             )
+             (output): BertOutput(
+               (dense): Linear(in_features=3072, out_features=768, bias=True)
+               (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
+               (dropout): Dropout(p=0.1, inplace=False)
+             )
+           )
+         )
+       )
+       (pooler): BertPooler(
+         (dense): Linear(in_features=768, out_features=768, bias=True)
+         (activation): Tanh()
+       )
+     )
+   )
+   (locked_dropout): LockedDropout(p=0.5)
+   (linear): Linear(in_features=768, out_features=25, bias=True)
+   (loss_function): CrossEntropyLoss()
+ )"
+ 2023-10-23 19:54:00,820 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:54:00,820 MultiCorpus: 966 train + 219 dev + 204 test sentences
+  - NER_HIPE_2022 Corpus: 966 train + 219 dev + 204 test sentences - /home/ubuntu/.flair/datasets/ner_hipe_2022/v2.1/ajmc/fr/with_doc_seperator
+ 2023-10-23 19:54:00,820 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:54:00,820 Train: 966 sentences
+ 2023-10-23 19:54:00,820 (train_with_dev=False, train_with_test=False)
+ 2023-10-23 19:54:00,821 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:54:00,821 Training Params:
+ 2023-10-23 19:54:00,821 - learning_rate: "5e-05"
+ 2023-10-23 19:54:00,821 - mini_batch_size: "4"
+ 2023-10-23 19:54:00,821 - max_epochs: "10"
+ 2023-10-23 19:54:00,821 - shuffle: "True"
+ 2023-10-23 19:54:00,821 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:54:00,821 Plugins:
+ 2023-10-23 19:54:00,821 - TensorboardLogger
+ 2023-10-23 19:54:00,821 - LinearScheduler | warmup_fraction: '0.1'
+ 2023-10-23 19:54:00,821 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:54:00,821 Final evaluation on model from best epoch (best-model.pt)
+ 2023-10-23 19:54:00,821 - metric: "('micro avg', 'f1-score')"
+ 2023-10-23 19:54:00,821 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:54:00,821 Computation:
+ 2023-10-23 19:54:00,821 - compute on device: cuda:0
+ 2023-10-23 19:54:00,821 - embedding storage: none
+ 2023-10-23 19:54:00,821 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:54:00,821 Model training base path: "hmbench-ajmc/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5"
+ 2023-10-23 19:54:00,821 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:54:00,821 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:54:00,821 Logging anything other than scalars to TensorBoard is currently not supported.
+ 2023-10-23 19:54:02,281 epoch 1 - iter 24/242 - loss 3.02908177 - time (sec): 1.46 - samples/sec: 1576.51 - lr: 0.000005 - momentum: 0.000000
+ 2023-10-23 19:54:03,837 epoch 1 - iter 48/242 - loss 2.01798402 - time (sec): 3.02 - samples/sec: 1707.17 - lr: 0.000010 - momentum: 0.000000
+ 2023-10-23 19:54:05,327 epoch 1 - iter 72/242 - loss 1.55185778 - time (sec): 4.51 - samples/sec: 1660.01 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-23 19:54:06,844 epoch 1 - iter 96/242 - loss 1.27186543 - time (sec): 6.02 - samples/sec: 1651.27 - lr: 0.000020 - momentum: 0.000000
+ 2023-10-23 19:54:08,397 epoch 1 - iter 120/242 - loss 1.10527967 - time (sec): 7.58 - samples/sec: 1640.93 - lr: 0.000025 - momentum: 0.000000
+ 2023-10-23 19:54:09,937 epoch 1 - iter 144/242 - loss 0.99645057 - time (sec): 9.12 - samples/sec: 1618.66 - lr: 0.000030 - momentum: 0.000000
+ 2023-10-23 19:54:11,465 epoch 1 - iter 168/242 - loss 0.89381082 - time (sec): 10.64 - samples/sec: 1623.01 - lr: 0.000035 - momentum: 0.000000
+ 2023-10-23 19:54:12,985 epoch 1 - iter 192/242 - loss 0.81345984 - time (sec): 12.16 - samples/sec: 1621.73 - lr: 0.000039 - momentum: 0.000000
+ 2023-10-23 19:54:14,550 epoch 1 - iter 216/242 - loss 0.74715069 - time (sec): 13.73 - samples/sec: 1614.81 - lr: 0.000044 - momentum: 0.000000
+ 2023-10-23 19:54:16,032 epoch 1 - iter 240/242 - loss 0.69305886 - time (sec): 15.21 - samples/sec: 1609.87 - lr: 0.000049 - momentum: 0.000000
+ 2023-10-23 19:54:16,153 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:54:16,153 EPOCH 1 done: loss 0.6874 - lr: 0.000049
+ 2023-10-23 19:54:16,774 DEV : loss 0.1657165139913559 - f1-score (micro avg) 0.6499
+ 2023-10-23 19:54:16,777 saving best model
+ 2023-10-23 19:54:17,252 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:54:18,776 epoch 2 - iter 24/242 - loss 0.18982989 - time (sec): 1.52 - samples/sec: 1642.03 - lr: 0.000049 - momentum: 0.000000
+ 2023-10-23 19:54:20,312 epoch 2 - iter 48/242 - loss 0.16868344 - time (sec): 3.06 - samples/sec: 1615.20 - lr: 0.000049 - momentum: 0.000000
+ 2023-10-23 19:54:21,820 epoch 2 - iter 72/242 - loss 0.16378782 - time (sec): 4.57 - samples/sec: 1640.75 - lr: 0.000048 - momentum: 0.000000
+ 2023-10-23 19:54:23,321 epoch 2 - iter 96/242 - loss 0.16854221 - time (sec): 6.07 - samples/sec: 1629.22 - lr: 0.000048 - momentum: 0.000000
+ 2023-10-23 19:54:24,864 epoch 2 - iter 120/242 - loss 0.15458604 - time (sec): 7.61 - samples/sec: 1620.40 - lr: 0.000047 - momentum: 0.000000
+ 2023-10-23 19:54:26,386 epoch 2 - iter 144/242 - loss 0.14810813 - time (sec): 9.13 - samples/sec: 1610.74 - lr: 0.000047 - momentum: 0.000000
+ 2023-10-23 19:54:27,917 epoch 2 - iter 168/242 - loss 0.15025063 - time (sec): 10.66 - samples/sec: 1618.47 - lr: 0.000046 - momentum: 0.000000
+ 2023-10-23 19:54:29,451 epoch 2 - iter 192/242 - loss 0.15010035 - time (sec): 12.20 - samples/sec: 1614.30 - lr: 0.000046 - momentum: 0.000000
+ 2023-10-23 19:54:30,942 epoch 2 - iter 216/242 - loss 0.14614517 - time (sec): 13.69 - samples/sec: 1614.79 - lr: 0.000045 - momentum: 0.000000
+ 2023-10-23 19:54:32,442 epoch 2 - iter 240/242 - loss 0.14967479 - time (sec): 15.19 - samples/sec: 1615.52 - lr: 0.000045 - momentum: 0.000000
+ 2023-10-23 19:54:32,569 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:54:32,569 EPOCH 2 done: loss 0.1501 - lr: 0.000045
+ 2023-10-23 19:54:33,262 DEV : loss 0.11106210947036743 - f1-score (micro avg) 0.7985
+ 2023-10-23 19:54:33,265 saving best model
+ 2023-10-23 19:54:33,862 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:54:35,340 epoch 3 - iter 24/242 - loss 0.07707799 - time (sec): 1.48 - samples/sec: 1548.37 - lr: 0.000044 - momentum: 0.000000
+ 2023-10-23 19:54:36,872 epoch 3 - iter 48/242 - loss 0.09733679 - time (sec): 3.01 - samples/sec: 1521.98 - lr: 0.000043 - momentum: 0.000000
+ 2023-10-23 19:54:38,464 epoch 3 - iter 72/242 - loss 0.09180546 - time (sec): 4.60 - samples/sec: 1582.02 - lr: 0.000043 - momentum: 0.000000
+ 2023-10-23 19:54:39,994 epoch 3 - iter 96/242 - loss 0.08892388 - time (sec): 6.13 - samples/sec: 1570.15 - lr: 0.000042 - momentum: 0.000000
+ 2023-10-23 19:54:41,515 epoch 3 - iter 120/242 - loss 0.09754773 - time (sec): 7.65 - samples/sec: 1610.29 - lr: 0.000042 - momentum: 0.000000
+ 2023-10-23 19:54:43,025 epoch 3 - iter 144/242 - loss 0.09629438 - time (sec): 9.16 - samples/sec: 1588.40 - lr: 0.000041 - momentum: 0.000000
+ 2023-10-23 19:54:44,566 epoch 3 - iter 168/242 - loss 0.09584241 - time (sec): 10.70 - samples/sec: 1608.85 - lr: 0.000041 - momentum: 0.000000
+ 2023-10-23 19:54:46,056 epoch 3 - iter 192/242 - loss 0.09380987 - time (sec): 12.19 - samples/sec: 1601.31 - lr: 0.000040 - momentum: 0.000000
+ 2023-10-23 19:54:47,580 epoch 3 - iter 216/242 - loss 0.09142753 - time (sec): 13.72 - samples/sec: 1596.57 - lr: 0.000040 - momentum: 0.000000
+ 2023-10-23 19:54:49,116 epoch 3 - iter 240/242 - loss 0.09182178 - time (sec): 15.25 - samples/sec: 1610.18 - lr: 0.000039 - momentum: 0.000000
+ 2023-10-23 19:54:49,240 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:54:49,240 EPOCH 3 done: loss 0.0913 - lr: 0.000039
+ 2023-10-23 19:54:49,930 DEV : loss 0.1296485811471939 - f1-score (micro avg) 0.8445
+ 2023-10-23 19:54:49,934 saving best model
+ 2023-10-23 19:54:50,608 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:54:52,082 epoch 4 - iter 24/242 - loss 0.04930126 - time (sec): 1.47 - samples/sec: 1585.43 - lr: 0.000038 - momentum: 0.000000
+ 2023-10-23 19:54:53,635 epoch 4 - iter 48/242 - loss 0.06258024 - time (sec): 3.03 - samples/sec: 1609.31 - lr: 0.000038 - momentum: 0.000000
+ 2023-10-23 19:54:55,129 epoch 4 - iter 72/242 - loss 0.05973513 - time (sec): 4.52 - samples/sec: 1589.01 - lr: 0.000037 - momentum: 0.000000
+ 2023-10-23 19:54:56,767 epoch 4 - iter 96/242 - loss 0.06641017 - time (sec): 6.16 - samples/sec: 1553.52 - lr: 0.000037 - momentum: 0.000000
+ 2023-10-23 19:54:58,286 epoch 4 - iter 120/242 - loss 0.06616114 - time (sec): 7.68 - samples/sec: 1564.09 - lr: 0.000036 - momentum: 0.000000
+ 2023-10-23 19:54:59,778 epoch 4 - iter 144/242 - loss 0.06088754 - time (sec): 9.17 - samples/sec: 1542.98 - lr: 0.000036 - momentum: 0.000000
+ 2023-10-23 19:55:01,289 epoch 4 - iter 168/242 - loss 0.05839909 - time (sec): 10.68 - samples/sec: 1538.46 - lr: 0.000035 - momentum: 0.000000
+ 2023-10-23 19:55:02,865 epoch 4 - iter 192/242 - loss 0.06217878 - time (sec): 12.26 - samples/sec: 1572.94 - lr: 0.000035 - momentum: 0.000000
+ 2023-10-23 19:55:04,432 epoch 4 - iter 216/242 - loss 0.06618679 - time (sec): 13.82 - samples/sec: 1592.46 - lr: 0.000034 - momentum: 0.000000
+ 2023-10-23 19:55:05,976 epoch 4 - iter 240/242 - loss 0.06638804 - time (sec): 15.37 - samples/sec: 1599.35 - lr: 0.000033 - momentum: 0.000000
+ 2023-10-23 19:55:06,100 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:55:06,100 EPOCH 4 done: loss 0.0663 - lr: 0.000033
+ 2023-10-23 19:55:06,793 DEV : loss 0.15876781940460205 - f1-score (micro avg) 0.8224
+ 2023-10-23 19:55:06,797 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:55:08,308 epoch 5 - iter 24/242 - loss 0.06022989 - time (sec): 1.51 - samples/sec: 1664.44 - lr: 0.000033 - momentum: 0.000000
+ 2023-10-23 19:55:09,836 epoch 5 - iter 48/242 - loss 0.03978630 - time (sec): 3.04 - samples/sec: 1652.27 - lr: 0.000032 - momentum: 0.000000
+ 2023-10-23 19:55:11,382 epoch 5 - iter 72/242 - loss 0.04483749 - time (sec): 4.59 - samples/sec: 1647.08 - lr: 0.000032 - momentum: 0.000000
+ 2023-10-23 19:55:12,870 epoch 5 - iter 96/242 - loss 0.04800811 - time (sec): 6.07 - samples/sec: 1644.63 - lr: 0.000031 - momentum: 0.000000
+ 2023-10-23 19:55:14,402 epoch 5 - iter 120/242 - loss 0.05006269 - time (sec): 7.61 - samples/sec: 1659.16 - lr: 0.000031 - momentum: 0.000000
+ 2023-10-23 19:55:15,876 epoch 5 - iter 144/242 - loss 0.05146417 - time (sec): 9.08 - samples/sec: 1642.85 - lr: 0.000030 - momentum: 0.000000
+ 2023-10-23 19:55:17,402 epoch 5 - iter 168/242 - loss 0.05069314 - time (sec): 10.60 - samples/sec: 1634.40 - lr: 0.000030 - momentum: 0.000000
+ 2023-10-23 19:55:18,898 epoch 5 - iter 192/242 - loss 0.04770876 - time (sec): 12.10 - samples/sec: 1640.86 - lr: 0.000029 - momentum: 0.000000
+ 2023-10-23 19:55:20,461 epoch 5 - iter 216/242 - loss 0.05091984 - time (sec): 13.66 - samples/sec: 1644.77 - lr: 0.000028 - momentum: 0.000000
+ 2023-10-23 19:55:21,993 epoch 5 - iter 240/242 - loss 0.05083321 - time (sec): 15.20 - samples/sec: 1623.55 - lr: 0.000028 - momentum: 0.000000
+ 2023-10-23 19:55:22,102 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:55:22,102 EPOCH 5 done: loss 0.0507 - lr: 0.000028
+ 2023-10-23 19:55:22,797 DEV : loss 0.1611303985118866 - f1-score (micro avg) 0.8348
+ 2023-10-23 19:55:22,801 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:55:24,316 epoch 6 - iter 24/242 - loss 0.02656180 - time (sec): 1.51 - samples/sec: 1737.41 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-23 19:55:25,864 epoch 6 - iter 48/242 - loss 0.02741681 - time (sec): 3.06 - samples/sec: 1626.39 - lr: 0.000027 - momentum: 0.000000
+ 2023-10-23 19:55:27,418 epoch 6 - iter 72/242 - loss 0.02911125 - time (sec): 4.62 - samples/sec: 1645.52 - lr: 0.000026 - momentum: 0.000000
+ 2023-10-23 19:55:28,925 epoch 6 - iter 96/242 - loss 0.03691839 - time (sec): 6.12 - samples/sec: 1658.26 - lr: 0.000026 - momentum: 0.000000
+ 2023-10-23 19:55:30,444 epoch 6 - iter 120/242 - loss 0.03816996 - time (sec): 7.64 - samples/sec: 1626.27 - lr: 0.000025 - momentum: 0.000000
+ 2023-10-23 19:55:31,981 epoch 6 - iter 144/242 - loss 0.03656373 - time (sec): 9.18 - samples/sec: 1609.95 - lr: 0.000025 - momentum: 0.000000
+ 2023-10-23 19:55:33,481 epoch 6 - iter 168/242 - loss 0.03486906 - time (sec): 10.68 - samples/sec: 1613.60 - lr: 0.000024 - momentum: 0.000000
+ 2023-10-23 19:55:34,990 epoch 6 - iter 192/242 - loss 0.03680509 - time (sec): 12.19 - samples/sec: 1606.30 - lr: 0.000023 - momentum: 0.000000
+ 2023-10-23 19:55:36,521 epoch 6 - iter 216/242 - loss 0.03411852 - time (sec): 13.72 - samples/sec: 1601.19 - lr: 0.000023 - momentum: 0.000000
+ 2023-10-23 19:55:38,025 epoch 6 - iter 240/242 - loss 0.03417563 - time (sec): 15.22 - samples/sec: 1613.46 - lr: 0.000022 - momentum: 0.000000
+ 2023-10-23 19:55:38,148 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:55:38,148 EPOCH 6 done: loss 0.0339 - lr: 0.000022
+ 2023-10-23 19:55:38,846 DEV : loss 0.18326157331466675 - f1-score (micro avg) 0.8343
+ 2023-10-23 19:55:38,850 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:55:40,333 epoch 7 - iter 24/242 - loss 0.02051287 - time (sec): 1.48 - samples/sec: 1610.34 - lr: 0.000022 - momentum: 0.000000
+ 2023-10-23 19:55:41,826 epoch 7 - iter 48/242 - loss 0.02832778 - time (sec): 2.98 - samples/sec: 1546.30 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-23 19:55:43,364 epoch 7 - iter 72/242 - loss 0.02435630 - time (sec): 4.51 - samples/sec: 1541.24 - lr: 0.000021 - momentum: 0.000000
+ 2023-10-23 19:55:44,863 epoch 7 - iter 96/242 - loss 0.02003073 - time (sec): 6.01 - samples/sec: 1540.81 - lr: 0.000020 - momentum: 0.000000
+ 2023-10-23 19:55:46,337 epoch 7 - iter 120/242 - loss 0.02455821 - time (sec): 7.49 - samples/sec: 1542.93 - lr: 0.000020 - momentum: 0.000000
+ 2023-10-23 19:55:47,901 epoch 7 - iter 144/242 - loss 0.02202070 - time (sec): 9.05 - samples/sec: 1576.40 - lr: 0.000019 - momentum: 0.000000
+ 2023-10-23 19:55:49,479 epoch 7 - iter 168/242 - loss 0.02747623 - time (sec): 10.63 - samples/sec: 1600.40 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-23 19:55:50,984 epoch 7 - iter 192/242 - loss 0.02707506 - time (sec): 12.13 - samples/sec: 1601.11 - lr: 0.000018 - momentum: 0.000000
+ 2023-10-23 19:55:52,548 epoch 7 - iter 216/242 - loss 0.02899112 - time (sec): 13.70 - samples/sec: 1611.34 - lr: 0.000017 - momentum: 0.000000
+ 2023-10-23 19:55:54,078 epoch 7 - iter 240/242 - loss 0.02860385 - time (sec): 15.23 - samples/sec: 1617.70 - lr: 0.000017 - momentum: 0.000000
+ 2023-10-23 19:55:54,194 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:55:54,194 EPOCH 7 done: loss 0.0298 - lr: 0.000017
+ 2023-10-23 19:55:54,888 DEV : loss 0.20003901422023773 - f1-score (micro avg) 0.8501
+ 2023-10-23 19:55:54,892 saving best model
+ 2023-10-23 19:55:55,556 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:55:57,088 epoch 8 - iter 24/242 - loss 0.01561098 - time (sec): 1.53 - samples/sec: 1623.00 - lr: 0.000016 - momentum: 0.000000
+ 2023-10-23 19:55:58,567 epoch 8 - iter 48/242 - loss 0.02032853 - time (sec): 3.01 - samples/sec: 1623.61 - lr: 0.000016 - momentum: 0.000000
+ 2023-10-23 19:56:00,093 epoch 8 - iter 72/242 - loss 0.01578867 - time (sec): 4.54 - samples/sec: 1682.93 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-23 19:56:01,621 epoch 8 - iter 96/242 - loss 0.01385398 - time (sec): 6.06 - samples/sec: 1664.02 - lr: 0.000015 - momentum: 0.000000
+ 2023-10-23 19:56:03,094 epoch 8 - iter 120/242 - loss 0.01185117 - time (sec): 7.54 - samples/sec: 1640.94 - lr: 0.000014 - momentum: 0.000000
+ 2023-10-23 19:56:04,602 epoch 8 - iter 144/242 - loss 0.01327037 - time (sec): 9.04 - samples/sec: 1624.26 - lr: 0.000013 - momentum: 0.000000
+ 2023-10-23 19:56:06,155 epoch 8 - iter 168/242 - loss 0.01408385 - time (sec): 10.60 - samples/sec: 1620.78 - lr: 0.000013 - momentum: 0.000000
+ 2023-10-23 19:56:07,716 epoch 8 - iter 192/242 - loss 0.01512167 - time (sec): 12.16 - samples/sec: 1635.44 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-23 19:56:09,228 epoch 8 - iter 216/242 - loss 0.01548915 - time (sec): 13.67 - samples/sec: 1630.38 - lr: 0.000012 - momentum: 0.000000
+ 2023-10-23 19:56:10,757 epoch 8 - iter 240/242 - loss 0.01551029 - time (sec): 15.20 - samples/sec: 1621.75 - lr: 0.000011 - momentum: 0.000000
+ 2023-10-23 19:56:10,870 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:56:10,870 EPOCH 8 done: loss 0.0156 - lr: 0.000011
+ 2023-10-23 19:56:11,563 DEV : loss 0.2106192260980606 - f1-score (micro avg) 0.8398
+ 2023-10-23 19:56:11,567 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:56:13,112 epoch 9 - iter 24/242 - loss 0.00846013 - time (sec): 1.54 - samples/sec: 1685.13 - lr: 0.000011 - momentum: 0.000000
+ 2023-10-23 19:56:14,632 epoch 9 - iter 48/242 - loss 0.00695458 - time (sec): 3.06 - samples/sec: 1664.04 - lr: 0.000010 - momentum: 0.000000
+ 2023-10-23 19:56:16,161 epoch 9 - iter 72/242 - loss 0.01033016 - time (sec): 4.59 - samples/sec: 1639.21 - lr: 0.000010 - momentum: 0.000000
+ 2023-10-23 19:56:17,662 epoch 9 - iter 96/242 - loss 0.00825252 - time (sec): 6.09 - samples/sec: 1615.59 - lr: 0.000009 - momentum: 0.000000
+ 2023-10-23 19:56:19,129 epoch 9 - iter 120/242 - loss 0.00685529 - time (sec): 7.56 - samples/sec: 1571.75 - lr: 0.000008 - momentum: 0.000000
+ 2023-10-23 19:56:20,687 epoch 9 - iter 144/242 - loss 0.00761663 - time (sec): 9.12 - samples/sec: 1590.50 - lr: 0.000008 - momentum: 0.000000
+ 2023-10-23 19:56:22,185 epoch 9 - iter 168/242 - loss 0.00872170 - time (sec): 10.62 - samples/sec: 1591.73 - lr: 0.000007 - momentum: 0.000000
+ 2023-10-23 19:56:23,766 epoch 9 - iter 192/242 - loss 0.00777299 - time (sec): 12.20 - samples/sec: 1598.39 - lr: 0.000007 - momentum: 0.000000
+ 2023-10-23 19:56:25,303 epoch 9 - iter 216/242 - loss 0.00837163 - time (sec): 13.74 - samples/sec: 1594.82 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-23 19:56:26,841 epoch 9 - iter 240/242 - loss 0.00873881 - time (sec): 15.27 - samples/sec: 1611.98 - lr: 0.000006 - momentum: 0.000000
+ 2023-10-23 19:56:26,952 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:56:26,953 EPOCH 9 done: loss 0.0087 - lr: 0.000006
+ 2023-10-23 19:56:27,649 DEV : loss 0.21975746750831604 - f1-score (micro avg) 0.8292
+ 2023-10-23 19:56:27,653 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:56:29,187 epoch 10 - iter 24/242 - loss 0.01598168 - time (sec): 1.53 - samples/sec: 1584.96 - lr: 0.000005 - momentum: 0.000000
+ 2023-10-23 19:56:30,677 epoch 10 - iter 48/242 - loss 0.01795362 - time (sec): 3.02 - samples/sec: 1503.00 - lr: 0.000005 - momentum: 0.000000
+ 2023-10-23 19:56:32,190 epoch 10 - iter 72/242 - loss 0.01314097 - time (sec): 4.54 - samples/sec: 1548.00 - lr: 0.000004 - momentum: 0.000000
+ 2023-10-23 19:56:33,784 epoch 10 - iter 96/242 - loss 0.01198160 - time (sec): 6.13 - samples/sec: 1568.31 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-23 19:56:35,280 epoch 10 - iter 120/242 - loss 0.01184489 - time (sec): 7.63 - samples/sec: 1574.95 - lr: 0.000003 - momentum: 0.000000
+ 2023-10-23 19:56:36,807 epoch 10 - iter 144/242 - loss 0.01199912 - time (sec): 9.15 - samples/sec: 1587.15 - lr: 0.000002 - momentum: 0.000000
+ 2023-10-23 19:56:38,372 epoch 10 - iter 168/242 - loss 0.01076440 - time (sec): 10.72 - samples/sec: 1608.19 - lr: 0.000002 - momentum: 0.000000
+ 2023-10-23 19:56:39,918 epoch 10 - iter 192/242 - loss 0.00966567 - time (sec): 12.26 - samples/sec: 1603.60 - lr: 0.000001 - momentum: 0.000000
+ 2023-10-23 19:56:41,460 epoch 10 - iter 216/242 - loss 0.00868111 - time (sec): 13.81 - samples/sec: 1620.18 - lr: 0.000001 - momentum: 0.000000
+ 2023-10-23 19:56:42,971 epoch 10 - iter 240/242 - loss 0.00817940 - time (sec): 15.32 - samples/sec: 1608.99 - lr: 0.000000 - momentum: 0.000000
+ 2023-10-23 19:56:43,082 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:56:43,082 EPOCH 10 done: loss 0.0081 - lr: 0.000000
+ 2023-10-23 19:56:43,779 DEV : loss 0.22472596168518066 - f1-score (micro avg) 0.8257
+ 2023-10-23 19:56:44,255 ----------------------------------------------------------------------------------------------------
+ 2023-10-23 19:56:44,256 Loading model from best epoch ...
+ 2023-10-23 19:56:45,831 SequenceTagger predicts: Dictionary with 25 tags: O, S-scope, B-scope, E-scope, I-scope, S-pers, B-pers, E-pers, I-pers, S-work, B-work, E-work, I-work, S-loc, B-loc, E-loc, I-loc, S-object, B-object, E-object, I-object, S-date, B-date, E-date, I-date
+ 2023-10-23 19:56:46,685
+ Results:
+ - F-score (micro) 0.8189
+ - F-score (macro) 0.5017
+ - Accuracy 0.7163
+
+ By class:
+                precision    recall  f1-score   support
+
+          pers     0.8611    0.8921    0.8763       139
+         scope     0.8321    0.8837    0.8571       129
+          work     0.6559    0.7625    0.7052        80
+           loc     0.8000    0.4444    0.5714         9
+          date     0.0000    0.0000    0.0000         3
+        object     0.0000    0.0000    0.0000         0
+
+     micro avg     0.7974    0.8417    0.8189       360
+     macro avg     0.5249    0.4971    0.5017       360
+  weighted avg     0.7964    0.8417    0.8165       360
+
+ 2023-10-23 19:56:46,686 ----------------------------------------------------------------------------------------------------
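
For context, a run like the one logged above can be set up with a short Flair script. The following is a minimal sketch, not the original training script: the hyperparameters are read off the base path (bs4, e10, lr5e-05, pooling first, layers -1, no CRF) and the Training Params block, while the dataset and model classes are the standard Flair APIs named in the log; everything else is a plausible assumption.

    # Minimal sketch of the logged run; assumptions are marked in comments.
    from flair.datasets import NER_HIPE_2022
    from flair.embeddings import TransformerWordEmbeddings
    from flair.models import SequenceTagger
    from flair.trainers import ModelTrainer

    # AjMC/French split of HIPE-2022: 966 train / 219 dev / 204 test sentences
    corpus = NER_HIPE_2022(dataset_name="ajmc", language="fr")
    label_dictionary = corpus.make_label_dictionary(label_type="ner")

    embeddings = TransformerWordEmbeddings(
        model="dbmdz/bert-base-historic-multilingual-64k-td-cased",
        layers="-1",               # "layers-1" in the base path: last layer only
        subtoken_pooling="first",  # "poolingfirst" in the base path
        fine_tune=True,
    )

    tagger = SequenceTagger(
        hidden_size=256,             # unused when use_rnn=False (assumption)
        embeddings=embeddings,
        tag_dictionary=label_dictionary,
        tag_type="ner",
        use_crf=False,               # "crfFalse": linear head + CrossEntropyLoss
        use_rnn=False,               # matches the architecture dump (no RNN)
        reproject_embeddings=False,  # matches the Linear(768 -> 25) head
    )

    trainer = ModelTrainer(tagger, corpus)
    trainer.fine_tune(
        "hmbench-ajmc/fr-dbmdz/bert-base-historic-multilingual-64k-td-cased-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-5",
        learning_rate=5e-05,
        mini_batch_size=4,
        max_epochs=10,
    )

With this setup, fine_tune saves best-model.pt whenever the dev micro-F1 improves (epochs 1, 2, 3 and 7 above) and reloads it at the end, which is what produces the "Loading model from best epoch" block and the final classification report.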