2023-10-17 08:21:05,531 ----------------------------------------------------------------------------------------------------
2023-10-17 08:21:05,532 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): ElectraModel(
      (embeddings): ElectraEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): ElectraEncoder(
        (layer): ModuleList(
          (0-11): 12 x ElectraLayer(
            (attention): ElectraAttention(
              (self): ElectraSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): ElectraSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): ElectraIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): ElectraOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=25, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-17 08:21:05,532 ----------------------------------------------------------------------------------------------------
2023-10-17 08:21:05,532 MultiCorpus: 1100 train + 206 dev + 240 test sentences
 - NER_HIPE_2022 Corpus: 1100 train + 206 dev + 240 test sentences - /root/.flair/datasets/ner_hipe_2022/v2.1/ajmc/de/with_doc_seperator
2023-10-17 08:21:05,532 ----------------------------------------------------------------------------------------------------
2023-10-17 08:21:05,532 Train:  1100 sentences
2023-10-17 08:21:05,532 (train_with_dev=False, train_with_test=False)
2023-10-17 08:21:05,532 ----------------------------------------------------------------------------------------------------
2023-10-17 08:21:05,532 Training Params:
2023-10-17 08:21:05,532  - learning_rate: "5e-05"
2023-10-17 08:21:05,532  - mini_batch_size: "4"
2023-10-17 08:21:05,532  - max_epochs: "10"
2023-10-17 08:21:05,532  - shuffle: "True"
2023-10-17 08:21:05,532 ----------------------------------------------------------------------------------------------------
2023-10-17 08:21:05,532 Plugins:
2023-10-17 08:21:05,532  - TensorboardLogger
2023-10-17 08:21:05,532  - LinearScheduler | warmup_fraction: '0.1'
2023-10-17 08:21:05,532 ----------------------------------------------------------------------------------------------------
2023-10-17 08:21:05,532 Final evaluation on model from best epoch (best-model.pt)
2023-10-17 08:21:05,532  - metric: "('micro avg', 'f1-score')"
2023-10-17 08:21:05,532 ----------------------------------------------------------------------------------------------------
2023-10-17 08:21:05,532 Computation:
2023-10-17 08:21:05,532  - compute on device: cuda:0
2023-10-17 08:21:05,532  - embedding storage: none
2023-10-17 08:21:05,532 ----------------------------------------------------------------------------------------------------
2023-10-17 08:21:05,533 Model training base path: "hmbench-ajmc/de-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-1"
2023-10-17 08:21:05,533 ----------------------------------------------------------------------------------------------------
2023-10-17 08:21:05,533 ----------------------------------------------------------------------------------------------------
2023-10-17 08:21:05,533 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-17 08:21:06,817 epoch 1 - iter 27/275 - loss 3.39551245 - time (sec): 1.28 - samples/sec: 1666.61 - lr: 0.000005 - momentum: 0.000000
2023-10-17 08:21:08,065 epoch 1 - iter 54/275 - loss 2.64151202 - time (sec): 2.53 - samples/sec: 1681.00 - lr: 0.000010 - momentum: 0.000000
2023-10-17 08:21:09,236 epoch 1 - iter 81/275 - loss 2.09865343 - time (sec): 3.70 - samples/sec: 1734.39 - lr: 0.000015 - momentum: 0.000000
2023-10-17 08:21:10,470 epoch 1 - iter 108/275 - loss 1.67182252 - time (sec): 4.94 - samples/sec: 1770.04 - lr: 0.000019 - momentum: 0.000000
2023-10-17 08:21:11,716 epoch 1 - iter 135/275 - loss 1.40410810 - time (sec): 6.18 - samples/sec: 1806.19 - lr: 0.000024 - momentum: 0.000000
2023-10-17 08:21:12,967 epoch 1 - iter 162/275 - loss 1.24018274 - time (sec): 7.43 - samples/sec: 1820.32 - lr: 0.000029 - momentum: 0.000000
2023-10-17 08:21:14,216 epoch 1 - iter 189/275 - loss 1.10788515 - time (sec): 8.68 - samples/sec: 1805.76 - lr: 0.000034 - momentum: 0.000000
2023-10-17 08:21:15,444 epoch 1 - iter 216/275 - loss 1.00484610 - time (sec): 9.91 - samples/sec: 1808.63 - lr: 0.000039 - momentum: 0.000000
2023-10-17 08:21:16,708 epoch 1 - iter 243/275 - loss 0.92627982 - time (sec): 11.17 - samples/sec: 1796.84 - lr: 0.000044 - momentum: 0.000000
2023-10-17 08:21:17,987 epoch 1 - iter 270/275 - loss 0.85220835 - time (sec): 12.45 - samples/sec: 1802.34 - lr: 0.000049 - momentum: 0.000000
2023-10-17 08:21:18,222 ----------------------------------------------------------------------------------------------------
2023-10-17 08:21:18,222 EPOCH 1 done: loss 0.8430 - lr: 0.000049
2023-10-17 08:21:18,736 DEV : loss 0.18237413465976715 - f1-score (micro avg)  0.7491
2023-10-17 08:21:18,742 saving best model
2023-10-17 08:21:19,068 ----------------------------------------------------------------------------------------------------
2023-10-17 08:21:20,325 epoch 2 - iter 27/275 - loss 0.18101215 - time (sec): 1.26 - samples/sec: 1914.66 - lr: 0.000049 - momentum: 0.000000
2023-10-17 08:21:21,551 epoch 2 - iter 54/275 - loss 0.17159911 - time (sec): 2.48 - samples/sec: 1841.22 - lr: 0.000049 - momentum: 0.000000
2023-10-17 08:21:22,815 epoch 2 - iter 81/275 - loss 0.18841132 - time (sec): 3.75 - samples/sec: 1827.39 - lr: 0.000048 - momentum: 0.000000
2023-10-17 08:21:24,072 epoch 2 - iter 108/275 - loss 0.19026077 - time (sec): 5.00 - samples/sec: 1815.90 - lr: 0.000048 - momentum: 0.000000
2023-10-17 08:21:25,309 epoch 2 - iter 135/275 - loss 0.17415715 - time (sec): 6.24 - samples/sec: 1801.86 - lr: 0.000047 - momentum: 0.000000
2023-10-17 08:21:26,568 epoch 2 - iter 162/275 - loss 0.17149194 - time (sec): 7.50 - samples/sec: 1797.78 - lr: 0.000047 - momentum: 0.000000
2023-10-17 08:21:27,818 epoch 2 - iter 189/275 - loss 0.16901785 - time (sec): 8.75 - samples/sec: 1784.59 - lr: 0.000046 - momentum: 0.000000
2023-10-17 08:21:29,049 epoch 2 - iter 216/275 - loss 0.16965454 - time (sec): 9.98 - samples/sec: 1773.62 - lr: 0.000046 - momentum: 0.000000
2023-10-17 08:21:30,296 epoch 2 - iter 243/275 - loss 0.16693701 - time (sec): 11.23 - samples/sec: 1783.09 - lr: 0.000045 - momentum: 0.000000
2023-10-17 08:21:31,555 epoch 2 - iter 270/275 - loss 0.15966804 - time (sec): 12.48 - samples/sec: 1792.73 - lr: 0.000045 - momentum: 0.000000
2023-10-17 08:21:31,785 ----------------------------------------------------------------------------------------------------
2023-10-17 08:21:31,785 EPOCH 2 done: loss 0.1579 - lr: 0.000045
2023-10-17 08:21:32,427 DEV : loss 0.18694719672203064 - f1-score (micro avg)  0.7962
2023-10-17 08:21:32,431 saving best model
2023-10-17 08:21:33,006 ----------------------------------------------------------------------------------------------------
2023-10-17 08:21:34,258 epoch 3 - iter 27/275 - loss 0.10338227 - time (sec): 1.25 - samples/sec: 1698.73 - lr: 0.000044 - momentum: 0.000000
2023-10-17 08:21:35,491 epoch 3 - iter 54/275 - loss 0.10936433 - time (sec): 2.48 - samples/sec: 1842.19 - lr: 0.000043 - momentum: 0.000000
2023-10-17 08:21:36,720 epoch 3 - iter 81/275 - loss 0.09604649 - time (sec): 3.71 - samples/sec: 1792.55 - lr: 0.000043 - momentum: 0.000000
2023-10-17 08:21:37,955 epoch 3 - iter 108/275 - loss 0.09915960 - time (sec): 4.95 - samples/sec: 1756.92 - lr: 0.000042 - momentum: 0.000000
2023-10-17 08:21:39,214 epoch 3 - iter 135/275 - loss 0.10373457 - time (sec): 6.21 - samples/sec: 1795.26 - lr: 0.000042 - momentum: 0.000000
2023-10-17 08:21:40,455 epoch 3 - iter 162/275 - loss 0.10532775 - time (sec): 7.45 - samples/sec: 1783.62 - lr: 0.000041 - momentum: 0.000000
2023-10-17 08:21:41,685 epoch 3 - iter 189/275 - loss 0.11696122 - time (sec): 8.68 - samples/sec: 1820.89 - lr: 0.000041 - momentum: 0.000000
2023-10-17 08:21:42,948 epoch 3 - iter 216/275 - loss 0.11855891 - time (sec): 9.94 - samples/sec: 1796.19 - lr: 0.000040 - momentum: 0.000000
2023-10-17 08:21:44,180 epoch 3 - iter 243/275 - loss 0.11859956 - time (sec): 11.17 - samples/sec: 1791.30 - lr: 0.000040 - momentum: 0.000000
2023-10-17 08:21:45,428 epoch 3 - iter 270/275 - loss 0.12663447 - time (sec): 12.42 - samples/sec: 1801.62 - lr: 0.000039 - momentum: 0.000000
2023-10-17 08:21:45,654 ----------------------------------------------------------------------------------------------------
2023-10-17 08:21:45,654 EPOCH 3 done: loss 0.1252 - lr: 0.000039
2023-10-17 08:21:46,293 DEV : loss 0.1728365570306778 - f1-score (micro avg)  0.84
2023-10-17 08:21:46,298 saving best model
2023-10-17 08:21:46,746 ----------------------------------------------------------------------------------------------------
2023-10-17 08:21:47,998 epoch 4 - iter 27/275 - loss 0.05299613 - time (sec): 1.25 - samples/sec: 1938.21 - lr: 0.000038 - momentum: 0.000000
2023-10-17 08:21:49,224 epoch 4 - iter 54/275 - loss 0.06587099 - time (sec): 2.48 - samples/sec: 1835.93 - lr: 0.000038 - momentum: 0.000000
2023-10-17 08:21:50,453 epoch 4 - iter 81/275 - loss 0.07005691 - time (sec): 3.70 - samples/sec: 1814.78 - lr: 0.000037 - momentum: 0.000000
2023-10-17 08:21:51,687 epoch 4 - iter 108/275 - loss 0.07608689 - time (sec): 4.94 - samples/sec: 1796.99 - lr: 0.000037 - momentum: 0.000000
2023-10-17 08:21:52,943 epoch 4 - iter 135/275 - loss 0.08892863 - time (sec): 6.19 - samples/sec: 1822.45 - lr: 0.000036 - momentum: 0.000000
2023-10-17 08:21:54,177 epoch 4 - iter 162/275 - loss 0.08944390 - time (sec): 7.43 - samples/sec: 1802.70 - lr: 0.000036 - momentum: 0.000000
2023-10-17 08:21:55,419 epoch 4 - iter 189/275 - loss 0.08487576 - time (sec): 8.67 - samples/sec: 1801.88 - lr: 0.000035 - momentum: 0.000000
2023-10-17 08:21:56,680 epoch 4 - iter 216/275 - loss 0.08286826 - time (sec): 9.93 - samples/sec: 1803.71 - lr: 0.000035 - momentum: 0.000000
2023-10-17 08:21:57,982 epoch 4 - iter 243/275 - loss 0.08230399 - time (sec): 11.23 - samples/sec: 1796.27 - lr: 0.000034 - momentum: 0.000000
2023-10-17 08:21:59,214 epoch 4 - iter 270/275 - loss 0.08125657 - time (sec): 12.47 - samples/sec: 1788.79 - lr: 0.000034 - momentum: 0.000000
2023-10-17 08:21:59,453 ----------------------------------------------------------------------------------------------------
2023-10-17 08:21:59,453 EPOCH 4 done: loss 0.0798 - lr: 0.000034
2023-10-17 08:22:00,104 DEV : loss 0.18954457342624664 - f1-score (micro avg)  0.8713
2023-10-17 08:22:00,112 saving best model
2023-10-17 08:22:00,682 ----------------------------------------------------------------------------------------------------
2023-10-17 08:22:01,973 epoch 5 - iter 27/275 - loss 0.08219054 - time (sec): 1.29 - samples/sec: 1768.72 - lr: 0.000033 - momentum: 0.000000
2023-10-17 08:22:03,205 epoch 5 - iter 54/275 - loss 0.08503182 - time (sec): 2.52 - samples/sec: 1823.09 - lr: 0.000032 - momentum: 0.000000
2023-10-17 08:22:04,468 epoch 5 - iter 81/275 - loss 0.08351946 - time (sec): 3.78 - samples/sec: 1819.14 - lr: 0.000032 - momentum: 0.000000
2023-10-17 08:22:05,705 epoch 5 - iter 108/275 - loss 0.07984801 - time (sec): 5.02 - samples/sec: 1799.08 - lr: 0.000031 - momentum: 0.000000
2023-10-17 08:22:06,945 epoch 5 - iter 135/275 - loss 0.07339370 - time (sec): 6.26 - samples/sec: 1763.11 - lr: 0.000031 - momentum: 0.000000
2023-10-17 08:22:08,220 epoch 5 - iter 162/275 - loss 0.07109291 - time (sec): 7.53 - samples/sec: 1740.90 - lr: 0.000030 - momentum: 0.000000
2023-10-17 08:22:09,449 epoch 5 - iter 189/275 - loss 0.06550290 - time (sec): 8.76 - samples/sec: 1756.26 - lr: 0.000030 - momentum: 0.000000
2023-10-17 08:22:10,687 epoch 5 - iter 216/275 - loss 0.07935454 - time (sec): 10.00 - samples/sec: 1789.90 - lr: 0.000029 - momentum: 0.000000
2023-10-17 08:22:11,917 epoch 5 - iter 243/275 - loss 0.07699592 - time (sec): 11.23 - samples/sec: 1800.53 - lr: 0.000029 - momentum: 0.000000
2023-10-17 08:22:13,227 epoch 5 - iter 270/275 - loss 0.07376305 - time (sec): 12.54 - samples/sec: 1781.54 - lr: 0.000028 - momentum: 0.000000
2023-10-17 08:22:13,457 ----------------------------------------------------------------------------------------------------
2023-10-17 08:22:13,457 EPOCH 5 done: loss 0.0724 - lr: 0.000028
2023-10-17 08:22:14,112 DEV : loss 0.1961093693971634 - f1-score (micro avg)  0.878
2023-10-17 08:22:14,117 saving best model
2023-10-17 08:22:14,582 ----------------------------------------------------------------------------------------------------
2023-10-17 08:22:15,818 epoch 6 - iter 27/275 - loss 0.05376331 - time (sec): 1.23 - samples/sec: 1735.11 - lr: 0.000027 - momentum: 0.000000
2023-10-17 08:22:17,043 epoch 6 - iter 54/275 - loss 0.05380094 - time (sec): 2.46 - samples/sec: 1981.75 - lr: 0.000027 - momentum: 0.000000
2023-10-17 08:22:18,221 epoch 6 - iter 81/275 - loss 0.05193664 - time (sec): 3.64 - samples/sec: 1921.84 - lr: 0.000026 - momentum: 0.000000
2023-10-17 08:22:19,451 epoch 6 - iter 108/275 - loss 0.04725920 - time (sec): 4.87 - samples/sec: 1871.74 - lr: 0.000026 - momentum: 0.000000
2023-10-17 08:22:20,678 epoch 6 - iter 135/275 - loss 0.03971605 - time (sec): 6.09 - samples/sec: 1867.66 - lr: 0.000025 - momentum: 0.000000
2023-10-17 08:22:21,932 epoch 6 - iter 162/275 - loss 0.03886938 - time (sec): 7.35 - samples/sec: 1866.12 - lr: 0.000025 - momentum: 0.000000
2023-10-17 08:22:23,156 epoch 6 - iter 189/275 - loss 0.03928234 - time (sec): 8.57 - samples/sec: 1836.24 - lr: 0.000024 - momentum: 0.000000
2023-10-17 08:22:24,402 epoch 6 - iter 216/275 - loss 0.03975998 - time (sec): 9.82 - samples/sec: 1824.03 - lr: 0.000024 - momentum: 0.000000
2023-10-17 08:22:25,656 epoch 6 - iter 243/275 - loss 0.03796291 - time (sec): 11.07 - samples/sec: 1810.97 - lr: 0.000023 - momentum: 0.000000
2023-10-17 08:22:26,911 epoch 6 - iter 270/275 - loss 0.04130517 - time (sec): 12.33 - samples/sec: 1809.67 - lr: 0.000022 - momentum: 0.000000
2023-10-17 08:22:27,138 ----------------------------------------------------------------------------------------------------
2023-10-17 08:22:27,138 EPOCH 6 done: loss 0.0451 - lr: 0.000022
2023-10-17 08:22:27,799 DEV : loss 0.19352462887763977 - f1-score (micro avg)  0.8543
2023-10-17 08:22:27,804 ----------------------------------------------------------------------------------------------------
2023-10-17 08:22:29,055 epoch 7 - iter 27/275 - loss 0.02029976 - time (sec): 1.25 - samples/sec: 1614.51 - lr: 0.000022 - momentum: 0.000000
2023-10-17 08:22:30,313 epoch 7 - iter 54/275 - loss 0.01882092 - time (sec): 2.51 - samples/sec: 1684.49 - lr: 0.000021 - momentum: 0.000000
2023-10-17 08:22:31,699 epoch 7 - iter 81/275 - loss 0.04995364 - time (sec): 3.89 - samples/sec: 1652.45 - lr: 0.000021 - momentum: 0.000000
2023-10-17 08:22:32,998 epoch 7 - iter 108/275 - loss 0.04386443 - time (sec): 5.19 - samples/sec: 1667.82 - lr: 0.000020 - momentum: 0.000000
2023-10-17 08:22:34,269 epoch 7 - iter 135/275 - loss 0.04098256 - time (sec): 6.46 - samples/sec: 1708.26 - lr: 0.000020 - momentum: 0.000000
2023-10-17 08:22:35,519 epoch 7 - iter 162/275 - loss 0.03492181 - time (sec): 7.71 - samples/sec: 1706.20 - lr: 0.000019 - momentum: 0.000000
2023-10-17 08:22:36,720 epoch 7 - iter 189/275 - loss 0.03379056 - time (sec): 8.91 - samples/sec: 1719.35 - lr: 0.000019 - momentum: 0.000000
2023-10-17 08:22:37,923 epoch 7 - iter 216/275 - loss 0.03379012 - time (sec): 10.12 - samples/sec: 1744.30 - lr: 0.000018 - momentum: 0.000000
2023-10-17 08:22:39,164 epoch 7 - iter 243/275 - loss 0.03239923 - time (sec): 11.36 - samples/sec: 1765.28 - lr: 0.000017 - momentum: 0.000000
2023-10-17 08:22:40,396 epoch 7 - iter 270/275 - loss 0.03241688 - time (sec): 12.59 - samples/sec: 1771.33 - lr: 0.000017 - momentum: 0.000000
2023-10-17 08:22:40,635 ----------------------------------------------------------------------------------------------------
2023-10-17 08:22:40,635 EPOCH 7 done: loss 0.0318 - lr: 0.000017
2023-10-17 08:22:41,272 DEV : loss 0.2002793848514557 - f1-score (micro avg)  0.8662
2023-10-17 08:22:41,277 ----------------------------------------------------------------------------------------------------
2023-10-17 08:22:42,506 epoch 8 - iter 27/275 - loss 0.03346243 - time (sec): 1.23 - samples/sec: 1786.29 - lr: 0.000016 - momentum: 0.000000
2023-10-17 08:22:43,729 epoch 8 - iter 54/275 - loss 0.04019703 - time (sec): 2.45 - samples/sec: 1867.40 - lr: 0.000016 - momentum: 0.000000
2023-10-17 08:22:44,954 epoch 8 - iter 81/275 - loss 0.04568437 - time (sec): 3.68 - samples/sec: 1843.30 - lr: 0.000015 - momentum: 0.000000
2023-10-17 08:22:46,200 epoch 8 - iter 108/275 - loss 0.03837371 - time (sec): 4.92 - samples/sec: 1845.15 - lr: 0.000015 - momentum: 0.000000
2023-10-17 08:22:47,431 epoch 8 - iter 135/275 - loss 0.03159264 - time (sec): 6.15 - samples/sec: 1854.79 - lr: 0.000014 - momentum: 0.000000
2023-10-17 08:22:48,667 epoch 8 - iter 162/275 - loss 0.02683346 - time (sec): 7.39 - samples/sec: 1832.21 - lr: 0.000014 - momentum: 0.000000
2023-10-17 08:22:49,899 epoch 8 - iter 189/275 - loss 0.02514710 - time (sec): 8.62 - samples/sec: 1827.58 - lr: 0.000013 - momentum: 0.000000
2023-10-17 08:22:51,120 epoch 8 - iter 216/275 - loss 0.02452208 - time (sec): 9.84 - samples/sec: 1817.88 - lr: 0.000012 - momentum: 0.000000
2023-10-17 08:22:52,328 epoch 8 - iter 243/275 - loss 0.02681178 - time (sec): 11.05 - samples/sec: 1837.11 - lr: 0.000012 - momentum: 0.000000
2023-10-17 08:22:53,552 epoch 8 - iter 270/275 - loss 0.02480341 - time (sec): 12.27 - samples/sec: 1821.82 - lr: 0.000011 - momentum: 0.000000
2023-10-17 08:22:53,791 ----------------------------------------------------------------------------------------------------
2023-10-17 08:22:53,791 EPOCH 8 done: loss 0.0244 - lr: 0.000011
2023-10-17 08:22:54,423 DEV : loss 0.18160505592823029 - f1-score (micro avg)  0.878
2023-10-17 08:22:54,428 ----------------------------------------------------------------------------------------------------
2023-10-17 08:22:55,730 epoch 9 - iter 27/275 - loss 0.03260429 - time (sec): 1.30 - samples/sec: 1645.17 - lr: 0.000011 - momentum: 0.000000
2023-10-17 08:22:57,012 epoch 9 - iter 54/275 - loss 0.03493095 - time (sec): 2.58 - samples/sec: 1613.16 - lr: 0.000010 - momentum: 0.000000
2023-10-17 08:22:58,249 epoch 9 - iter 81/275 - loss 0.02331322 - time (sec): 3.82 - samples/sec: 1684.88 - lr: 0.000010 - momentum: 0.000000
2023-10-17 08:22:59,481 epoch 9 - iter 108/275 - loss 0.01833588 - time (sec): 5.05 - samples/sec: 1737.07 - lr: 0.000009 - momentum: 0.000000
2023-10-17 08:23:00,711 epoch 9 - iter 135/275 - loss 0.01701417 - time (sec): 6.28 - samples/sec: 1750.63 - lr: 0.000009 - momentum: 0.000000
2023-10-17 08:23:01,975 epoch 9 - iter 162/275 - loss 0.01655618 - time (sec): 7.55 - samples/sec: 1786.23 - lr: 0.000008 - momentum: 0.000000
2023-10-17 08:23:03,187 epoch 9 - iter 189/275 - loss 0.01514631 - time (sec): 8.76 - samples/sec: 1787.24 - lr: 0.000007 - momentum: 0.000000
2023-10-17 08:23:04,389 epoch 9 - iter 216/275 - loss 0.01622065 - time (sec): 9.96 - samples/sec: 1772.19 - lr: 0.000007 - momentum: 0.000000
2023-10-17 08:23:05,637 epoch 9 - iter 243/275 - loss 0.01467038 - time (sec): 11.21 - samples/sec: 1763.06 - lr: 0.000006 - momentum: 0.000000
2023-10-17 08:23:06,869 epoch 9 - iter 270/275 - loss 0.01682948 - time (sec): 12.44 - samples/sec: 1788.62 - lr: 0.000006 - momentum: 0.000000
2023-10-17 08:23:07,104 ----------------------------------------------------------------------------------------------------
2023-10-17 08:23:07,104 EPOCH 9 done: loss 0.0176 - lr: 0.000006
2023-10-17 08:23:07,735 DEV : loss 0.18272937834262848 - f1-score (micro avg)  0.8835
2023-10-17 08:23:07,739 saving best model
2023-10-17 08:23:08,186 ----------------------------------------------------------------------------------------------------
2023-10-17 08:23:09,429 epoch 10 - iter 27/275 - loss 0.01377901 - time (sec): 1.24 - samples/sec: 1761.39 - lr: 0.000005 - momentum: 0.000000
2023-10-17 08:23:10,675 epoch 10 - iter 54/275 - loss 0.00736407 - time (sec): 2.48 - samples/sec: 1696.85 - lr: 0.000005 - momentum: 0.000000
2023-10-17 08:23:11,913 epoch 10 - iter 81/275 - loss 0.00774600 - time (sec): 3.72 - samples/sec: 1734.16 - lr: 0.000004 - momentum: 0.000000
2023-10-17 08:23:13,154 epoch 10 - iter 108/275 - loss 0.00674955 - time (sec): 4.96 - samples/sec: 1719.28 - lr: 0.000004 - momentum: 0.000000
2023-10-17 08:23:14,403 epoch 10 - iter 135/275 - loss 0.00682536 - time (sec): 6.21 - samples/sec: 1748.99 - lr: 0.000003 - momentum: 0.000000
2023-10-17 08:23:15,677 epoch 10 - iter 162/275 - loss 0.00997848 - time (sec): 7.49 - samples/sec: 1731.47 - lr: 0.000002 - momentum: 0.000000
2023-10-17 08:23:16,969 epoch 10 - iter 189/275 - loss 0.01238191 - time (sec): 8.78 - samples/sec: 1763.98 - lr: 0.000002 - momentum: 0.000000
2023-10-17 08:23:18,215 epoch 10 - iter 216/275 - loss 0.01158723 - time (sec): 10.02 - samples/sec: 1783.61 - lr: 0.000001 - momentum: 0.000000
2023-10-17 08:23:19,493 epoch 10 - iter 243/275 - loss 0.01444603 - time (sec): 11.30 - samples/sec: 1785.72 - lr: 0.000001 - momentum: 0.000000
2023-10-17 08:23:20,727 epoch 10 - iter 270/275 - loss 0.01395665 - time (sec): 12.54 - samples/sec: 1785.05 - lr: 0.000000 - momentum: 0.000000
2023-10-17 08:23:20,963 ----------------------------------------------------------------------------------------------------
2023-10-17 08:23:20,963 EPOCH 10 done: loss 0.0140 - lr: 0.000000
2023-10-17 08:23:21,598 DEV : loss 0.18181470036506653 - f1-score (micro avg)  0.8856
2023-10-17 08:23:21,602 saving best model
2023-10-17 08:23:22,400 ----------------------------------------------------------------------------------------------------
2023-10-17 08:23:22,401 Loading model from best epoch ...
2023-10-17 08:23:23,726 SequenceTagger predicts: Dictionary with 25 tags: O, S-scope, B-scope, E-scope, I-scope, S-pers, B-pers, E-pers, I-pers, S-work, B-work, E-work, I-work, S-loc, B-loc, E-loc, I-loc, S-object, B-object, E-object, I-object, S-date, B-date, E-date, I-date
2023-10-17 08:23:24,605 Results:
- F-score (micro) 0.919
- F-score (macro) 0.7842
- Accuracy 0.8586

By class:
              precision    recall  f1-score   support

       scope     0.9143    0.9091    0.9117       176
        pers     0.9836    0.9375    0.9600       128
        work     0.9014    0.8649    0.8828        74
      object     0.5000    0.5000    0.5000         2
         loc     1.0000    0.5000    0.6667         2

   micro avg     0.9326    0.9058    0.9190       382
   macro avg     0.8599    0.7423    0.7842       382
weighted avg     0.9333    0.9058    0.9188       382

2023-10-17 08:23:24,606 ----------------------------------------------------------------------------------------------------
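As a sanity check on the final evaluation table, the macro and weighted averages can be recomputed directly from the per-class rows, and the micro F1 follows from the reported micro precision/recall. The snippet below is a minimal sketch using only numbers copied from the log above; the dictionary name and layout are illustrative, not part of Flair's API.

```python
# Per-class (precision, recall, f1, support) rows from the final test-set table.
per_class = {
    "scope":  (0.9143, 0.9091, 0.9117, 176),
    "pers":   (0.9836, 0.9375, 0.9600, 128),
    "work":   (0.9014, 0.8649, 0.8828, 74),
    "object": (0.5000, 0.5000, 0.5000, 2),
    "loc":    (1.0000, 0.5000, 0.6667, 2),
}

n_classes = len(per_class)
total_support = sum(s for *_, s in per_class.values())  # 382 test entities

# Macro average: unweighted mean over classes (rare classes count fully).
macro_f1 = sum(f1 for _, _, f1, _ in per_class.values()) / n_classes

# Weighted average: mean weighted by each class's support.
weighted_f1 = sum(f1 * s for _, _, f1, s in per_class.values()) / total_support

# Micro F1 is the harmonic mean of the micro precision/recall reported in the log.
micro_p, micro_r = 0.9326, 0.9058
micro_f1 = 2 * micro_p * micro_r / (micro_p + micro_r)

print(round(macro_f1, 4), round(weighted_f1, 4), round(micro_f1, 4))
```

The recomputed values match the log: macro F1 0.7842, weighted F1 0.9188, micro F1 0.9190, which also explains the gap between the headline micro score (0.919) and macro score (0.7842): the two classes with support 2 (`object`, `loc`) drag the unweighted mean down while barely affecting the micro average.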