2023-10-17 16:18:49,833 ----------------------------------------------------------------------------------------------------
2023-10-17 16:18:49,834 Model: "SequenceTagger(
  (embeddings): TransformerWordEmbeddings(
    (model): ElectraModel(
      (embeddings): ElectraEmbeddings(
        (word_embeddings): Embedding(32001, 768)
        (position_embeddings): Embedding(512, 768)
        (token_type_embeddings): Embedding(2, 768)
        (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
        (dropout): Dropout(p=0.1, inplace=False)
      )
      (encoder): ElectraEncoder(
        (layer): ModuleList(
          (0-11): 12 x ElectraLayer(
            (attention): ElectraAttention(
              (self): ElectraSelfAttention(
                (query): Linear(in_features=768, out_features=768, bias=True)
                (key): Linear(in_features=768, out_features=768, bias=True)
                (value): Linear(in_features=768, out_features=768, bias=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
              (output): ElectraSelfOutput(
                (dense): Linear(in_features=768, out_features=768, bias=True)
                (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
                (dropout): Dropout(p=0.1, inplace=False)
              )
            )
            (intermediate): ElectraIntermediate(
              (dense): Linear(in_features=768, out_features=3072, bias=True)
              (intermediate_act_fn): GELUActivation()
            )
            (output): ElectraOutput(
              (dense): Linear(in_features=3072, out_features=768, bias=True)
              (LayerNorm): LayerNorm((768,), eps=1e-12, elementwise_affine=True)
              (dropout): Dropout(p=0.1, inplace=False)
            )
          )
        )
      )
    )
  )
  (locked_dropout): LockedDropout(p=0.5)
  (linear): Linear(in_features=768, out_features=13, bias=True)
  (loss_function): CrossEntropyLoss()
)"
2023-10-17 16:18:49,834 ----------------------------------------------------------------------------------------------------
2023-10-17 16:18:49,834 MultiCorpus: 5777 train + 722 dev + 723 test sentences
- NER_ICDAR_EUROPEANA Corpus: 5777 train + 722 dev + 723 test sentences - /root/.flair/datasets/ner_icdar_europeana/nl
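
[Editor's note] The corpus above can be loaded with Flair's built-in dataset loader, which caches data under ~/.flair/datasets/ner_icdar_europeana/<language> as in the path shown. A minimal hedged sketch follows; the language keyword is an assumption inferred from the "nl" cache directory, and the log wraps this single corpus in a MultiCorpus with identical splits.

# Hedged sketch: load the Dutch ICDAR-Europeana NER corpus with Flair.
from flair.datasets import NER_ICDAR_EUROPEANA

corpus = NER_ICDAR_EUROPEANA(language="nl")  # "nl" inferred from the cache path above (assumption)
print(corpus)  # should report 5777 train + 722 dev + 723 test sentences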
2023-10-17 16:18:49,835 ----------------------------------------------------------------------------------------------------
2023-10-17 16:18:49,835 Train: 5777 sentences
2023-10-17 16:18:49,835 (train_with_dev=False, train_with_test=False)
2023-10-17 16:18:49,835 ----------------------------------------------------------------------------------------------------
2023-10-17 16:18:49,835 Training Params:
2023-10-17 16:18:49,835 - learning_rate: "5e-05"
2023-10-17 16:18:49,835 - mini_batch_size: "4"
2023-10-17 16:18:49,835 - max_epochs: "10"
2023-10-17 16:18:49,835 - shuffle: "True"
2023-10-17 16:18:49,835 ----------------------------------------------------------------------------------------------------
2023-10-17 16:18:49,835 Plugins:
2023-10-17 16:18:49,835 - TensorboardLogger
2023-10-17 16:18:49,835 - LinearScheduler | warmup_fraction: '0.1'
2023-10-17 16:18:49,835 ----------------------------------------------------------------------------------------------------
2023-10-17 16:18:49,835 Final evaluation on model from best epoch (best-model.pt)
2023-10-17 16:18:49,835 - metric: "('micro avg', 'f1-score')"
2023-10-17 16:18:49,835 ----------------------------------------------------------------------------------------------------
2023-10-17 16:18:49,835 Computation:
2023-10-17 16:18:49,835 - compute on device: cuda:0
2023-10-17 16:18:49,835 - embedding storage: none
2023-10-17 16:18:49,835 ----------------------------------------------------------------------------------------------------
2023-10-17 16:18:49,835 Model training base path: "hmbench-icdar/nl-hmteams/teams-base-historic-multilingual-discriminator-bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2"
2023-10-17 16:18:49,835 ----------------------------------------------------------------------------------------------------
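
[Editor's note] The training parameters, warmup schedule, and base path logged above map onto Flair's fine-tuning entry point roughly as sketched below. This is a hedged sketch continuing from the corpus and tagger sketches earlier in this log: fine_tune() uses AdamW with a linear warmup schedule by default, which is consistent with the "LinearScheduler | warmup_fraction: '0.1'" plugin and the momentum-free optimizer lines that follow; the TensorboardLogger plugin is not reproduced here.

# Hedged sketch, continuing from the corpus/tagger sketches above.
from flair.trainers import ModelTrainer

trainer = ModelTrainer(tagger, corpus)
trainer.fine_tune(
    "hmbench-icdar/nl-hmteams/teams-base-historic-multilingual-discriminator-"
    "bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2",  # base path from this log
    learning_rate=5e-5,  # "learning_rate: 5e-05"
    mini_batch_size=4,   # "mini_batch_size: 4"
    max_epochs=10,       # "max_epochs: 10"
)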
2023-10-17 16:18:49,835 ----------------------------------------------------------------------------------------------------
2023-10-17 16:18:49,835 Logging anything other than scalars to TensorBoard is currently not supported.
2023-10-17 16:18:56,849 epoch 1 - iter 144/1445 - loss 2.13705809 - time (sec): 7.01 - samples/sec: 2602.15 - lr: 0.000005 - momentum: 0.000000
2023-10-17 16:19:03,750 epoch 1 - iter 288/1445 - loss 1.21878406 - time (sec): 13.91 - samples/sec: 2562.25 - lr: 0.000010 - momentum: 0.000000
2023-10-17 16:19:10,804 epoch 1 - iter 432/1445 - loss 0.90739784 - time (sec): 20.97 - samples/sec: 2498.56 - lr: 0.000015 - momentum: 0.000000
2023-10-17 16:19:17,974 epoch 1 - iter 576/1445 - loss 0.72044535 - time (sec): 28.14 - samples/sec: 2490.68 - lr: 0.000020 - momentum: 0.000000
2023-10-17 16:19:25,043 epoch 1 - iter 720/1445 - loss 0.60080964 - time (sec): 35.21 - samples/sec: 2512.23 - lr: 0.000025 - momentum: 0.000000
2023-10-17 16:19:31,904 epoch 1 - iter 864/1445 - loss 0.52771699 - time (sec): 42.07 - samples/sec: 2513.55 - lr: 0.000030 - momentum: 0.000000
2023-10-17 16:19:38,848 epoch 1 - iter 1008/1445 - loss 0.47241003 - time (sec): 49.01 - samples/sec: 2521.27 - lr: 0.000035 - momentum: 0.000000
2023-10-17 16:19:45,917 epoch 1 - iter 1152/1445 - loss 0.42720095 - time (sec): 56.08 - samples/sec: 2519.40 - lr: 0.000040 - momentum: 0.000000
2023-10-17 16:19:52,870 epoch 1 - iter 1296/1445 - loss 0.39375757 - time (sec): 63.03 - samples/sec: 2509.51 - lr: 0.000045 - momentum: 0.000000
2023-10-17 16:19:59,869 epoch 1 - iter 1440/1445 - loss 0.36718281 - time (sec): 70.03 - samples/sec: 2509.84 - lr: 0.000050 - momentum: 0.000000
2023-10-17 16:20:00,093 ----------------------------------------------------------------------------------------------------
2023-10-17 16:20:00,093 EPOCH 1 done: loss 0.3666 - lr: 0.000050
2023-10-17 16:20:02,889 DEV : loss 0.0853525772690773 - f1-score (micro avg) 0.8262
2023-10-17 16:20:02,908 saving best model
2023-10-17 16:20:03,241 ----------------------------------------------------------------------------------------------------
2023-10-17 16:20:09,898 epoch 2 - iter 144/1445 - loss 0.13163265 - time (sec): 6.65 - samples/sec: 2487.09 - lr: 0.000049 - momentum: 0.000000
2023-10-17 16:20:16,952 epoch 2 - iter 288/1445 - loss 0.11859313 - time (sec): 13.71 - samples/sec: 2444.56 - lr: 0.000049 - momentum: 0.000000
2023-10-17 16:20:24,250 epoch 2 - iter 432/1445 - loss 0.10993065 - time (sec): 21.01 - samples/sec: 2427.99 - lr: 0.000048 - momentum: 0.000000
2023-10-17 16:20:31,667 epoch 2 - iter 576/1445 - loss 0.10512844 - time (sec): 28.42 - samples/sec: 2425.61 - lr: 0.000048 - momentum: 0.000000
2023-10-17 16:20:38,916 epoch 2 - iter 720/1445 - loss 0.10132998 - time (sec): 35.67 - samples/sec: 2409.57 - lr: 0.000047 - momentum: 0.000000
2023-10-17 16:20:46,181 epoch 2 - iter 864/1445 - loss 0.09827432 - time (sec): 42.94 - samples/sec: 2449.22 - lr: 0.000047 - momentum: 0.000000
2023-10-17 16:20:53,243 epoch 2 - iter 1008/1445 - loss 0.09734789 - time (sec): 50.00 - samples/sec: 2461.18 - lr: 0.000046 - momentum: 0.000000
2023-10-17 16:21:00,233 epoch 2 - iter 1152/1445 - loss 0.09616555 - time (sec): 56.99 - samples/sec: 2456.82 - lr: 0.000046 - momentum: 0.000000
2023-10-17 16:21:07,398 epoch 2 - iter 1296/1445 - loss 0.09677640 - time (sec): 64.15 - samples/sec: 2465.71 - lr: 0.000045 - momentum: 0.000000
2023-10-17 16:21:14,472 epoch 2 - iter 1440/1445 - loss 0.09633877 - time (sec): 71.23 - samples/sec: 2467.41 - lr: 0.000044 - momentum: 0.000000
2023-10-17 16:21:14,703 ----------------------------------------------------------------------------------------------------
2023-10-17 16:21:14,703 EPOCH 2 done: loss 0.0963 - lr: 0.000044
2023-10-17 16:21:19,008 DEV : loss 0.09023821353912354 - f1-score (micro avg) 0.8362
2023-10-17 16:21:19,040 saving best model
2023-10-17 16:21:19,492 ----------------------------------------------------------------------------------------------------
2023-10-17 16:21:26,867 epoch 3 - iter 144/1445 - loss 0.07639569 - time (sec): 7.37 - samples/sec: 2469.85 - lr: 0.000044 - momentum: 0.000000
2023-10-17 16:21:33,792 epoch 3 - iter 288/1445 - loss 0.07289280 - time (sec): 14.30 - samples/sec: 2474.87 - lr: 0.000043 - momentum: 0.000000
2023-10-17 16:21:41,036 epoch 3 - iter 432/1445 - loss 0.07369285 - time (sec): 21.54 - samples/sec: 2526.94 - lr: 0.000043 - momentum: 0.000000
2023-10-17 16:21:47,971 epoch 3 - iter 576/1445 - loss 0.07352607 - time (sec): 28.48 - samples/sec: 2503.52 - lr: 0.000042 - momentum: 0.000000
2023-10-17 16:21:55,171 epoch 3 - iter 720/1445 - loss 0.07225890 - time (sec): 35.68 - samples/sec: 2486.81 - lr: 0.000042 - momentum: 0.000000
2023-10-17 16:22:02,381 epoch 3 - iter 864/1445 - loss 0.06938398 - time (sec): 42.88 - samples/sec: 2494.71 - lr: 0.000041 - momentum: 0.000000
2023-10-17 16:22:09,345 epoch 3 - iter 1008/1445 - loss 0.07054082 - time (sec): 49.85 - samples/sec: 2475.72 - lr: 0.000041 - momentum: 0.000000
2023-10-17 16:22:16,271 epoch 3 - iter 1152/1445 - loss 0.07020152 - time (sec): 56.78 - samples/sec: 2463.08 - lr: 0.000040 - momentum: 0.000000
2023-10-17 16:22:23,487 epoch 3 - iter 1296/1445 - loss 0.07093905 - time (sec): 63.99 - samples/sec: 2465.29 - lr: 0.000039 - momentum: 0.000000
2023-10-17 16:22:30,787 epoch 3 - iter 1440/1445 - loss 0.07220694 - time (sec): 71.29 - samples/sec: 2462.25 - lr: 0.000039 - momentum: 0.000000
2023-10-17 16:22:31,037 ----------------------------------------------------------------------------------------------------
2023-10-17 16:22:31,038 EPOCH 3 done: loss 0.0723 - lr: 0.000039
2023-10-17 16:22:34,317 DEV : loss 0.09389135241508484 - f1-score (micro avg) 0.8522
2023-10-17 16:22:34,334 saving best model
2023-10-17 16:22:34,794 ----------------------------------------------------------------------------------------------------
2023-10-17 16:22:41,916 epoch 4 - iter 144/1445 - loss 0.03934107 - time (sec): 7.12 - samples/sec: 2501.87 - lr: 0.000038 - momentum: 0.000000
2023-10-17 16:22:48,884 epoch 4 - iter 288/1445 - loss 0.04960378 - time (sec): 14.09 - samples/sec: 2474.30 - lr: 0.000038 - momentum: 0.000000
2023-10-17 16:22:55,742 epoch 4 - iter 432/1445 - loss 0.04967966 - time (sec): 20.95 - samples/sec: 2464.10 - lr: 0.000037 - momentum: 0.000000
2023-10-17 16:23:02,583 epoch 4 - iter 576/1445 - loss 0.05239601 - time (sec): 27.79 - samples/sec: 2472.89 - lr: 0.000037 - momentum: 0.000000
2023-10-17 16:23:10,050 epoch 4 - iter 720/1445 - loss 0.05481533 - time (sec): 35.25 - samples/sec: 2459.94 - lr: 0.000036 - momentum: 0.000000
2023-10-17 16:23:17,233 epoch 4 - iter 864/1445 - loss 0.05681795 - time (sec): 42.44 - samples/sec: 2473.57 - lr: 0.000036 - momentum: 0.000000
2023-10-17 16:23:24,313 epoch 4 - iter 1008/1445 - loss 0.05672883 - time (sec): 49.52 - samples/sec: 2478.78 - lr: 0.000035 - momentum: 0.000000
2023-10-17 16:23:31,477 epoch 4 - iter 1152/1445 - loss 0.05481502 - time (sec): 56.68 - samples/sec: 2477.63 - lr: 0.000034 - momentum: 0.000000
2023-10-17 16:23:38,544 epoch 4 - iter 1296/1445 - loss 0.05437627 - time (sec): 63.75 - samples/sec: 2479.36 - lr: 0.000034 - momentum: 0.000000
2023-10-17 16:23:45,784 epoch 4 - iter 1440/1445 - loss 0.05419519 - time (sec): 70.99 - samples/sec: 2475.65 - lr: 0.000033 - momentum: 0.000000
2023-10-17 16:23:46,031 ----------------------------------------------------------------------------------------------------
2023-10-17 16:23:46,032 EPOCH 4 done: loss 0.0541 - lr: 0.000033
2023-10-17 16:23:49,286 DEV : loss 0.10415765643119812 - f1-score (micro avg) 0.8558
2023-10-17 16:23:49,302 saving best model
2023-10-17 16:23:49,759 ----------------------------------------------------------------------------------------------------
2023-10-17 16:23:56,689 epoch 5 - iter 144/1445 - loss 0.05670271 - time (sec): 6.93 - samples/sec: 2374.53 - lr: 0.000033 - momentum: 0.000000
2023-10-17 16:24:03,688 epoch 5 - iter 288/1445 - loss 0.04436221 - time (sec): 13.93 - samples/sec: 2443.03 - lr: 0.000032 - momentum: 0.000000
2023-10-17 16:24:10,773 epoch 5 - iter 432/1445 - loss 0.04616850 - time (sec): 21.01 - samples/sec: 2458.36 - lr: 0.000032 - momentum: 0.000000
2023-10-17 16:24:17,711 epoch 5 - iter 576/1445 - loss 0.04352124 - time (sec): 27.95 - samples/sec: 2426.81 - lr: 0.000031 - momentum: 0.000000
2023-10-17 16:24:25,343 epoch 5 - iter 720/1445 - loss 0.04411388 - time (sec): 35.58 - samples/sec: 2418.81 - lr: 0.000031 - momentum: 0.000000
2023-10-17 16:24:32,451 epoch 5 - iter 864/1445 - loss 0.04402856 - time (sec): 42.69 - samples/sec: 2426.02 - lr: 0.000030 - momentum: 0.000000
2023-10-17 16:24:39,891 epoch 5 - iter 1008/1445 - loss 0.04635845 - time (sec): 50.13 - samples/sec: 2442.79 - lr: 0.000029 - momentum: 0.000000
2023-10-17 16:24:47,269 epoch 5 - iter 1152/1445 - loss 0.04510679 - time (sec): 57.51 - samples/sec: 2459.29 - lr: 0.000029 - momentum: 0.000000
2023-10-17 16:24:54,254 epoch 5 - iter 1296/1445 - loss 0.04387905 - time (sec): 64.49 - samples/sec: 2467.17 - lr: 0.000028 - momentum: 0.000000
2023-10-17 16:25:00,944 epoch 5 - iter 1440/1445 - loss 0.04340737 - time (sec): 71.18 - samples/sec: 2466.49 - lr: 0.000028 - momentum: 0.000000
2023-10-17 16:25:01,206 ----------------------------------------------------------------------------------------------------
2023-10-17 16:25:01,206 EPOCH 5 done: loss 0.0434 - lr: 0.000028
2023-10-17 16:25:04,542 DEV : loss 0.14961402118206024 - f1-score (micro avg) 0.7955
2023-10-17 16:25:04,563 ----------------------------------------------------------------------------------------------------
2023-10-17 16:25:11,913 epoch 6 - iter 144/1445 - loss 0.05786667 - time (sec): 7.35 - samples/sec: 2503.05 - lr: 0.000027 - momentum: 0.000000
2023-10-17 16:25:18,998 epoch 6 - iter 288/1445 - loss 0.03941575 - time (sec): 14.43 - samples/sec: 2463.13 - lr: 0.000027 - momentum: 0.000000
2023-10-17 16:25:25,910 epoch 6 - iter 432/1445 - loss 0.03463529 - time (sec): 21.35 - samples/sec: 2477.92 - lr: 0.000026 - momentum: 0.000000
2023-10-17 16:25:32,830 epoch 6 - iter 576/1445 - loss 0.03145162 - time (sec): 28.27 - samples/sec: 2487.21 - lr: 0.000026 - momentum: 0.000000
2023-10-17 16:25:39,923 epoch 6 - iter 720/1445 - loss 0.02986732 - time (sec): 35.36 - samples/sec: 2501.74 - lr: 0.000025 - momentum: 0.000000
2023-10-17 16:25:46,695 epoch 6 - iter 864/1445 - loss 0.02915652 - time (sec): 42.13 - samples/sec: 2533.84 - lr: 0.000024 - momentum: 0.000000
2023-10-17 16:25:53,428 epoch 6 - iter 1008/1445 - loss 0.02815790 - time (sec): 48.86 - samples/sec: 2551.88 - lr: 0.000024 - momentum: 0.000000
2023-10-17 16:26:00,418 epoch 6 - iter 1152/1445 - loss 0.02863677 - time (sec): 55.85 - samples/sec: 2527.10 - lr: 0.000023 - momentum: 0.000000
2023-10-17 16:26:07,287 epoch 6 - iter 1296/1445 - loss 0.02857277 - time (sec): 62.72 - samples/sec: 2519.48 - lr: 0.000023 - momentum: 0.000000
2023-10-17 16:26:14,266 epoch 6 - iter 1440/1445 - loss 0.02901616 - time (sec): 69.70 - samples/sec: 2518.37 - lr: 0.000022 - momentum: 0.000000
2023-10-17 16:26:14,559 ----------------------------------------------------------------------------------------------------
2023-10-17 16:26:14,559 EPOCH 6 done: loss 0.0289 - lr: 0.000022
2023-10-17 16:26:17,754 DEV : loss 0.15008436143398285 - f1-score (micro avg) 0.8202
2023-10-17 16:26:17,770 ----------------------------------------------------------------------------------------------------
2023-10-17 16:26:24,700 epoch 7 - iter 144/1445 - loss 0.02000359 - time (sec): 6.93 - samples/sec: 2656.70 - lr: 0.000022 - momentum: 0.000000
2023-10-17 16:26:31,753 epoch 7 - iter 288/1445 - loss 0.01569030 - time (sec): 13.98 - samples/sec: 2590.17 - lr: 0.000021 - momentum: 0.000000
2023-10-17 16:26:39,001 epoch 7 - iter 432/1445 - loss 0.01782813 - time (sec): 21.23 - samples/sec: 2505.61 - lr: 0.000021 - momentum: 0.000000
2023-10-17 16:26:46,180 epoch 7 - iter 576/1445 - loss 0.01964390 - time (sec): 28.41 - samples/sec: 2499.50 - lr: 0.000020 - momentum: 0.000000
2023-10-17 16:26:53,381 epoch 7 - iter 720/1445 - loss 0.01835474 - time (sec): 35.61 - samples/sec: 2499.50 - lr: 0.000019 - momentum: 0.000000
2023-10-17 16:27:00,040 epoch 7 - iter 864/1445 - loss 0.01866023 - time (sec): 42.27 - samples/sec: 2508.78 - lr: 0.000019 - momentum: 0.000000
2023-10-17 16:27:07,184 epoch 7 - iter 1008/1445 - loss 0.01877361 - time (sec): 49.41 - samples/sec: 2486.61 - lr: 0.000018 - momentum: 0.000000
2023-10-17 16:27:13,862 epoch 7 - iter 1152/1445 - loss 0.01962775 - time (sec): 56.09 - samples/sec: 2498.38 - lr: 0.000018 - momentum: 0.000000
2023-10-17 16:27:20,488 epoch 7 - iter 1296/1445 - loss 0.01955537 - time (sec): 62.72 - samples/sec: 2507.44 - lr: 0.000017 - momentum: 0.000000
2023-10-17 16:27:27,403 epoch 7 - iter 1440/1445 - loss 0.02047548 - time (sec): 69.63 - samples/sec: 2522.27 - lr: 0.000017 - momentum: 0.000000
2023-10-17 16:27:27,637 ----------------------------------------------------------------------------------------------------
2023-10-17 16:27:27,637 EPOCH 7 done: loss 0.0204 - lr: 0.000017
2023-10-17 16:27:31,016 DEV : loss 0.14934930205345154 - f1-score (micro avg) 0.8585
2023-10-17 16:27:31,032 saving best model
2023-10-17 16:27:31,496 ----------------------------------------------------------------------------------------------------
2023-10-17 16:27:38,491 epoch 8 - iter 144/1445 - loss 0.01993635 - time (sec): 6.99 - samples/sec: 2446.12 - lr: 0.000016 - momentum: 0.000000
2023-10-17 16:27:45,557 epoch 8 - iter 288/1445 - loss 0.01529310 - time (sec): 14.06 - samples/sec: 2434.12 - lr: 0.000016 - momentum: 0.000000
2023-10-17 16:27:52,729 epoch 8 - iter 432/1445 - loss 0.01421649 - time (sec): 21.23 - samples/sec: 2451.36 - lr: 0.000015 - momentum: 0.000000
2023-10-17 16:27:59,839 epoch 8 - iter 576/1445 - loss 0.01275077 - time (sec): 28.34 - samples/sec: 2445.92 - lr: 0.000014 - momentum: 0.000000
2023-10-17 16:28:06,792 epoch 8 - iter 720/1445 - loss 0.01099475 - time (sec): 35.29 - samples/sec: 2421.95 - lr: 0.000014 - momentum: 0.000000
2023-10-17 16:28:14,048 epoch 8 - iter 864/1445 - loss 0.01053576 - time (sec): 42.55 - samples/sec: 2442.96 - lr: 0.000013 - momentum: 0.000000
2023-10-17 16:28:21,295 epoch 8 - iter 1008/1445 - loss 0.01107931 - time (sec): 49.80 - samples/sec: 2455.05 - lr: 0.000013 - momentum: 0.000000
2023-10-17 16:28:28,360 epoch 8 - iter 1152/1445 - loss 0.01295286 - time (sec): 56.86 - samples/sec: 2453.26 - lr: 0.000012 - momentum: 0.000000
2023-10-17 16:28:35,774 epoch 8 - iter 1296/1445 - loss 0.01309587 - time (sec): 64.28 - samples/sec: 2455.43 - lr: 0.000012 - momentum: 0.000000
2023-10-17 16:28:43,046 epoch 8 - iter 1440/1445 - loss 0.01319627 - time (sec): 71.55 - samples/sec: 2454.86 - lr: 0.000011 - momentum: 0.000000
2023-10-17 16:28:43,274 ----------------------------------------------------------------------------------------------------
2023-10-17 16:28:43,274 EPOCH 8 done: loss 0.0134 - lr: 0.000011
2023-10-17 16:28:46,569 DEV : loss 0.16947199404239655 - f1-score (micro avg) 0.8524
2023-10-17 16:28:46,602 ----------------------------------------------------------------------------------------------------
2023-10-17 16:28:53,623 epoch 9 - iter 144/1445 - loss 0.00899269 - time (sec): 7.02 - samples/sec: 2577.61 - lr: 0.000011 - momentum: 0.000000
2023-10-17 16:29:01,060 epoch 9 - iter 288/1445 - loss 0.01406469 - time (sec): 14.46 - samples/sec: 2548.89 - lr: 0.000010 - momentum: 0.000000
2023-10-17 16:29:08,313 epoch 9 - iter 432/1445 - loss 0.01182979 - time (sec): 21.71 - samples/sec: 2487.70 - lr: 0.000009 - momentum: 0.000000
2023-10-17 16:29:15,658 epoch 9 - iter 576/1445 - loss 0.01105186 - time (sec): 29.06 - samples/sec: 2453.24 - lr: 0.000009 - momentum: 0.000000
2023-10-17 16:29:22,816 epoch 9 - iter 720/1445 - loss 0.01186549 - time (sec): 36.21 - samples/sec: 2470.18 - lr: 0.000008 - momentum: 0.000000
2023-10-17 16:29:30,151 epoch 9 - iter 864/1445 - loss 0.01072237 - time (sec): 43.55 - samples/sec: 2460.49 - lr: 0.000008 - momentum: 0.000000
2023-10-17 16:29:37,289 epoch 9 - iter 1008/1445 - loss 0.01098163 - time (sec): 50.69 - samples/sec: 2446.11 - lr: 0.000007 - momentum: 0.000000
2023-10-17 16:29:44,787 epoch 9 - iter 1152/1445 - loss 0.01023991 - time (sec): 58.18 - samples/sec: 2426.10 - lr: 0.000007 - momentum: 0.000000
2023-10-17 16:29:52,556 epoch 9 - iter 1296/1445 - loss 0.01014140 - time (sec): 65.95 - samples/sec: 2401.36 - lr: 0.000006 - momentum: 0.000000
2023-10-17 16:30:00,822 epoch 9 - iter 1440/1445 - loss 0.00966526 - time (sec): 74.22 - samples/sec: 2365.25 - lr: 0.000006 - momentum: 0.000000
2023-10-17 16:30:01,060 ----------------------------------------------------------------------------------------------------
2023-10-17 16:30:01,060 EPOCH 9 done: loss 0.0096 - lr: 0.000006
2023-10-17 16:30:04,778 DEV : loss 0.15933012962341309 - f1-score (micro avg) 0.8648
2023-10-17 16:30:04,794 saving best model
2023-10-17 16:30:05,250 ----------------------------------------------------------------------------------------------------
2023-10-17 16:30:13,087 epoch 10 - iter 144/1445 - loss 0.00391485 - time (sec): 7.83 - samples/sec: 2316.86 - lr: 0.000005 - momentum: 0.000000
2023-10-17 16:30:20,360 epoch 10 - iter 288/1445 - loss 0.00316501 - time (sec): 15.11 - samples/sec: 2309.41 - lr: 0.000004 - momentum: 0.000000
2023-10-17 16:30:27,734 epoch 10 - iter 432/1445 - loss 0.00358503 - time (sec): 22.48 - samples/sec: 2231.35 - lr: 0.000004 - momentum: 0.000000
2023-10-17 16:30:35,048 epoch 10 - iter 576/1445 - loss 0.00564098 - time (sec): 29.80 - samples/sec: 2274.25 - lr: 0.000003 - momentum: 0.000000
2023-10-17 16:30:43,390 epoch 10 - iter 720/1445 - loss 0.00548837 - time (sec): 38.14 - samples/sec: 2249.40 - lr: 0.000003 - momentum: 0.000000
2023-10-17 16:30:51,087 epoch 10 - iter 864/1445 - loss 0.00552389 - time (sec): 45.83 - samples/sec: 2243.93 - lr: 0.000002 - momentum: 0.000000
2023-10-17 16:30:59,119 epoch 10 - iter 1008/1445 - loss 0.00560035 - time (sec): 53.87 - samples/sec: 2241.21 - lr: 0.000002 - momentum: 0.000000
2023-10-17 16:31:07,014 epoch 10 - iter 1152/1445 - loss 0.00546225 - time (sec): 61.76 - samples/sec: 2241.91 - lr: 0.000001 - momentum: 0.000000
2023-10-17 16:31:14,728 epoch 10 - iter 1296/1445 - loss 0.00603445 - time (sec): 69.48 - samples/sec: 2258.14 - lr: 0.000001 - momentum: 0.000000
2023-10-17 16:31:21,998 epoch 10 - iter 1440/1445 - loss 0.00608040 - time (sec): 76.75 - samples/sec: 2289.15 - lr: 0.000000 - momentum: 0.000000
2023-10-17 16:31:22,229 ----------------------------------------------------------------------------------------------------
2023-10-17 16:31:22,230 EPOCH 10 done: loss 0.0061 - lr: 0.000000
2023-10-17 16:31:25,469 DEV : loss 0.15785089135169983 - f1-score (micro avg) 0.8684
2023-10-17 16:31:25,485 saving best model
2023-10-17 16:31:26,269 ----------------------------------------------------------------------------------------------------
2023-10-17 16:31:26,270 Loading model from best epoch ...
2023-10-17 16:31:27,631 SequenceTagger predicts: Dictionary with 13 tags: O, S-LOC, B-LOC, E-LOC, I-LOC, S-PER, B-PER, E-PER, I-PER, S-ORG, B-ORG, E-ORG, I-ORG
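
[Editor's note] The best-model.pt checkpoint saved during training can be reloaded for inference as sketched below; the example sentence is illustrative only.

# Hedged sketch: reload the best checkpoint written during training and tag a sentence.
from flair.data import Sentence
from flair.models import SequenceTagger

tagger = SequenceTagger.load(
    "hmbench-icdar/nl-hmteams/teams-base-historic-multilingual-discriminator-"
    "bs4-wsFalse-e10-lr5e-05-poolingfirst-layers-1-crfFalse-2/best-model.pt"
)

sentence = Sentence("Vincent van Gogh werd geboren in Zundert.")  # illustrative Dutch example
tagger.predict(sentence)
for span in sentence.get_spans("ner"):
    print(span)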
2023-10-17 16:31:30,393
Results:
- F-score (micro) 0.8444
- F-score (macro) 0.7376
- Accuracy 0.7372
By class:
              precision    recall  f1-score   support

         PER     0.8584    0.8299    0.8439       482
         LOC     0.9402    0.8581    0.8973       458
         ORG     0.5370    0.4203    0.4715        69

   micro avg     0.8763    0.8147    0.8444      1009
   macro avg     0.7785    0.7027    0.7376      1009
weighted avg     0.8735    0.8147    0.8426      1009
2023-10-17 16:31:30,393 ----------------------------------------------------------------------------------------------------