
nerui-pt-pl30-3

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0655
  • Location Precision: 0.8830
  • Location Recall: 0.9651
  • Location F1: 0.9222
  • Location Number: 86
  • Organization Precision: 0.9368
  • Organization Recall: 0.9157
  • Organization F1: 0.9261
  • Organization Number: 178
  • Person Precision: 0.9688
  • Person Recall: 0.9688
  • Person F1: 0.9688
  • Person Number: 128
  • Overall Precision: 0.9343
  • Overall Recall: 0.9439
  • Overall F1: 0.9391
  • Overall Accuracy: 0.9870
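The per-type scores above follow the usual entity-level convention (seqeval-style, which is an assumption here): F1 is the harmonic mean of precision and recall, and the overall recall is the support-weighted (micro) average across the three entity types. A minimal sketch that reproduces the reported numbers:

```python
def f1(p, r):
    """Harmonic mean of precision and recall."""
    return 2 * p * r / (p + r)

# Harmonic mean reproduces the reported overall F1
print(round(f1(0.9343, 0.9439), 4))  # 0.9391

# Micro-averaged recall: weight each type's recall by its support (entity count)
supports = {"Location": 86, "Organization": 178, "Person": 128}
recalls = {"Location": 0.9651, "Organization": 0.9157, "Person": 0.9688}
correct = sum(supports[t] * recalls[t] for t in supports)
print(round(correct / sum(supports.values()), 4))  # 0.9439
```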

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
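With `lr_scheduler_type: linear` and no warmup listed, the learning rate presumably decays linearly from 5e-05 to 0 over training, as in Transformers' `get_linear_schedule_with_warmup`. The total of 9600 steps is inferred from the results table below (96 steps per epoch × 100 epochs); the zero warmup is an assumption. A sketch:

```python
def linear_lr(step, base_lr=5e-05, total_steps=9600, warmup_steps=0):
    """Linear warmup (here assumed zero) followed by linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

print(linear_lr(0))     # 5e-05 (start of training)
print(linear_lr(4800))  # 2.5e-05 (halfway)
print(linear_lr(9600))  # 0.0 (end of training)
```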

Training results

Training Loss Epoch Step Validation Loss Location Precision Location Recall Location F1 Location Number Organization Precision Organization Recall Organization F1 Organization Number Person Precision Person Recall Person F1 Person Number Overall Precision Overall Recall Overall F1 Overall Accuracy
0.8467 1.0 96 0.3800 0.0 0.0 0.0 86 0.2437 0.2191 0.2308 178 0.3116 0.3359 0.3233 128 0.2733 0.2092 0.2370 0.8699
0.3716 2.0 192 0.2423 0.3929 0.3837 0.3882 86 0.5917 0.5618 0.5764 178 0.6456 0.7969 0.7133 128 0.5718 0.5995 0.5853 0.9309
0.2184 3.0 288 0.1218 0.7412 0.7326 0.7368 86 0.7225 0.7753 0.7480 178 0.9130 0.9844 0.9474 128 0.7899 0.8342 0.8114 0.9649
0.1455 4.0 384 0.0960 0.6822 0.8488 0.7565 86 0.7910 0.7865 0.7887 178 0.9259 0.9766 0.9506 128 0.8067 0.8622 0.8335 0.9714
0.1147 5.0 480 0.0768 0.8 0.8837 0.8398 86 0.8641 0.8933 0.8785 178 0.9398 0.9766 0.9579 128 0.8738 0.9184 0.8955 0.9762
0.1009 6.0 576 0.0727 0.6696 0.8721 0.7576 86 0.9 0.8090 0.8521 178 0.9615 0.9766 0.9690 128 0.8557 0.8776 0.8665 0.9776
0.0893 7.0 672 0.0597 0.8041 0.9070 0.8525 86 0.8703 0.9045 0.8871 178 0.9688 0.9688 0.9688 128 0.8854 0.9260 0.9052 0.9814
0.082 8.0 768 0.0496 0.8556 0.8953 0.875 86 0.8703 0.9045 0.8871 178 0.9609 0.9609 0.9609 128 0.8958 0.9209 0.9082 0.9835
0.0775 9.0 864 0.0508 0.7822 0.9186 0.8449 86 0.9059 0.8652 0.8851 178 0.9766 0.9766 0.9766 128 0.8972 0.9133 0.9052 0.9811
0.0696 10.0 960 0.0515 0.7714 0.9419 0.8482 86 0.9128 0.8820 0.8971 178 0.9843 0.9766 0.9804 128 0.8985 0.9260 0.9121 0.9819
0.0635 11.0 1056 0.0434 0.8495 0.9186 0.8827 86 0.9282 0.9438 0.9359 178 0.9688 0.9688 0.9688 128 0.9229 0.9464 0.9345 0.9854
0.0604 12.0 1152 0.0430 0.8901 0.9419 0.9153 86 0.8978 0.9382 0.9176 178 0.9843 0.9766 0.9804 128 0.9233 0.9515 0.9372 0.9870
0.0557 13.0 1248 0.0416 0.8977 0.9186 0.9080 86 0.8978 0.9382 0.9176 178 0.9688 0.9688 0.9688 128 0.9204 0.9439 0.9320 0.9862
0.051 14.0 1344 0.0443 0.9080 0.9186 0.9133 86 0.9157 0.9157 0.9157 178 0.9843 0.9766 0.9804 128 0.9362 0.9362 0.9362 0.9862
0.0492 15.0 1440 0.0424 0.8778 0.9186 0.8977 86 0.9153 0.9101 0.9127 178 0.9764 0.9688 0.9725 128 0.9264 0.9311 0.9288 0.9868
0.0463 16.0 1536 0.0439 0.9011 0.9535 0.9266 86 0.9180 0.9438 0.9307 178 0.9843 0.9766 0.9804 128 0.9352 0.9566 0.9458 0.9870
0.0417 17.0 1632 0.0444 0.9 0.9419 0.9205 86 0.9235 0.9494 0.9363 178 0.9690 0.9766 0.9728 128 0.9328 0.9566 0.9446 0.9879
0.0407 18.0 1728 0.0415 0.9294 0.9186 0.9240 86 0.9333 0.9438 0.9385 178 0.9843 0.9766 0.9804 128 0.9490 0.9490 0.9490 0.9884
0.0413 19.0 1824 0.0429 0.9080 0.9186 0.9133 86 0.9444 0.9551 0.9497 178 0.9843 0.9766 0.9804 128 0.9492 0.9541 0.9517 0.9879
0.0368 20.0 1920 0.0558 0.8632 0.9535 0.9061 86 0.9162 0.9213 0.9188 178 0.9690 0.9766 0.9728 128 0.9206 0.9464 0.9333 0.9838
0.0361 21.0 2016 0.0473 0.9 0.9419 0.9205 86 0.9270 0.9270 0.9270 178 0.9764 0.9688 0.9725 128 0.9367 0.9439 0.9403 0.9868
0.0355 22.0 2112 0.0557 0.8247 0.9302 0.8743 86 0.9022 0.9326 0.9171 178 0.9609 0.9609 0.9609 128 0.9022 0.9413 0.9213 0.9835
0.0314 23.0 2208 0.0475 0.8901 0.9419 0.9153 86 0.9385 0.9438 0.9412 178 0.9615 0.9766 0.9690 128 0.935 0.9541 0.9444 0.9868
0.0308 24.0 2304 0.0556 0.8384 0.9651 0.8973 86 0.9412 0.8989 0.9195 178 0.9612 0.9688 0.9650 128 0.9221 0.9362 0.9291 0.9852
0.0303 25.0 2400 0.0505 0.9213 0.9535 0.9371 86 0.9371 0.9213 0.9292 178 0.9688 0.9688 0.9688 128 0.9439 0.9439 0.9439 0.9876
0.0288 26.0 2496 0.0486 0.9111 0.9535 0.9318 86 0.9157 0.9157 0.9157 178 0.9766 0.9766 0.9766 128 0.9343 0.9439 0.9391 0.9860
0.0276 27.0 2592 0.0479 0.8817 0.9535 0.9162 86 0.9176 0.9382 0.9278 178 0.9615 0.9766 0.9690 128 0.9235 0.9541 0.9385 0.9879
0.0264 28.0 2688 0.0468 0.9524 0.9302 0.9412 86 0.8984 0.9438 0.9205 178 0.9535 0.9609 0.9572 128 0.9275 0.9464 0.9369 0.9873
0.0267 29.0 2784 0.0553 0.8632 0.9535 0.9061 86 0.9209 0.9157 0.9183 178 0.9766 0.9766 0.9766 128 0.925 0.9439 0.9343 0.9849
0.0258 30.0 2880 0.0486 0.8989 0.9302 0.9143 86 0.9140 0.9551 0.9341 178 0.9766 0.9766 0.9766 128 0.9305 0.9566 0.9434 0.9868
0.0252 31.0 2976 0.0507 0.9 0.9419 0.9205 86 0.9368 0.9157 0.9261 178 0.9615 0.9766 0.9690 128 0.9365 0.9413 0.9389 0.9860
0.0248 32.0 3072 0.0498 0.9111 0.9535 0.9318 86 0.9101 0.9101 0.9101 178 0.9612 0.9688 0.9650 128 0.9270 0.9388 0.9328 0.9865
0.0233 33.0 3168 0.0516 0.9318 0.9535 0.9425 86 0.9326 0.9326 0.9326 178 0.9766 0.9766 0.9766 128 0.9467 0.9515 0.9491 0.9881
0.0243 34.0 3264 0.0541 0.8632 0.9535 0.9061 86 0.9382 0.9382 0.9382 178 0.9843 0.9766 0.9804 128 0.935 0.9541 0.9444 0.9870
0.0199 35.0 3360 0.0515 0.9111 0.9535 0.9318 86 0.9218 0.9270 0.9244 178 0.9766 0.9766 0.9766 128 0.9370 0.9490 0.9430 0.9870
0.0217 36.0 3456 0.0696 0.8454 0.9535 0.8962 86 0.9235 0.8820 0.9023 178 0.9685 0.9609 0.9647 128 0.9188 0.9235 0.9211 0.9830
0.0215 37.0 3552 0.0597 0.8889 0.9302 0.9091 86 0.9056 0.9157 0.9106 178 0.9766 0.9766 0.9766 128 0.9246 0.9388 0.9316 0.9857
0.0194 38.0 3648 0.0528 0.8723 0.9535 0.9111 86 0.9209 0.9157 0.9183 178 0.9764 0.9688 0.9725 128 0.9271 0.9413 0.9342 0.9857
0.0182 39.0 3744 0.0516 0.8913 0.9535 0.9213 86 0.9261 0.9157 0.9209 178 0.9690 0.9766 0.9728 128 0.9320 0.9439 0.9379 0.9868
0.0187 40.0 3840 0.0568 0.8925 0.9651 0.9274 86 0.9368 0.9157 0.9261 178 0.9766 0.9766 0.9766 128 0.9392 0.9464 0.9428 0.9865
0.0184 41.0 3936 0.0632 0.8710 0.9419 0.9050 86 0.9298 0.8933 0.9112 178 0.9766 0.9766 0.9766 128 0.9311 0.9311 0.9311 0.9843
0.0175 42.0 4032 0.0560 0.8817 0.9535 0.9162 86 0.9153 0.9101 0.9127 178 0.9615 0.9766 0.9690 128 0.9225 0.9413 0.9318 0.9852
0.0177 43.0 4128 0.0562 0.9 0.9419 0.9205 86 0.9157 0.9157 0.9157 178 0.9690 0.9766 0.9728 128 0.9295 0.9413 0.9354 0.9860
0.0181 44.0 4224 0.0505 0.9111 0.9535 0.9318 86 0.9157 0.9157 0.9157 178 0.9688 0.9688 0.9688 128 0.9318 0.9413 0.9365 0.9860
0.0172 45.0 4320 0.0636 0.8723 0.9535 0.9111 86 0.9191 0.8933 0.9060 178 0.9535 0.9609 0.9572 128 0.9192 0.9286 0.9239 0.9854
0.0166 46.0 4416 0.0647 0.8737 0.9651 0.9171 86 0.9186 0.8876 0.9029 178 0.9766 0.9766 0.9766 128 0.9266 0.9337 0.9301 0.9860
0.017 47.0 4512 0.0551 0.8901 0.9419 0.9153 86 0.9162 0.9213 0.9188 178 0.9766 0.9766 0.9766 128 0.9296 0.9439 0.9367 0.9868
0.0147 48.0 4608 0.0572 0.8723 0.9535 0.9111 86 0.9148 0.9045 0.9096 178 0.9766 0.9766 0.9766 128 0.9246 0.9388 0.9316 0.9868
0.0138 49.0 4704 0.0527 0.8817 0.9535 0.9162 86 0.9360 0.9045 0.9200 178 0.9766 0.9766 0.9766 128 0.9364 0.9388 0.9376 0.9873
0.0154 50.0 4800 0.0613 0.9419 0.9419 0.9419 86 0.9180 0.9438 0.9307 178 0.9766 0.9766 0.9766 128 0.9421 0.9541 0.9480 0.9865
0.0151 51.0 4896 0.0615 0.8817 0.9535 0.9162 86 0.9422 0.9157 0.9288 178 0.9766 0.9766 0.9766 128 0.9391 0.9439 0.9415 0.9870
0.0144 52.0 4992 0.0548 0.8913 0.9535 0.9213 86 0.9371 0.9213 0.9292 178 0.9688 0.9688 0.9688 128 0.9367 0.9439 0.9403 0.9873
0.0145 53.0 5088 0.0669 0.9080 0.9186 0.9133 86 0.9270 0.9270 0.9270 178 0.9766 0.9766 0.9766 128 0.9389 0.9413 0.9401 0.9860
0.0132 54.0 5184 0.0701 0.9091 0.9302 0.9195 86 0.9106 0.9157 0.9132 178 0.9688 0.9688 0.9688 128 0.9291 0.9362 0.9327 0.9852
0.0133 55.0 5280 0.0593 0.9 0.9419 0.9205 86 0.9419 0.9101 0.9257 178 0.9766 0.9766 0.9766 128 0.9436 0.9388 0.9412 0.9865
0.0126 56.0 5376 0.0588 0.8817 0.9535 0.9162 86 0.9318 0.9213 0.9266 178 0.9766 0.9766 0.9766 128 0.9345 0.9464 0.9404 0.9870
0.013 57.0 5472 0.0589 0.8737 0.9651 0.9171 86 0.9371 0.9213 0.9292 178 0.9766 0.9766 0.9766 128 0.9347 0.9490 0.9418 0.9865
0.0126 58.0 5568 0.0615 0.8913 0.9535 0.9213 86 0.9425 0.9213 0.9318 178 0.9766 0.9766 0.9766 128 0.9416 0.9464 0.9440 0.9870
0.0113 59.0 5664 0.0596 0.8925 0.9651 0.9274 86 0.9368 0.9157 0.9261 178 0.9688 0.9688 0.9688 128 0.9367 0.9439 0.9403 0.9870
0.0125 60.0 5760 0.0596 0.8925 0.9651 0.9274 86 0.9483 0.9270 0.9375 178 0.9688 0.9688 0.9688 128 0.9418 0.9490 0.9454 0.9876
0.0117 61.0 5856 0.0644 0.9101 0.9419 0.9257 86 0.9375 0.9270 0.9322 178 0.9688 0.9688 0.9688 128 0.9415 0.9439 0.9427 0.9870
0.0113 62.0 5952 0.0621 0.8817 0.9535 0.9162 86 0.9205 0.9101 0.9153 178 0.9688 0.9688 0.9688 128 0.9270 0.9388 0.9328 0.9860
0.0115 63.0 6048 0.0634 0.8817 0.9535 0.9162 86 0.9310 0.9101 0.9205 178 0.9766 0.9766 0.9766 128 0.9342 0.9413 0.9377 0.9865
0.0105 64.0 6144 0.0688 0.8817 0.9535 0.9162 86 0.9257 0.9101 0.9178 178 0.9766 0.9766 0.9766 128 0.9318 0.9413 0.9365 0.9862
0.0115 65.0 6240 0.0591 0.8913 0.9535 0.9213 86 0.9218 0.9270 0.9244 178 0.9766 0.9766 0.9766 128 0.9323 0.9490 0.9406 0.9873
0.0081 66.0 6336 0.0631 0.8913 0.9535 0.9213 86 0.9425 0.9213 0.9318 178 0.9688 0.9688 0.9688 128 0.9391 0.9439 0.9415 0.9873
0.0091 67.0 6432 0.0679 0.8817 0.9535 0.9162 86 0.9310 0.9101 0.9205 178 0.9766 0.9766 0.9766 128 0.9342 0.9413 0.9377 0.9862
0.0096 68.0 6528 0.0684 0.9011 0.9535 0.9266 86 0.9261 0.9157 0.9209 178 0.9688 0.9688 0.9688 128 0.9342 0.9413 0.9377 0.9860
0.0098 69.0 6624 0.0686 0.8913 0.9535 0.9213 86 0.9314 0.9157 0.9235 178 0.9766 0.9766 0.9766 128 0.9367 0.9439 0.9403 0.9862
0.0081 70.0 6720 0.0659 0.8817 0.9535 0.9162 86 0.9310 0.9101 0.9205 178 0.9766 0.9766 0.9766 128 0.9342 0.9413 0.9377 0.9868
0.0091 71.0 6816 0.0677 0.8913 0.9535 0.9213 86 0.9486 0.9326 0.9405 178 0.9766 0.9766 0.9766 128 0.9443 0.9515 0.9479 0.9873
0.0087 72.0 6912 0.0659 0.8913 0.9535 0.9213 86 0.9474 0.9101 0.9284 178 0.9766 0.9766 0.9766 128 0.9437 0.9413 0.9425 0.9870
0.009 73.0 7008 0.0659 0.8913 0.9535 0.9213 86 0.9364 0.9101 0.9231 178 0.9766 0.9766 0.9766 128 0.9389 0.9413 0.9401 0.9862
0.0097 74.0 7104 0.0631 0.8913 0.9535 0.9213 86 0.9419 0.9101 0.9257 178 0.9766 0.9766 0.9766 128 0.9413 0.9413 0.9413 0.9868
0.0096 75.0 7200 0.0627 0.8901 0.9419 0.9153 86 0.9257 0.9101 0.9178 178 0.9766 0.9766 0.9766 128 0.9340 0.9388 0.9364 0.9862
0.0085 76.0 7296 0.0648 0.8913 0.9535 0.9213 86 0.9310 0.9101 0.9205 178 0.9766 0.9766 0.9766 128 0.9365 0.9413 0.9389 0.9868
0.0081 77.0 7392 0.0659 0.8913 0.9535 0.9213 86 0.9360 0.9045 0.9200 178 0.9766 0.9766 0.9766 128 0.9388 0.9388 0.9388 0.9862
0.0089 78.0 7488 0.0651 0.8913 0.9535 0.9213 86 0.9583 0.9045 0.9306 178 0.9766 0.9766 0.9766 128 0.9485 0.9388 0.9436 0.9879
0.0075 79.0 7584 0.0643 0.8913 0.9535 0.9213 86 0.9310 0.9101 0.9205 178 0.9766 0.9766 0.9766 128 0.9365 0.9413 0.9389 0.9865
0.009 80.0 7680 0.0645 0.9011 0.9535 0.9266 86 0.9375 0.9270 0.9322 178 0.9766 0.9766 0.9766 128 0.9418 0.9490 0.9454 0.9876
0.0103 81.0 7776 0.0637 0.8817 0.9535 0.9162 86 0.9360 0.9045 0.9200 178 0.9766 0.9766 0.9766 128 0.9364 0.9388 0.9376 0.9862
0.0085 82.0 7872 0.0638 0.8817 0.9535 0.9162 86 0.9474 0.9101 0.9284 178 0.9766 0.9766 0.9766 128 0.9413 0.9413 0.9413 0.9868
0.0081 83.0 7968 0.0637 0.8817 0.9535 0.9162 86 0.9419 0.9101 0.9257 178 0.9766 0.9766 0.9766 128 0.9389 0.9413 0.9401 0.9865
0.0073 84.0 8064 0.0631 0.8830 0.9651 0.9222 86 0.9368 0.9157 0.9261 178 0.9688 0.9688 0.9688 128 0.9343 0.9439 0.9391 0.9865
0.0067 85.0 8160 0.0622 0.9032 0.9767 0.9385 86 0.9371 0.9213 0.9292 178 0.9688 0.9688 0.9688 128 0.9394 0.9490 0.9442 0.9870
0.0072 86.0 8256 0.0688 0.8830 0.9651 0.9222 86 0.9364 0.9101 0.9231 178 0.9766 0.9766 0.9766 128 0.9367 0.9439 0.9403 0.9865
0.0071 87.0 8352 0.0658 0.8925 0.9651 0.9274 86 0.9310 0.9101 0.9205 178 0.9688 0.9688 0.9688 128 0.9342 0.9413 0.9377 0.9865
0.0083 88.0 8448 0.0687 0.8913 0.9535 0.9213 86 0.9209 0.9157 0.9183 178 0.9688 0.9688 0.9688 128 0.9295 0.9413 0.9354 0.9865
0.008 89.0 8544 0.0678 0.8936 0.9767 0.9333 86 0.9529 0.9101 0.9310 178 0.9688 0.9688 0.9688 128 0.9439 0.9439 0.9439 0.9873
0.0058 90.0 8640 0.0681 0.8817 0.9535 0.9162 86 0.92 0.9045 0.9122 178 0.9688 0.9688 0.9688 128 0.9268 0.9362 0.9315 0.9860
0.0076 91.0 8736 0.0648 0.9022 0.9651 0.9326 86 0.9266 0.9213 0.9239 178 0.9688 0.9688 0.9688 128 0.9345 0.9464 0.9404 0.9870
0.0068 92.0 8832 0.0657 0.8913 0.9535 0.9213 86 0.9209 0.9157 0.9183 178 0.9688 0.9688 0.9688 128 0.9295 0.9413 0.9354 0.9865
0.0067 93.0 8928 0.0659 0.8913 0.9535 0.9213 86 0.9209 0.9157 0.9183 178 0.9688 0.9688 0.9688 128 0.9295 0.9413 0.9354 0.9865
0.0084 94.0 9024 0.0652 0.8936 0.9767 0.9333 86 0.9368 0.9157 0.9261 178 0.9688 0.9688 0.9688 128 0.9369 0.9464 0.9416 0.9873
0.0061 95.0 9120 0.0647 0.8925 0.9651 0.9274 86 0.9261 0.9157 0.9209 178 0.9688 0.9688 0.9688 128 0.9320 0.9439 0.9379 0.9868
0.0063 96.0 9216 0.0642 0.8913 0.9535 0.9213 86 0.9157 0.9157 0.9157 178 0.9688 0.9688 0.9688 128 0.9271 0.9413 0.9342 0.9868
0.0067 97.0 9312 0.0645 0.8925 0.9651 0.9274 86 0.9371 0.9213 0.9292 178 0.9688 0.9688 0.9688 128 0.9369 0.9464 0.9416 0.9873
0.0072 98.0 9408 0.0660 0.8830 0.9651 0.9222 86 0.9368 0.9157 0.9261 178 0.9688 0.9688 0.9688 128 0.9343 0.9439 0.9391 0.9870
0.0069 99.0 9504 0.0655 0.8830 0.9651 0.9222 86 0.9368 0.9157 0.9261 178 0.9688 0.9688 0.9688 128 0.9343 0.9439 0.9391 0.9870
0.0068 100.0 9600 0.0655 0.8830 0.9651 0.9222 86 0.9368 0.9157 0.9261 178 0.9688 0.9688 0.9688 128 0.9343 0.9439 0.9391 0.9870
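Note that the reported evaluation results come from the final epoch, while the lowest validation loss in the table is 0.0415 at epoch 18. If early stopping or best-checkpoint selection were desired, picking the minimum-loss epoch is a one-liner (the excerpted pairs below are copied from the table above):

```python
# (epoch, validation loss) pairs excerpted from the training-results table
history = [(13, 0.0416), (18, 0.0415), (19, 0.0429), (50, 0.0613), (100, 0.0655)]

# Select the checkpoint with the lowest validation loss
best_epoch, best_loss = min(history, key=lambda t: t[1])
print(best_epoch, best_loss)  # 18 0.0415
```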

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2

Model tree for apwic/nerui-pt-pl30-3

Finetuned from indolem/indobert-base-uncased.