
nerui-pt-pl30-2

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0677
  • Location Precision: 0.9
  • Location Recall: 0.9677
  • Location F1: 0.9326
  • Location Number: 93
  • Organization Precision: 0.9162
  • Organization Recall: 0.9217
  • Organization F1: 0.9189
  • Organization Number: 166
  • Person Precision: 0.9787
  • Person Recall: 0.9718
  • Person F1: 0.9753
  • Person Number: 142
  • Overall Precision: 0.9338
  • Overall Recall: 0.9501
  • Overall F1: 0.9419
  • Overall Accuracy: 0.9890
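The scores above follow the usual precision/recall/F1 relationship, with F1 the harmonic mean of precision and recall. As a quick sanity check (plain Python, no dependencies), the reported overall F1 can be recomputed from the reported overall precision and recall; note that because the listed per-entity values are themselves rounded, recomputing a per-entity F1 this way can differ from the listed one in the last digit:

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall; 0.0 when both are zero."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Recompute the overall F1 from the overall precision and recall above
print(round(f1(0.9338, 0.9501), 4))  # 0.9419, matching the reported Overall F1
```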

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
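With a linear scheduler, the learning rate decays from 5e-05 toward zero over training. Assuming no warmup steps (none are listed) and the 96 optimizer steps per epoch visible in the results table below (9,600 steps over 100 epochs), the schedule can be sketched in plain Python:

```python
def linear_lr(step: int, base_lr: float = 5e-05, total_steps: int = 9600) -> float:
    """Linear decay from base_lr to 0 over total_steps (no warmup assumed)."""
    remaining = max(0.0, 1.0 - step / total_steps)
    return base_lr * remaining

print(linear_lr(0))     # 5e-05  (start of training)
print(linear_lr(4800))  # 2.5e-05 (halfway, end of epoch 50)
print(linear_lr(9600))  # 0.0    (end of epoch 100)
```

The 9,600-step total is inferred from the results table, not stated in the card; with warmup or gradient accumulation the schedule would differ.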

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.8532 | 1.0 | 96 | 0.3947 | 0.0 | 0.0 | 0.0 | 93 | 0.24 | 0.2169 | 0.2278 | 166 | 0.2622 | 0.3028 | 0.2810 | 142 | 0.25 | 0.1970 | 0.2204 | 0.8639 |
| 0.359 | 2.0 | 192 | 0.2230 | 0.3176 | 0.5054 | 0.3900 | 93 | 0.5739 | 0.6084 | 0.5906 | 166 | 0.6216 | 0.8099 | 0.7034 | 142 | 0.5167 | 0.6559 | 0.5780 | 0.9342 |
| 0.1925 | 3.0 | 288 | 0.1115 | 0.8133 | 0.6559 | 0.7262 | 93 | 0.6634 | 0.8193 | 0.7332 | 166 | 0.9020 | 0.9718 | 0.9356 | 142 | 0.7737 | 0.8354 | 0.8034 | 0.9613 |
| 0.1309 | 4.0 | 384 | 0.0838 | 0.7143 | 0.9140 | 0.8019 | 93 | 0.7953 | 0.8193 | 0.8071 | 166 | 0.9272 | 0.9859 | 0.9556 | 142 | 0.8186 | 0.9002 | 0.8575 | 0.9720 |
| 0.1058 | 5.0 | 480 | 0.0644 | 0.8804 | 0.8710 | 0.8757 | 93 | 0.8690 | 0.8795 | 0.8743 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9059 | 0.9127 | 0.9093 | 0.9811 |
| 0.0954 | 6.0 | 576 | 0.0571 | 0.7719 | 0.9462 | 0.8502 | 93 | 0.8839 | 0.8253 | 0.8536 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.8832 | 0.9052 | 0.8941 | 0.9819 |
| 0.0835 | 7.0 | 672 | 0.0501 | 0.8350 | 0.9247 | 0.8776 | 93 | 0.8715 | 0.9398 | 0.9043 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.8941 | 0.9476 | 0.9201 | 0.9846 |
| 0.0757 | 8.0 | 768 | 0.0515 | 0.8241 | 0.9570 | 0.8856 | 93 | 0.9030 | 0.8976 | 0.9003 | 166 | 0.9857 | 0.9718 | 0.9787 | 142 | 0.9104 | 0.9377 | 0.9238 | 0.9833 |
| 0.0712 | 9.0 | 864 | 0.0452 | 0.8365 | 0.9355 | 0.8832 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9128 | 0.9401 | 0.9263 | 0.9852 |
| 0.0653 | 10.0 | 960 | 0.0424 | 0.7982 | 0.9355 | 0.8614 | 93 | 0.9130 | 0.8855 | 0.8991 | 166 | 0.9653 | 0.9789 | 0.9720 | 142 | 0.9010 | 0.9302 | 0.9153 | 0.9838 |
| 0.0602 | 11.0 | 1056 | 0.0433 | 0.8365 | 0.9355 | 0.8832 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9586 | 0.9789 | 0.9686 | 142 | 0.9087 | 0.9426 | 0.9253 | 0.9855 |
| 0.0517 | 12.0 | 1152 | 0.0398 | 0.8614 | 0.9355 | 0.8969 | 93 | 0.9006 | 0.9277 | 0.9139 | 166 | 0.9857 | 0.9718 | 0.9787 | 142 | 0.9199 | 0.9451 | 0.9323 | 0.9874 |
| 0.0504 | 13.0 | 1248 | 0.0478 | 0.9348 | 0.9247 | 0.9297 | 93 | 0.8941 | 0.9157 | 0.9048 | 166 | 0.9448 | 0.9648 | 0.9547 | 142 | 0.9214 | 0.9352 | 0.9282 | 0.9855 |
| 0.0475 | 14.0 | 1344 | 0.0485 | 0.8558 | 0.9570 | 0.9036 | 93 | 0.9497 | 0.9096 | 0.9292 | 166 | 0.9586 | 0.9789 | 0.9686 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9868 |
| 0.0467 | 15.0 | 1440 | 0.0451 | 0.8411 | 0.9677 | 0.9000 | 93 | 0.9241 | 0.8795 | 0.9012 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9191 | 0.9352 | 0.9271 | 0.9857 |
| 0.044 | 16.0 | 1536 | 0.0426 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9146 | 0.9036 | 0.9091 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9261 | 0.9377 | 0.9318 | 0.9874 |
| 0.0409 | 17.0 | 1632 | 0.0428 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9152 | 0.9096 | 0.9124 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9244 | 0.9451 | 0.9346 | 0.9885 |
| 0.037 | 18.0 | 1728 | 0.0448 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9367 | 0.8916 | 0.9136 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9377 | 0.9377 | 0.9377 | 0.9879 |
| 0.0357 | 19.0 | 1824 | 0.0451 | 0.9474 | 0.9677 | 0.9574 | 93 | 0.9387 | 0.9217 | 0.9301 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9549 | 0.9501 | 0.9525 | 0.9896 |
| 0.032 | 20.0 | 1920 | 0.0417 | 0.88 | 0.9462 | 0.9119 | 93 | 0.9222 | 0.9277 | 0.9249 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9315 | 0.9501 | 0.9407 | 0.9888 |
| 0.0326 | 21.0 | 2016 | 0.0439 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9059 | 0.9277 | 0.9167 | 166 | 0.9857 | 0.9718 | 0.9787 | 142 | 0.9314 | 0.9476 | 0.9394 | 0.9885 |
| 0.0348 | 22.0 | 2112 | 0.0414 | 0.8922 | 0.9785 | 0.9333 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9293 | 0.9501 | 0.9396 | 0.9885 |
| 0.0288 | 23.0 | 2208 | 0.0401 | 0.9192 | 0.9785 | 0.9479 | 93 | 0.9123 | 0.9398 | 0.9258 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9369 | 0.9626 | 0.9496 | 0.9901 |
| 0.0317 | 24.0 | 2304 | 0.0423 | 0.9271 | 0.9570 | 0.9418 | 93 | 0.9277 | 0.9277 | 0.9277 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9432 | 0.9526 | 0.9479 | 0.9896 |
| 0.0296 | 25.0 | 2400 | 0.0435 | 0.9286 | 0.9785 | 0.9529 | 93 | 0.9390 | 0.9277 | 0.9333 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9480 | 0.9551 | 0.9516 | 0.9901 |
| 0.0269 | 26.0 | 2496 | 0.0430 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9387 | 0.9217 | 0.9301 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9454 | 0.9501 | 0.9478 | 0.9896 |
| 0.0248 | 27.0 | 2592 | 0.0428 | 0.9278 | 0.9677 | 0.9474 | 93 | 0.9222 | 0.9277 | 0.9249 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9432 | 0.9526 | 0.9479 | 0.9898 |
| 0.0263 | 28.0 | 2688 | 0.0489 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9123 | 0.9398 | 0.9258 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9294 | 0.9526 | 0.9409 | 0.9888 |
| 0.0245 | 29.0 | 2784 | 0.0465 | 0.9010 | 0.9785 | 0.9381 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9888 |
| 0.0217 | 30.0 | 2880 | 0.0465 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9172 | 0.9337 | 0.9254 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9296 | 0.9551 | 0.9422 | 0.9882 |
| 0.0219 | 31.0 | 2976 | 0.0479 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9363 | 0.9526 | 0.9444 | 0.9893 |
| 0.0209 | 32.0 | 3072 | 0.0500 | 0.9010 | 0.9785 | 0.9381 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9384 | 0.9501 | 0.9442 | 0.9882 |
| 0.0206 | 33.0 | 3168 | 0.0465 | 0.9286 | 0.9785 | 0.9529 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9433 | 0.9551 | 0.9492 | 0.9893 |
| 0.0202 | 34.0 | 3264 | 0.0447 | 0.9278 | 0.9677 | 0.9474 | 93 | 0.9448 | 0.9277 | 0.9362 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9502 | 0.9526 | 0.9514 | 0.9901 |
| 0.0187 | 35.0 | 3360 | 0.0518 | 0.9451 | 0.9247 | 0.9348 | 93 | 0.9029 | 0.9518 | 0.9267 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9885 |
| 0.019 | 36.0 | 3456 | 0.0499 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9360 | 0.9476 | 0.9418 | 0.9888 |
| 0.0197 | 37.0 | 3552 | 0.0472 | 0.9560 | 0.9355 | 0.9457 | 93 | 0.9349 | 0.9518 | 0.9433 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9504 | 0.9551 | 0.9527 | 0.9896 |
| 0.0169 | 38.0 | 3648 | 0.0564 | 0.8725 | 0.9570 | 0.9128 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9246 | 0.9476 | 0.9360 | 0.9860 |
| 0.0176 | 39.0 | 3744 | 0.0530 | 0.8824 | 0.9677 | 0.9231 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9289 | 0.9451 | 0.9370 | 0.9877 |
| 0.0184 | 40.0 | 3840 | 0.0491 | 0.9570 | 0.9570 | 0.9570 | 93 | 0.9176 | 0.9398 | 0.9286 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9433 | 0.9551 | 0.9492 | 0.9882 |
| 0.0169 | 41.0 | 3936 | 0.0543 | 0.9462 | 0.9462 | 0.9462 | 93 | 0.9176 | 0.9398 | 0.9286 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9409 | 0.9526 | 0.9467 | 0.9879 |
| 0.0152 | 42.0 | 4032 | 0.0538 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9394 | 0.9337 | 0.9366 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9458 | 0.9576 | 0.9517 | 0.9888 |
| 0.015 | 43.0 | 4128 | 0.0531 | 0.9271 | 0.9570 | 0.9418 | 93 | 0.9341 | 0.9398 | 0.9369 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9458 | 0.9576 | 0.9517 | 0.9898 |
| 0.0154 | 44.0 | 4224 | 0.0561 | 0.9175 | 0.9570 | 0.9368 | 93 | 0.9226 | 0.9337 | 0.9281 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9363 | 0.9526 | 0.9444 | 0.9879 |
| 0.0149 | 45.0 | 4320 | 0.0562 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9381 | 0.9451 | 0.9416 | 0.9879 |
| 0.0158 | 46.0 | 4416 | 0.0483 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9277 | 0.9277 | 0.9277 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9433 | 0.9551 | 0.9492 | 0.9898 |
| 0.0131 | 47.0 | 4512 | 0.0485 | 0.9271 | 0.9570 | 0.9418 | 93 | 0.9112 | 0.9277 | 0.9194 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9361 | 0.9501 | 0.9431 | 0.9893 |
| 0.0149 | 48.0 | 4608 | 0.0595 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9255 | 0.8976 | 0.9113 | 166 | 0.9857 | 0.9718 | 0.9787 | 142 | 0.9377 | 0.9377 | 0.9377 | 0.9877 |
| 0.0147 | 49.0 | 4704 | 0.0587 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9193 | 0.8916 | 0.9052 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9261 | 0.9377 | 0.9318 | 0.9863 |
| 0.0128 | 50.0 | 4800 | 0.0571 | 0.9278 | 0.9677 | 0.9474 | 93 | 0.9112 | 0.9277 | 0.9194 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9386 | 0.9526 | 0.9455 | 0.9888 |
| 0.0107 | 51.0 | 4896 | 0.0608 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9877 |
| 0.0127 | 52.0 | 4992 | 0.0599 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9381 | 0.9451 | 0.9416 | 0.9885 |
| 0.012 | 53.0 | 5088 | 0.0573 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9360 | 0.9476 | 0.9418 | 0.9885 |
| 0.0114 | 54.0 | 5184 | 0.0572 | 0.9271 | 0.9570 | 0.9418 | 93 | 0.9277 | 0.9277 | 0.9277 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9431 | 0.9501 | 0.9466 | 0.9893 |
| 0.0115 | 55.0 | 5280 | 0.0574 | 0.91 | 0.9785 | 0.9430 | 93 | 0.9264 | 0.9096 | 0.9179 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9406 | 0.9476 | 0.9441 | 0.9890 |
| 0.0106 | 56.0 | 5376 | 0.0600 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9277 | 0.9277 | 0.9277 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9340 | 0.9526 | 0.9432 | 0.9885 |
| 0.0111 | 57.0 | 5472 | 0.0596 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9291 | 0.9476 | 0.9383 | 0.9879 |
| 0.0117 | 58.0 | 5568 | 0.0610 | 0.9010 | 0.9785 | 0.9381 | 93 | 0.9329 | 0.9217 | 0.9273 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9386 | 0.9526 | 0.9455 | 0.9882 |
| 0.0091 | 59.0 | 5664 | 0.0595 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9361 | 0.9501 | 0.9431 | 0.9888 |
| 0.0095 | 60.0 | 5760 | 0.0567 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9226 | 0.9337 | 0.9281 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9364 | 0.9551 | 0.9457 | 0.9890 |
| 0.011 | 61.0 | 5856 | 0.0590 | 0.9462 | 0.9462 | 0.9462 | 93 | 0.9070 | 0.9398 | 0.9231 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9386 | 0.9526 | 0.9455 | 0.9888 |
| 0.0109 | 62.0 | 5952 | 0.0650 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9202 | 0.9036 | 0.9119 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9309 | 0.9401 | 0.9355 | 0.9866 |
| 0.0101 | 63.0 | 6048 | 0.0630 | 0.91 | 0.9785 | 0.9430 | 93 | 0.9222 | 0.9277 | 0.9249 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9364 | 0.9551 | 0.9457 | 0.9885 |
| 0.0098 | 64.0 | 6144 | 0.0605 | 0.9167 | 0.9462 | 0.9312 | 93 | 0.9172 | 0.9337 | 0.9254 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9384 | 0.9501 | 0.9442 | 0.9882 |
| 0.0093 | 65.0 | 6240 | 0.0651 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9102 | 0.9157 | 0.9129 | 166 | 0.9583 | 0.9718 | 0.9650 | 142 | 0.9220 | 0.9426 | 0.9322 | 0.9866 |
| 0.0089 | 66.0 | 6336 | 0.0651 | 0.8911 | 0.9677 | 0.9278 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9291 | 0.9476 | 0.9383 | 0.9879 |
| 0.0081 | 67.0 | 6432 | 0.0660 | 0.8738 | 0.9677 | 0.9184 | 93 | 0.9259 | 0.9036 | 0.9146 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9877 |
| 0.0093 | 68.0 | 6528 | 0.0646 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9268 | 0.9157 | 0.9212 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9383 | 0.9476 | 0.9429 | 0.9888 |
| 0.0079 | 69.0 | 6624 | 0.0610 | 0.9368 | 0.9570 | 0.9468 | 93 | 0.9222 | 0.9277 | 0.9249 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9431 | 0.9501 | 0.9466 | 0.9896 |
| 0.0079 | 70.0 | 6720 | 0.0638 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9157 | 0.9157 | 0.9157 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9383 | 0.9476 | 0.9429 | 0.9890 |
| 0.0077 | 71.0 | 6816 | 0.0633 | 0.9167 | 0.9462 | 0.9312 | 93 | 0.9226 | 0.9337 | 0.9281 | 166 | 0.9720 | 0.9789 | 0.9754 | 142 | 0.9386 | 0.9526 | 0.9455 | 0.9890 |
| 0.0082 | 72.0 | 6912 | 0.0667 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9338 | 0.9501 | 0.9419 | 0.9888 |
| 0.0086 | 73.0 | 7008 | 0.0638 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9360 | 0.9476 | 0.9418 | 0.9896 |
| 0.0084 | 74.0 | 7104 | 0.0692 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9335 | 0.9451 | 0.9393 | 0.9874 |
| 0.0078 | 75.0 | 7200 | 0.0677 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9277 | 0.9277 | 0.9277 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9338 | 0.9501 | 0.9419 | 0.9882 |
| 0.0082 | 76.0 | 7296 | 0.0631 | 0.9175 | 0.9570 | 0.9368 | 93 | 0.9333 | 0.9277 | 0.9305 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9455 | 0.9526 | 0.9491 | 0.9898 |
| 0.007 | 77.0 | 7392 | 0.0711 | 0.8980 | 0.9462 | 0.9215 | 93 | 0.9255 | 0.8976 | 0.9113 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9375 | 0.9352 | 0.9363 | 0.9868 |
| 0.008 | 78.0 | 7488 | 0.0645 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9451 | 0.9337 | 0.9394 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9505 | 0.9576 | 0.9540 | 0.9907 |
| 0.0066 | 79.0 | 7584 | 0.0623 | 0.9082 | 0.9570 | 0.9319 | 93 | 0.9333 | 0.9277 | 0.9305 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9455 | 0.9526 | 0.9491 | 0.9890 |
| 0.0069 | 80.0 | 7680 | 0.0697 | 0.8812 | 0.9570 | 0.9175 | 93 | 0.9375 | 0.9036 | 0.9202 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9403 | 0.9426 | 0.9415 | 0.9877 |
| 0.0058 | 81.0 | 7776 | 0.0668 | 0.8889 | 0.9462 | 0.9167 | 93 | 0.9273 | 0.9217 | 0.9245 | 166 | 0.9650 | 0.9718 | 0.9684 | 142 | 0.9312 | 0.9451 | 0.9381 | 0.9879 |
| 0.0069 | 82.0 | 7872 | 0.0683 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9325 | 0.9157 | 0.9240 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9383 | 0.9476 | 0.9429 | 0.9882 |
| 0.0068 | 83.0 | 7968 | 0.0706 | 0.8990 | 0.9570 | 0.9271 | 93 | 0.9212 | 0.9157 | 0.9184 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9360 | 0.9476 | 0.9418 | 0.9877 |
| 0.0065 | 84.0 | 8064 | 0.0690 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9333 | 0.9277 | 0.9305 | 166 | 0.9858 | 0.9789 | 0.9823 | 142 | 0.9433 | 0.9551 | 0.9492 | 0.9885 |
| 0.0066 | 85.0 | 8160 | 0.0673 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9277 | 0.9277 | 0.9277 | 166 | 0.9789 | 0.9789 | 0.9789 | 142 | 0.9387 | 0.9551 | 0.9468 | 0.9879 |
| 0.0063 | 86.0 | 8256 | 0.0659 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9361 | 0.9501 | 0.9431 | 0.9890 |
| 0.0064 | 87.0 | 8352 | 0.0661 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9273 | 0.9217 | 0.9245 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9431 | 0.9501 | 0.9466 | 0.9896 |
| 0.0056 | 88.0 | 8448 | 0.0643 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9222 | 0.9277 | 0.9249 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9409 | 0.9526 | 0.9467 | 0.9898 |
| 0.0065 | 89.0 | 8544 | 0.0647 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9338 | 0.9501 | 0.9419 | 0.9893 |
| 0.0062 | 90.0 | 8640 | 0.0644 | 0.9184 | 0.9677 | 0.9424 | 93 | 0.9273 | 0.9217 | 0.9245 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9431 | 0.9501 | 0.9466 | 0.9896 |
| 0.0058 | 91.0 | 8736 | 0.0660 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9338 | 0.9501 | 0.9419 | 0.9885 |
| 0.0051 | 92.0 | 8832 | 0.0647 | 0.9192 | 0.9785 | 0.9479 | 93 | 0.9277 | 0.9277 | 0.9277 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9410 | 0.9551 | 0.9480 | 0.9888 |
| 0.0048 | 93.0 | 8928 | 0.0672 | 0.89 | 0.9570 | 0.9223 | 93 | 0.9107 | 0.9217 | 0.9162 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9268 | 0.9476 | 0.9371 | 0.9882 |
| 0.0058 | 94.0 | 9024 | 0.0668 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9361 | 0.9501 | 0.9431 | 0.9890 |
| 0.0067 | 95.0 | 9120 | 0.0673 | 0.9091 | 0.9677 | 0.9375 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9718 | 0.9718 | 0.9718 | 142 | 0.9338 | 0.9501 | 0.9419 | 0.9885 |
| 0.0051 | 96.0 | 9216 | 0.0672 | 0.9278 | 0.9677 | 0.9474 | 93 | 0.9217 | 0.9217 | 0.9217 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9431 | 0.9501 | 0.9466 | 0.9890 |
| 0.005 | 97.0 | 9312 | 0.0677 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9338 | 0.9501 | 0.9419 | 0.9890 |
| 0.0062 | 98.0 | 9408 | 0.0679 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9338 | 0.9501 | 0.9419 | 0.9890 |
| 0.0053 | 99.0 | 9504 | 0.0678 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9338 | 0.9501 | 0.9419 | 0.9890 |
| 0.0046 | 100.0 | 9600 | 0.0677 | 0.9 | 0.9677 | 0.9326 | 93 | 0.9162 | 0.9217 | 0.9189 | 166 | 0.9787 | 0.9718 | 0.9753 | 142 | 0.9338 | 0.9501 | 0.9419 | 0.9890 |
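Validation loss bottoms out at 0.0398 around epoch 12 and drifts upward afterwards, ending at 0.0677 at epoch 100; the card does not say whether best-checkpoint selection (e.g. `load_best_model_at_end`) was used. A minimal sketch of picking the best epoch from such a history, using a small hand-transcribed subset of the rows above:

```python
# A few (epoch, validation_loss) pairs transcribed from the training results table
history = [
    (1, 0.3947),
    (5, 0.0644),
    (12, 0.0398),
    (50, 0.0571),
    (100, 0.0677),
]

# Best checkpoint = the epoch with the lowest validation loss
best_epoch, best_loss = min(history, key=lambda row: row[1])
print(best_epoch, best_loss)  # 12 0.0398
```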

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2

Model tree for apwic/nerui-pt-pl30-2

Finetuned from indolem/indobert-base-uncased.