
nerui-pt-pl30-1

This model is a fine-tuned version of indolem/indobert-base-uncased for named-entity recognition (location, organization, and person entities) on an unspecified dataset. It achieves the following entity-level results on the evaluation set (a note on how such scores are computed follows the list):

  • Loss: 0.0474
  • Location Precision: 0.9407
  • Location Recall: 0.9569
  • Location F1: 0.9487
  • Location Number: 116
  • Organization Precision: 0.9675
  • Organization Recall: 0.9430
  • Organization F1: 0.9551
  • Organization Number: 158
  • Person Precision: 0.9685
  • Person Recall: 0.9919
  • Person F1: 0.9801
  • Person Number: 124
  • Overall Precision: 0.9599
  • Overall Recall: 0.9623
  • Overall F1: 0.9611
  • Overall Accuracy: 0.9912
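
These are entity-level (span-based) scores, with "Number" giving the count of gold entities of each type in the evaluation set; the overall F1 of 0.9611 is the harmonic mean of the overall precision (0.9599) and recall (0.9623). A minimal sketch of how scores of this kind are computed with seqeval follows (an assumption -- the exact evaluation script behind this card is not stated):

```python
# Minimal sketch: entity-level precision/recall/F1 over IOB-tagged sequences,
# as computed by seqeval (assumed evaluation library; not confirmed by the card).
from seqeval.metrics import precision_score, recall_score, f1_score

y_true = [["B-PER", "I-PER", "O", "B-LOC"]]  # gold: one PER span, one LOC span
y_pred = [["B-PER", "I-PER", "O", "B-ORG"]]  # predicted: PER correct, LOC mislabeled as ORG

print(precision_score(y_true, y_pred))  # 0.5 -- 1 of 2 predicted entities is exactly right
print(recall_score(y_true, y_pred))     # 0.5 -- 1 of 2 gold entities is recovered
print(f1_score(y_true, y_pred))         # 0.5 -- harmonic mean of precision and recall
```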

Model description

More information needed

Intended uses & limitations

More information needed
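
A minimal inference sketch, assuming the checkpoint is a standard token-classification model hosted as apwic/nerui-pt-pl30-1 and that its labels correspond to the person, organization, and location entities reported above:

```python
# Minimal inference sketch (assumptions: standard token-classification head,
# hosted as apwic/nerui-pt-pl30-1, Indonesian NER labels for PER/ORG/LOC).
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="apwic/nerui-pt-pl30-1",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entity spans
)

# Example Indonesian sentence; the output is a list of dicts with entity_group,
# score, word, and character offsets. Exact label names depend on the model config.
print(ner("Joko Widodo mengunjungi Universitas Indonesia di Depok."))
```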

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
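
A sketch of the corresponding TrainingArguments, assuming the standard Hugging Face Trainer was used (model, tokenizer, and dataset setup omitted):

```python
# Hyperparameters above expressed as TrainingArguments (assumption: the run used
# the standard Hugging Face Trainer; everything not listed is left at its default).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="nerui-pt-pl30-1",  # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
)
```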

Training results

Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy
0.8685 1.0 96 0.4002 0.0 0.0 0.0 116 0.2589 0.1835 0.2148 158 0.3131 0.25 0.2780 124 0.2817 0.1508 0.1964 0.8600
0.3663 2.0 192 0.2149 0.3 0.3879 0.3383 116 0.5629 0.5380 0.5502 158 0.6084 0.8145 0.6966 124 0.4946 0.5804 0.5341 0.9286
0.2004 3.0 288 0.1089 0.8108 0.7759 0.7930 116 0.7033 0.8101 0.7529 158 0.8971 0.9839 0.9385 124 0.7925 0.8543 0.8222 0.9673
0.1345 4.0 384 0.0808 0.7698 0.9224 0.8392 116 0.8121 0.8481 0.8297 158 0.9462 0.9919 0.9685 124 0.8387 0.9146 0.8750 0.9731
0.1084 5.0 480 0.0553 0.8938 0.8707 0.8821 116 0.8938 0.9051 0.8994 158 0.9685 0.9919 0.9801 124 0.9175 0.9221 0.9198 0.9835
0.0959 6.0 576 0.0483 0.8385 0.9397 0.8862 116 0.8834 0.9114 0.8972 158 0.9762 0.9919 0.9840 124 0.8974 0.9447 0.9204 0.9855
0.0827 7.0 672 0.0428 0.9455 0.8966 0.9204 116 0.8523 0.9494 0.8982 158 0.9839 0.9839 0.9839 124 0.9171 0.9447 0.9307 0.9871
0.08 8.0 768 0.0381 0.9145 0.9224 0.9185 116 0.9136 0.9367 0.9250 158 0.976 0.9839 0.9799 124 0.9332 0.9472 0.9401 0.9896
0.0729 9.0 864 0.0486 0.872 0.9397 0.9046 116 0.9141 0.9430 0.9283 158 0.9762 0.9919 0.9840 124 0.9203 0.9573 0.9384 0.9844
0.0635 10.0 960 0.0435 0.8425 0.9224 0.8807 116 0.9136 0.9367 0.9250 158 0.9685 0.9919 0.9801 124 0.9087 0.9497 0.9287 0.9857
0.0614 11.0 1056 0.0421 0.8594 0.9483 0.9016 116 0.94 0.8924 0.9156 158 0.9762 0.9919 0.9840 124 0.9257 0.9397 0.9327 0.9865
0.0561 12.0 1152 0.0318 0.8852 0.9310 0.9076 116 0.9416 0.9177 0.9295 158 0.9762 0.9919 0.9840 124 0.9353 0.9447 0.94 0.9882
0.0546 13.0 1248 0.0305 0.9322 0.9483 0.9402 116 0.9371 0.9430 0.9401 158 0.9685 0.9919 0.9801 124 0.9455 0.9598 0.9526 0.9915
0.0475 14.0 1344 0.0348 0.9231 0.9310 0.9270 116 0.9484 0.9304 0.9393 158 0.976 0.9839 0.9799 124 0.9496 0.9472 0.9484 0.9896
0.0464 15.0 1440 0.0338 0.925 0.9569 0.9407 116 0.9423 0.9304 0.9363 158 0.9762 0.9919 0.9840 124 0.9478 0.9573 0.9525 0.9904
0.0428 16.0 1536 0.0317 0.9167 0.9483 0.9322 116 0.9430 0.9430 0.9430 158 0.976 0.9839 0.9799 124 0.9454 0.9573 0.9513 0.9907
0.0417 17.0 1632 0.0322 0.925 0.9569 0.9407 116 0.9419 0.9241 0.9329 158 0.9762 0.9919 0.9840 124 0.9476 0.9548 0.9512 0.9912
0.0388 18.0 1728 0.0332 0.9322 0.9483 0.9402 116 0.9551 0.9430 0.9490 158 0.9839 0.9839 0.9839 124 0.9573 0.9573 0.9573 0.9909
0.0366 19.0 1824 0.0317 0.9068 0.9224 0.9145 116 0.925 0.9367 0.9308 158 0.9762 0.9919 0.9840 124 0.9356 0.9497 0.9426 0.9909
0.0373 20.0 1920 0.0352 0.9083 0.9397 0.9237 116 0.9423 0.9304 0.9363 158 0.9762 0.9919 0.9840 124 0.9428 0.9523 0.9475 0.9893
0.0363 21.0 2016 0.0348 0.9106 0.9655 0.9372 116 0.9608 0.9304 0.9453 158 0.9762 0.9919 0.9840 124 0.9502 0.9598 0.9550 0.9901
0.0333 22.0 2112 0.0326 0.9402 0.9483 0.9442 116 0.9675 0.9430 0.9551 158 0.9762 0.9919 0.9840 124 0.9622 0.9598 0.9610 0.9920
0.0334 23.0 2208 0.0332 0.9244 0.9483 0.9362 116 0.9603 0.9177 0.9385 158 0.9839 0.9839 0.9839 124 0.9569 0.9472 0.9520 0.9907
0.0311 24.0 2304 0.0336 0.9316 0.9397 0.9356 116 0.9542 0.9241 0.9389 158 0.9762 0.9919 0.9840 124 0.9545 0.9497 0.9521 0.9912
0.0291 25.0 2400 0.0459 0.8682 0.9655 0.9143 116 0.9608 0.9304 0.9453 158 0.9685 0.9919 0.9801 124 0.9340 0.9598 0.9467 0.9887
0.0309 26.0 2496 0.0323 0.9407 0.9569 0.9487 116 0.9610 0.9367 0.9487 158 0.9839 0.9839 0.9839 124 0.9621 0.9573 0.9597 0.9923
0.03 27.0 2592 0.0337 0.9316 0.9397 0.9356 116 0.9539 0.9177 0.9355 158 0.9762 0.9919 0.9840 124 0.9544 0.9472 0.9508 0.9898
0.0251 28.0 2688 0.0360 0.9083 0.9397 0.9237 116 0.9367 0.9367 0.9367 158 0.976 0.9839 0.9799 124 0.9404 0.9523 0.9463 0.9898
0.0228 29.0 2784 0.0436 0.9083 0.9397 0.9237 116 0.9430 0.9430 0.9430 158 0.9762 0.9919 0.9840 124 0.9431 0.9573 0.9501 0.9890
0.0248 30.0 2880 0.0401 0.9138 0.9138 0.9138 116 0.9313 0.9430 0.9371 158 0.9762 0.9919 0.9840 124 0.9403 0.9497 0.9450 0.9890
0.0238 31.0 2976 0.0381 0.9412 0.9655 0.9532 116 0.9669 0.9241 0.9450 158 0.976 0.9839 0.9799 124 0.9620 0.9548 0.9584 0.9909
0.0213 32.0 3072 0.0407 0.9167 0.9483 0.9322 116 0.9419 0.9241 0.9329 158 0.976 0.9839 0.9799 124 0.945 0.9497 0.9474 0.9901
0.0222 33.0 3168 0.0342 0.9402 0.9483 0.9442 116 0.9539 0.9177 0.9355 158 0.9762 0.9919 0.9840 124 0.9570 0.9497 0.9533 0.9915
0.022 34.0 3264 0.0417 0.9083 0.9397 0.9237 116 0.9430 0.9430 0.9430 158 0.976 0.9839 0.9799 124 0.9429 0.9548 0.9488 0.9898
0.0217 35.0 3360 0.0404 0.9153 0.9310 0.9231 116 0.9490 0.9430 0.9460 158 0.9683 0.9839 0.976 124 0.9451 0.9523 0.9487 0.9909
0.0188 36.0 3456 0.0390 0.9474 0.9310 0.9391 116 0.9363 0.9304 0.9333 158 0.9609 0.9919 0.9762 124 0.9474 0.9497 0.9486 0.9909
0.0212 37.0 3552 0.0352 0.9643 0.9310 0.9474 116 0.9497 0.9557 0.9527 158 0.976 0.9839 0.9799 124 0.9621 0.9573 0.9597 0.9923
0.0183 38.0 3648 0.0390 0.9478 0.9397 0.9437 116 0.9434 0.9494 0.9464 158 0.976 0.9839 0.9799 124 0.9549 0.9573 0.9561 0.9915
0.0184 39.0 3744 0.0332 0.9483 0.9483 0.9483 116 0.9613 0.9430 0.9521 158 0.976 0.9839 0.9799 124 0.9621 0.9573 0.9597 0.9926
0.0176 40.0 3840 0.0409 0.9167 0.9483 0.9322 116 0.9608 0.9304 0.9453 158 0.9606 0.9839 0.9721 124 0.9475 0.9523 0.9499 0.9901
0.0182 41.0 3936 0.0388 0.9322 0.9483 0.9402 116 0.9554 0.9494 0.9524 158 0.9685 0.9919 0.9801 124 0.9527 0.9623 0.9575 0.9923
0.0175 42.0 4032 0.0399 0.9098 0.9569 0.9328 116 0.9613 0.9430 0.9521 158 0.9685 0.9919 0.9801 124 0.9480 0.9623 0.9551 0.9912
0.0169 43.0 4128 0.0381 0.9478 0.9397 0.9437 116 0.9245 0.9304 0.9274 158 0.9839 0.9839 0.9839 124 0.9497 0.9497 0.9497 0.9907
0.0166 44.0 4224 0.0399 0.9174 0.9569 0.9367 116 0.9613 0.9430 0.9521 158 0.9531 0.9839 0.9683 124 0.9455 0.9598 0.9526 0.9907
0.015 45.0 4320 0.0362 0.9333 0.9655 0.9492 116 0.9608 0.9304 0.9453 158 0.976 0.9839 0.9799 124 0.9573 0.9573 0.9573 0.9923
0.0163 46.0 4416 0.0411 0.9008 0.9397 0.9198 116 0.9548 0.9367 0.9457 158 0.9685 0.9919 0.9801 124 0.9429 0.9548 0.9488 0.9901
0.0163 47.0 4512 0.0393 0.9167 0.9483 0.9322 116 0.9608 0.9304 0.9453 158 0.9762 0.9919 0.9840 124 0.9524 0.9548 0.9536 0.9915
0.0136 48.0 4608 0.0423 0.9316 0.9397 0.9356 116 0.9548 0.9367 0.9457 158 0.9762 0.9919 0.9840 124 0.9548 0.9548 0.9548 0.9915
0.0147 49.0 4704 0.0378 0.9397 0.9397 0.9397 116 0.9551 0.9430 0.9490 158 0.9839 0.9839 0.9839 124 0.9596 0.9548 0.9572 0.9920
0.0126 50.0 4800 0.0433 0.9316 0.9397 0.9356 116 0.9551 0.9430 0.9490 158 0.9683 0.9839 0.976 124 0.9524 0.9548 0.9536 0.9909
0.0146 51.0 4896 0.0448 0.9244 0.9483 0.9362 116 0.9608 0.9304 0.9453 158 0.9685 0.9919 0.9801 124 0.9524 0.9548 0.9536 0.9904
0.0146 52.0 4992 0.0469 0.9174 0.9569 0.9367 116 0.9669 0.9241 0.9450 158 0.9609 0.9919 0.9762 124 0.95 0.9548 0.9524 0.9904
0.0119 53.0 5088 0.0466 0.9160 0.9397 0.9277 116 0.9487 0.9367 0.9427 158 0.9609 0.9919 0.9762 124 0.9429 0.9548 0.9488 0.9898
0.0128 54.0 5184 0.0531 0.9167 0.9483 0.9322 116 0.9548 0.9367 0.9457 158 0.9609 0.9919 0.9762 124 0.9454 0.9573 0.9513 0.9893
0.013 55.0 5280 0.0437 0.9402 0.9483 0.9442 116 0.9481 0.9241 0.9359 158 0.9609 0.9919 0.9762 124 0.9499 0.9523 0.9511 0.9909
0.0125 56.0 5376 0.0464 0.9237 0.9397 0.9316 116 0.9542 0.9241 0.9389 158 0.9609 0.9919 0.9762 124 0.9474 0.9497 0.9486 0.9896
0.012 57.0 5472 0.0429 0.9397 0.9397 0.9397 116 0.9363 0.9304 0.9333 158 0.9685 0.9919 0.9801 124 0.9475 0.9523 0.9499 0.9904
0.012 58.0 5568 0.0482 0.9237 0.9397 0.9316 116 0.9484 0.9304 0.9393 158 0.9685 0.9919 0.9801 124 0.9475 0.9523 0.9499 0.9893
0.0123 59.0 5664 0.0561 0.9167 0.9483 0.9322 116 0.9416 0.9177 0.9295 158 0.9609 0.9919 0.9762 124 0.9403 0.9497 0.9450 0.9887
0.0117 60.0 5760 0.0506 0.9322 0.9483 0.9402 116 0.9545 0.9304 0.9423 158 0.9762 0.9919 0.9840 124 0.9548 0.9548 0.9548 0.9898
0.0115 61.0 5856 0.0482 0.9322 0.9483 0.9402 116 0.9545 0.9304 0.9423 158 0.9762 0.9919 0.9840 124 0.9548 0.9548 0.9548 0.9898
0.0094 62.0 5952 0.0504 0.9322 0.9483 0.9402 116 0.9484 0.9304 0.9393 158 0.984 0.9919 0.9880 124 0.9548 0.9548 0.9548 0.9901
0.0101 63.0 6048 0.0478 0.9322 0.9483 0.9402 116 0.9484 0.9304 0.9393 158 0.984 0.9919 0.9880 124 0.9548 0.9548 0.9548 0.9904
0.0096 64.0 6144 0.0485 0.9322 0.9483 0.9402 116 0.9608 0.9304 0.9453 158 0.984 0.9919 0.9880 124 0.9596 0.9548 0.9572 0.9915
0.0095 65.0 6240 0.0578 0.9244 0.9483 0.9362 116 0.9545 0.9304 0.9423 158 0.9609 0.9919 0.9762 124 0.9476 0.9548 0.9512 0.9896
0.0113 66.0 6336 0.0549 0.925 0.9569 0.9407 116 0.9605 0.9241 0.9419 158 0.9685 0.9919 0.9801 124 0.9524 0.9548 0.9536 0.9904
0.0088 67.0 6432 0.0499 0.9322 0.9483 0.9402 116 0.9484 0.9304 0.9393 158 0.9609 0.9919 0.9762 124 0.9476 0.9548 0.9512 0.9901
0.0094 68.0 6528 0.0439 0.9402 0.9483 0.9442 116 0.9548 0.9367 0.9457 158 0.9762 0.9919 0.9840 124 0.9573 0.9573 0.9573 0.9912
0.008 69.0 6624 0.0484 0.9402 0.9483 0.9442 116 0.9613 0.9430 0.9521 158 0.9762 0.9919 0.9840 124 0.9598 0.9598 0.9598 0.9907
0.0092 70.0 6720 0.0447 0.9322 0.9483 0.9402 116 0.9484 0.9304 0.9393 158 0.9685 0.9919 0.9801 124 0.95 0.9548 0.9524 0.9907
0.008 71.0 6816 0.0524 0.9244 0.9483 0.9362 116 0.9481 0.9241 0.9359 158 0.9762 0.9919 0.9840 124 0.9499 0.9523 0.9511 0.9898
0.0084 72.0 6912 0.0443 0.9322 0.9483 0.9402 116 0.9548 0.9367 0.9457 158 0.9762 0.9919 0.9840 124 0.9549 0.9573 0.9561 0.9909
0.0085 73.0 7008 0.0468 0.9402 0.9483 0.9442 116 0.9551 0.9430 0.9490 158 0.9685 0.9919 0.9801 124 0.955 0.9598 0.9574 0.9909
0.0081 74.0 7104 0.0473 0.9322 0.9483 0.9402 116 0.9545 0.9304 0.9423 158 0.9762 0.9919 0.9840 124 0.9548 0.9548 0.9548 0.9912
0.0087 75.0 7200 0.0448 0.9328 0.9569 0.9447 116 0.9542 0.9241 0.9389 158 0.9685 0.9919 0.9801 124 0.9524 0.9548 0.9536 0.9912
0.0076 76.0 7296 0.0498 0.9244 0.9483 0.9362 116 0.9487 0.9367 0.9427 158 0.9685 0.9919 0.9801 124 0.9478 0.9573 0.9525 0.9904
0.0074 77.0 7392 0.0477 0.9322 0.9483 0.9402 116 0.9484 0.9304 0.9393 158 0.9685 0.9919 0.9801 124 0.95 0.9548 0.9524 0.9907
0.008 78.0 7488 0.0429 0.9407 0.9569 0.9487 116 0.9613 0.9430 0.9521 158 0.984 0.9919 0.9880 124 0.9623 0.9623 0.9623 0.9923
0.0072 79.0 7584 0.0491 0.9244 0.9483 0.9362 116 0.9545 0.9304 0.9423 158 0.9609 0.9919 0.9762 124 0.9476 0.9548 0.9512 0.9896
0.0066 80.0 7680 0.0501 0.925 0.9569 0.9407 116 0.9673 0.9367 0.9518 158 0.9685 0.9919 0.9801 124 0.955 0.9598 0.9574 0.9912
0.0068 81.0 7776 0.0472 0.9322 0.9483 0.9402 116 0.9548 0.9367 0.9457 158 0.9685 0.9919 0.9801 124 0.9525 0.9573 0.9549 0.9907
0.0066 82.0 7872 0.0534 0.9322 0.9483 0.9402 116 0.9613 0.9430 0.9521 158 0.9609 0.9919 0.9762 124 0.9526 0.9598 0.9562 0.9901
0.0069 83.0 7968 0.0481 0.9322 0.9483 0.9402 116 0.9548 0.9367 0.9457 158 0.9762 0.9919 0.9840 124 0.9549 0.9573 0.9561 0.9909
0.0065 84.0 8064 0.0521 0.9322 0.9483 0.9402 116 0.9613 0.9430 0.9521 158 0.9685 0.9919 0.9801 124 0.955 0.9598 0.9574 0.9904
0.0063 85.0 8160 0.0548 0.9322 0.9483 0.9402 116 0.9613 0.9430 0.9521 158 0.9685 0.9919 0.9801 124 0.955 0.9598 0.9574 0.9904
0.0064 86.0 8256 0.0527 0.9322 0.9483 0.9402 116 0.9613 0.9430 0.9521 158 0.9685 0.9919 0.9801 124 0.955 0.9598 0.9574 0.9904
0.007 87.0 8352 0.0479 0.9402 0.9483 0.9442 116 0.9551 0.9430 0.9490 158 0.9685 0.9919 0.9801 124 0.955 0.9598 0.9574 0.9909
0.0062 88.0 8448 0.0478 0.9402 0.9483 0.9442 116 0.9675 0.9430 0.9551 158 0.9762 0.9919 0.9840 124 0.9622 0.9598 0.9610 0.9918
0.0057 89.0 8544 0.0476 0.9402 0.9483 0.9442 116 0.9613 0.9430 0.9521 158 0.9762 0.9919 0.9840 124 0.9598 0.9598 0.9598 0.9909
0.0058 90.0 8640 0.0475 0.9402 0.9483 0.9442 116 0.9610 0.9367 0.9487 158 0.9762 0.9919 0.9840 124 0.9597 0.9573 0.9585 0.9912
0.0066 91.0 8736 0.0508 0.9402 0.9483 0.9442 116 0.9613 0.9430 0.9521 158 0.9685 0.9919 0.9801 124 0.9574 0.9598 0.9586 0.9907
0.0061 92.0 8832 0.0507 0.9328 0.9569 0.9447 116 0.9675 0.9430 0.9551 158 0.9685 0.9919 0.9801 124 0.9575 0.9623 0.9599 0.9907
0.0065 93.0 8928 0.0476 0.9407 0.9569 0.9487 116 0.9739 0.9430 0.9582 158 0.9685 0.9919 0.9801 124 0.9623 0.9623 0.9623 0.9920
0.0065 94.0 9024 0.0497 0.9402 0.9483 0.9442 116 0.9613 0.9430 0.9521 158 0.9685 0.9919 0.9801 124 0.9574 0.9598 0.9586 0.9907
0.007 95.0 9120 0.0472 0.9402 0.9483 0.9442 116 0.9613 0.9430 0.9521 158 0.9685 0.9919 0.9801 124 0.9574 0.9598 0.9586 0.9912
0.0076 96.0 9216 0.0495 0.9407 0.9569 0.9487 116 0.9675 0.9430 0.9551 158 0.9685 0.9919 0.9801 124 0.9599 0.9623 0.9611 0.9909
0.0057 97.0 9312 0.0483 0.9407 0.9569 0.9487 116 0.9675 0.9430 0.9551 158 0.9685 0.9919 0.9801 124 0.9599 0.9623 0.9611 0.9912
0.0058 98.0 9408 0.0471 0.9402 0.9483 0.9442 116 0.9613 0.9430 0.9521 158 0.9685 0.9919 0.9801 124 0.9574 0.9598 0.9586 0.9909
0.0063 99.0 9504 0.0477 0.9407 0.9569 0.9487 116 0.9675 0.9430 0.9551 158 0.9685 0.9919 0.9801 124 0.9599 0.9623 0.9611 0.9912
0.0055 100.0 9600 0.0474 0.9407 0.9569 0.9487 116 0.9675 0.9430 0.9551 158 0.9685 0.9919 0.9801 124 0.9599 0.9623 0.9611 0.9912

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2