
nerui-lora-r16-0

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set (an inference sketch follows the metric list):

  • Loss: 0.0440
  • Location Precision: 0.8318
  • Location Recall: 0.9468
  • Location F1: 0.8856
  • Location Number: 94
  • Organization Precision: 0.8827
  • Organization Recall: 0.8563
  • Organization F1: 0.8693
  • Organization Number: 167
  • Person Precision: 1.0
  • Person Recall: 0.9854
  • Person F1: 0.9926
  • Person Number: 137
  • Overall Precision: 0.9084
  • Overall Recall: 0.9221
  • Overall F1: 0.9152
  • Overall Accuracy: 0.9845
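Since this is a LoRA adapter on top of a token-classification head, it can be loaded with PEFT. The sketch below is a minimal, unverified example: the adapter id `apwic/nerui-lora-r16-0` is taken from this card, but `num_labels=7` (O plus B-/I- tags for LOC, ORG, PER) and the example sentence are assumptions; read the label set from the adapter's config if it differs.

```python
# Hedged sketch: load the IndoBERT base model, attach the LoRA adapter,
# and tag one Indonesian sentence. num_labels is an assumption, not
# verified against the adapter's config.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification
from peft import PeftModel

BASE_ID = "indolem/indobert-base-uncased"
ADAPTER_ID = "apwic/nerui-lora-r16-0"  # adapter repo for this card

tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
# 7 labels = O + B-/I- for LOC, ORG, PER (assumption).
base = AutoModelForTokenClassification.from_pretrained(BASE_ID, num_labels=7)
model = PeftModel.from_pretrained(base, ADAPTER_ID)
model.eval()

text = "Joko Widodo mengunjungi kantor Pertamina di Jakarta."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, num_labels)

for token, label_id in zip(
    tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]),
    logits.argmax(dim=-1)[0].tolist(),
):
    print(token, base.config.id2label[label_id])
```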

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a code sketch mapping them onto TrainingArguments follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
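As a rough guide, the list above maps onto `transformers.TrainingArguments` as sketched below. The `LoraConfig` rank `r=16` is inferred only from the `r16` in the model name, and the per-epoch evaluation cadence is inferred from the results table; treat both as assumptions.

```python
# Hedged sketch: the listed hyperparameters as TrainingArguments, plus a
# LoraConfig whose rank is inferred from the model name ("lora-r16").
# Data loading, the Trainer, and the metric function are omitted.
from transformers import TrainingArguments
from peft import LoraConfig

lora_config = LoraConfig(task_type="TOKEN_CLS", r=16)  # rank is an assumption

training_args = TrainingArguments(
    output_dir="nerui-lora-r16-0",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch",          # Adam betas=(0.9, 0.999), eps=1e-8 are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
    evaluation_strategy="epoch",  # inferred from the per-epoch results table below
)
```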

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.0607 | 1.0 | 96 | 0.6834 | 0.0 | 0.0 | 0.0 | 94 | 0.0 | 0.0 | 0.0 | 167 | 0.0 | 0.0 | 0.0 | 137 | 0.0 | 0.0 | 0.0 | 0.8343 |
| 0.6397 | 2.0 | 192 | 0.5323 | 0.0 | 0.0 | 0.0 | 94 | 0.6667 | 0.0240 | 0.0462 | 167 | 0.2 | 0.0146 | 0.0272 | 137 | 0.375 | 0.0151 | 0.0290 | 0.8367 |
| 0.4975 | 3.0 | 288 | 0.3811 | 0.16 | 0.0426 | 0.0672 | 94 | 0.2903 | 0.1617 | 0.2077 | 167 | 0.2606 | 0.2701 | 0.2652 | 137 | 0.2615 | 0.1709 | 0.2067 | 0.8663 |
| 0.3542 | 4.0 | 384 | 0.2727 | 0.3231 | 0.2234 | 0.2642 | 94 | 0.4834 | 0.6108 | 0.5397 | 167 | 0.4343 | 0.6277 | 0.5134 | 137 | 0.4409 | 0.5251 | 0.4794 | 0.9141 |
| 0.2514 | 5.0 | 480 | 0.1973 | 0.5393 | 0.5106 | 0.5246 | 94 | 0.6049 | 0.7425 | 0.6667 | 167 | 0.7532 | 0.8686 | 0.8068 | 137 | 0.6438 | 0.7312 | 0.6847 | 0.9434 |
| 0.2019 | 6.0 | 576 | 0.1491 | 0.6915 | 0.6915 | 0.6915 | 94 | 0.7228 | 0.7964 | 0.7578 | 167 | 0.9161 | 0.9562 | 0.9357 | 137 | 0.7815 | 0.8266 | 0.8034 | 0.9602 |
| 0.1643 | 7.0 | 672 | 0.1244 | 0.7170 | 0.8085 | 0.76 | 94 | 0.7308 | 0.7964 | 0.7622 | 167 | 0.9301 | 0.9708 | 0.9500 | 137 | 0.7935 | 0.8593 | 0.8251 | 0.9644 |
| 0.1449 | 8.0 | 768 | 0.1025 | 0.7475 | 0.7872 | 0.7668 | 94 | 0.7697 | 0.8204 | 0.7942 | 167 | 0.9496 | 0.9635 | 0.9565 | 137 | 0.8245 | 0.8618 | 0.8428 | 0.9693 |
| 0.1318 | 9.0 | 864 | 0.0919 | 0.8163 | 0.8511 | 0.8333 | 94 | 0.7838 | 0.8683 | 0.8239 | 167 | 0.95 | 0.9708 | 0.9603 | 137 | 0.8463 | 0.8995 | 0.8721 | 0.9721 |
| 0.1184 | 10.0 | 960 | 0.0846 | 0.8 | 0.8936 | 0.8442 | 94 | 0.8246 | 0.8443 | 0.8343 | 167 | 0.9504 | 0.9781 | 0.9640 | 137 | 0.8609 | 0.9020 | 0.8810 | 0.9751 |
| 0.11 | 11.0 | 1056 | 0.0744 | 0.8454 | 0.8723 | 0.8586 | 94 | 0.8324 | 0.8623 | 0.8471 | 167 | 0.9571 | 0.9781 | 0.9675 | 137 | 0.8780 | 0.9045 | 0.8911 | 0.9773 |
| 0.103 | 12.0 | 1152 | 0.0714 | 0.8431 | 0.9149 | 0.8776 | 94 | 0.8471 | 0.8623 | 0.8546 | 167 | 0.9571 | 0.9781 | 0.9675 | 137 | 0.8835 | 0.9146 | 0.8988 | 0.9776 |
| 0.0954 | 13.0 | 1248 | 0.0672 | 0.8586 | 0.9043 | 0.8808 | 94 | 0.8471 | 0.8623 | 0.8546 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.8963 | 0.9121 | 0.9041 | 0.9790 |
| 0.0896 | 14.0 | 1344 | 0.0617 | 0.8673 | 0.9043 | 0.8854 | 94 | 0.8466 | 0.8922 | 0.8688 | 167 | 0.9710 | 0.9781 | 0.9745 | 137 | 0.8932 | 0.9246 | 0.9086 | 0.9804 |
| 0.0894 | 15.0 | 1440 | 0.0573 | 0.8687 | 0.9149 | 0.8912 | 94 | 0.8596 | 0.8802 | 0.8698 | 167 | 0.9640 | 0.9781 | 0.9710 | 137 | 0.8973 | 0.9221 | 0.9095 | 0.9801 |
| 0.0853 | 16.0 | 1536 | 0.0628 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8457 | 0.8862 | 0.8655 | 167 | 0.9783 | 0.9854 | 0.9818 | 137 | 0.8897 | 0.9322 | 0.9104 | 0.9798 |
| 0.0813 | 17.0 | 1632 | 0.0562 | 0.8763 | 0.9043 | 0.8901 | 94 | 0.8629 | 0.9042 | 0.8830 | 167 | 0.9640 | 0.9781 | 0.9710 | 137 | 0.9002 | 0.9296 | 0.9147 | 0.9815 |
| 0.0804 | 18.0 | 1728 | 0.0545 | 0.85 | 0.9043 | 0.8763 | 94 | 0.8529 | 0.8683 | 0.8605 | 167 | 0.9571 | 0.9781 | 0.9675 | 137 | 0.8878 | 0.9146 | 0.9010 | 0.9798 |
| 0.0761 | 19.0 | 1824 | 0.0517 | 0.84 | 0.8936 | 0.8660 | 94 | 0.8675 | 0.8623 | 0.8649 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9005 | 0.9095 | 0.905 | 0.9812 |
| 0.0761 | 20.0 | 1920 | 0.0532 | 0.84 | 0.8936 | 0.8660 | 94 | 0.8706 | 0.8862 | 0.8783 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.8993 | 0.9196 | 0.9093 | 0.9815 |
| 0.071 | 21.0 | 2016 | 0.0553 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8659 | 0.8503 | 0.8580 | 167 | 0.9781 | 0.9781 | 0.9781 | 137 | 0.8988 | 0.9146 | 0.9066 | 0.9812 |
| 0.07 | 22.0 | 2112 | 0.0499 | 0.85 | 0.9043 | 0.8763 | 94 | 0.8728 | 0.9042 | 0.8882 | 167 | 0.9926 | 0.9854 | 0.9890 | 137 | 0.9071 | 0.9322 | 0.9195 | 0.9834 |
| 0.0673 | 23.0 | 2208 | 0.0517 | 0.8286 | 0.9255 | 0.8744 | 94 | 0.8712 | 0.8503 | 0.8606 | 167 | 0.9783 | 0.9854 | 0.9818 | 137 | 0.8966 | 0.9146 | 0.9055 | 0.9820 |
| 0.0657 | 24.0 | 2304 | 0.0489 | 0.8515 | 0.9149 | 0.8821 | 94 | 0.8772 | 0.8982 | 0.8876 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9069 | 0.9296 | 0.9181 | 0.9831 |
| 0.0643 | 25.0 | 2400 | 0.0501 | 0.8148 | 0.9362 | 0.8713 | 94 | 0.8805 | 0.8383 | 0.8589 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9030 | 0.9121 | 0.9075 | 0.9823 |
| 0.0607 | 26.0 | 2496 | 0.0486 | 0.8317 | 0.8936 | 0.8615 | 94 | 0.8841 | 0.8683 | 0.8761 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.91 | 0.9146 | 0.9123 | 0.9837 |
| 0.0629 | 27.0 | 2592 | 0.0493 | 0.8571 | 0.8936 | 0.875 | 94 | 0.8802 | 0.8802 | 0.8802 | 167 | 0.9779 | 0.9708 | 0.9744 | 137 | 0.9077 | 0.9146 | 0.9111 | 0.9826 |
| 0.0571 | 28.0 | 2688 | 0.0495 | 0.85 | 0.9043 | 0.8763 | 94 | 0.8727 | 0.8623 | 0.8675 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9075 | 0.9121 | 0.9098 | 0.9823 |
| 0.0564 | 29.0 | 2784 | 0.0469 | 0.8544 | 0.9362 | 0.8934 | 94 | 0.8909 | 0.8802 | 0.8855 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9156 | 0.9271 | 0.9213 | 0.9851 |
| 0.0578 | 30.0 | 2880 | 0.0486 | 0.8476 | 0.9468 | 0.8945 | 94 | 0.875 | 0.8802 | 0.8776 | 167 | 0.9853 | 0.9781 | 0.9817 | 137 | 0.9046 | 0.9296 | 0.9170 | 0.9837 |
| 0.0571 | 31.0 | 2976 | 0.0466 | 0.87 | 0.9255 | 0.8969 | 94 | 0.8727 | 0.8623 | 0.8675 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9125 | 0.9171 | 0.9148 | 0.9848 |
| 0.0517 | 32.0 | 3072 | 0.0480 | 0.8091 | 0.9468 | 0.8725 | 94 | 0.8704 | 0.8443 | 0.8571 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.8968 | 0.9171 | 0.9068 | 0.9829 |
| 0.0509 | 33.0 | 3168 | 0.0467 | 0.8224 | 0.9362 | 0.8756 | 94 | 0.8720 | 0.8563 | 0.8640 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9015 | 0.9196 | 0.9104 | 0.9837 |
| 0.051 | 34.0 | 3264 | 0.0469 | 0.8286 | 0.9255 | 0.8744 | 94 | 0.8780 | 0.8623 | 0.8701 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9035 | 0.9171 | 0.9102 | 0.9834 |
| 0.0509 | 35.0 | 3360 | 0.0447 | 0.85 | 0.9043 | 0.8763 | 94 | 0.8848 | 0.8743 | 0.8795 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.915 | 0.9196 | 0.9173 | 0.9845 |
| 0.0498 | 36.0 | 3456 | 0.0467 | 0.8614 | 0.9255 | 0.8923 | 94 | 0.8713 | 0.8922 | 0.8817 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9091 | 0.9296 | 0.9193 | 0.9843 |
| 0.0486 | 37.0 | 3552 | 0.0439 | 0.86 | 0.9149 | 0.8866 | 94 | 0.8862 | 0.8862 | 0.8862 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9154 | 0.9246 | 0.92 | 0.9845 |
| 0.0486 | 38.0 | 3648 | 0.0430 | 0.8529 | 0.9255 | 0.8878 | 94 | 0.8896 | 0.8683 | 0.8788 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.915 | 0.9196 | 0.9173 | 0.9845 |
| 0.0508 | 39.0 | 3744 | 0.0458 | 0.8224 | 0.9362 | 0.8756 | 94 | 0.8758 | 0.8443 | 0.8598 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9007 | 0.9121 | 0.9064 | 0.9837 |
| 0.0487 | 40.0 | 3840 | 0.0416 | 0.8544 | 0.9362 | 0.8934 | 94 | 0.8869 | 0.8922 | 0.8896 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9163 | 0.9347 | 0.9254 | 0.9859 |
| 0.0453 | 41.0 | 3936 | 0.0431 | 0.8302 | 0.9362 | 0.88 | 94 | 0.8889 | 0.8623 | 0.8754 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9107 | 0.9221 | 0.9164 | 0.9848 |
| 0.0459 | 42.0 | 4032 | 0.0421 | 0.8673 | 0.9043 | 0.8854 | 94 | 0.8909 | 0.8802 | 0.8855 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9221 | 0.9221 | 0.9221 | 0.9854 |
| 0.0461 | 43.0 | 4128 | 0.0444 | 0.8286 | 0.9255 | 0.8744 | 94 | 0.8773 | 0.8563 | 0.8667 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9057 | 0.9171 | 0.9114 | 0.9840 |
| 0.0436 | 44.0 | 4224 | 0.0418 | 0.8515 | 0.9149 | 0.8821 | 94 | 0.8712 | 0.8503 | 0.8606 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9098 | 0.9121 | 0.9109 | 0.9837 |
| 0.0444 | 45.0 | 4320 | 0.0397 | 0.8614 | 0.9255 | 0.8923 | 94 | 0.8970 | 0.8862 | 0.8916 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9227 | 0.9296 | 0.9262 | 0.9867 |
| 0.042 | 46.0 | 4416 | 0.0421 | 0.8286 | 0.9255 | 0.8744 | 94 | 0.8820 | 0.8503 | 0.8659 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9077 | 0.9146 | 0.9111 | 0.9848 |
| 0.0425 | 47.0 | 4512 | 0.0443 | 0.8241 | 0.9468 | 0.8812 | 94 | 0.8841 | 0.8683 | 0.8761 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9066 | 0.9271 | 0.9168 | 0.9845 |
| 0.0416 | 48.0 | 4608 | 0.0418 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.9012 | 0.8743 | 0.8875 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9202 | 0.9271 | 0.9237 | 0.9862 |
| 0.0401 | 49.0 | 4704 | 0.0418 | 0.8544 | 0.9362 | 0.8934 | 94 | 0.8841 | 0.8683 | 0.8761 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9154 | 0.9246 | 0.92 | 0.9854 |
| 0.0395 | 50.0 | 4800 | 0.0428 | 0.8365 | 0.9255 | 0.8788 | 94 | 0.8773 | 0.8563 | 0.8667 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9080 | 0.9171 | 0.9125 | 0.9848 |
| 0.0404 | 51.0 | 4896 | 0.0426 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8712 | 0.8503 | 0.8606 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9080 | 0.9171 | 0.9125 | 0.9848 |
| 0.0388 | 52.0 | 4992 | 0.0405 | 0.8889 | 0.9362 | 0.9119 | 94 | 0.8824 | 0.8982 | 0.8902 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9233 | 0.9372 | 0.9302 | 0.9876 |
| 0.0406 | 53.0 | 5088 | 0.0409 | 0.87 | 0.9255 | 0.8969 | 94 | 0.875 | 0.8802 | 0.8776 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9156 | 0.9271 | 0.9213 | 0.9856 |
| 0.0403 | 54.0 | 5184 | 0.0410 | 0.8713 | 0.9362 | 0.9026 | 94 | 0.8855 | 0.8802 | 0.8829 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9204 | 0.9296 | 0.925 | 0.9856 |
| 0.0393 | 55.0 | 5280 | 0.0407 | 0.8529 | 0.9255 | 0.8878 | 94 | 0.8580 | 0.8683 | 0.8631 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9039 | 0.9221 | 0.9129 | 0.9854 |
| 0.0397 | 56.0 | 5376 | 0.0408 | 0.8302 | 0.9362 | 0.88 | 94 | 0.8598 | 0.8443 | 0.8520 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.8988 | 0.9146 | 0.9066 | 0.9834 |
| 0.0376 | 57.0 | 5472 | 0.0423 | 0.8257 | 0.9574 | 0.8867 | 94 | 0.8812 | 0.8443 | 0.8624 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9059 | 0.9196 | 0.9127 | 0.9845 |
| 0.0385 | 58.0 | 5568 | 0.0406 | 0.8687 | 0.9149 | 0.8912 | 94 | 0.8743 | 0.8743 | 0.8743 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9152 | 0.9221 | 0.9186 | 0.9856 |
| 0.0371 | 59.0 | 5664 | 0.0407 | 0.8776 | 0.9149 | 0.8958 | 94 | 0.8855 | 0.8802 | 0.8829 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9223 | 0.9246 | 0.9235 | 0.9865 |
| 0.0361 | 60.0 | 5760 | 0.0428 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.9062 | 0.8683 | 0.8869 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9179 | 0.9271 | 0.9225 | 0.9851 |
| 0.036 | 61.0 | 5856 | 0.0413 | 0.8713 | 0.9362 | 0.9026 | 94 | 0.8935 | 0.9042 | 0.8988 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9235 | 0.9397 | 0.9315 | 0.9862 |
| 0.0383 | 62.0 | 5952 | 0.0421 | 0.8302 | 0.9362 | 0.88 | 94 | 0.8909 | 0.8802 | 0.8855 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9113 | 0.9296 | 0.9204 | 0.9848 |
| 0.0339 | 63.0 | 6048 | 0.0419 | 0.8381 | 0.9362 | 0.8844 | 94 | 0.9012 | 0.8743 | 0.8875 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9179 | 0.9271 | 0.9225 | 0.9854 |
| 0.0363 | 64.0 | 6144 | 0.0428 | 0.8241 | 0.9468 | 0.8812 | 94 | 0.875 | 0.8383 | 0.8563 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9032 | 0.9146 | 0.9089 | 0.9845 |
| 0.0355 | 65.0 | 6240 | 0.0422 | 0.8224 | 0.9362 | 0.8756 | 94 | 0.8650 | 0.8443 | 0.8545 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.8988 | 0.9146 | 0.9066 | 0.9845 |
| 0.0339 | 66.0 | 6336 | 0.0448 | 0.8241 | 0.9468 | 0.8812 | 94 | 0.8831 | 0.8144 | 0.8474 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9068 | 0.9045 | 0.9057 | 0.9829 |
| 0.0352 | 67.0 | 6432 | 0.0429 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.8820 | 0.8503 | 0.8659 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9082 | 0.9196 | 0.9139 | 0.9843 |
| 0.0337 | 68.0 | 6528 | 0.0458 | 0.8241 | 0.9468 | 0.8812 | 94 | 0.8710 | 0.8084 | 0.8385 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9020 | 0.9020 | 0.9020 | 0.9826 |
| 0.0353 | 69.0 | 6624 | 0.0425 | 0.8381 | 0.9362 | 0.8844 | 94 | 0.8841 | 0.8683 | 0.8761 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9109 | 0.9246 | 0.9177 | 0.9851 |
| 0.0338 | 70.0 | 6720 | 0.0428 | 0.8365 | 0.9255 | 0.8788 | 94 | 0.8589 | 0.8383 | 0.8485 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9005 | 0.9095 | 0.905 | 0.9834 |
| 0.0348 | 71.0 | 6816 | 0.0432 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.9012 | 0.8743 | 0.8875 | 167 | 0.9926 | 0.9781 | 0.9853 | 137 | 0.9134 | 0.9271 | 0.9202 | 0.9851 |
| 0.0351 | 72.0 | 6912 | 0.0449 | 0.8091 | 0.9468 | 0.8725 | 94 | 0.8868 | 0.8443 | 0.8650 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9035 | 0.9171 | 0.9102 | 0.9837 |
| 0.0327 | 73.0 | 7008 | 0.0439 | 0.8091 | 0.9468 | 0.8725 | 94 | 0.8625 | 0.8263 | 0.8440 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.8938 | 0.9095 | 0.9016 | 0.9826 |
| 0.0314 | 74.0 | 7104 | 0.0431 | 0.8462 | 0.9362 | 0.8889 | 94 | 0.8758 | 0.8443 | 0.8598 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.91 | 0.9146 | 0.9123 | 0.9837 |
| 0.0332 | 75.0 | 7200 | 0.0430 | 0.8302 | 0.9362 | 0.88 | 94 | 0.8485 | 0.8383 | 0.8434 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.8941 | 0.9121 | 0.9030 | 0.9834 |
| 0.0311 | 76.0 | 7296 | 0.0438 | 0.8365 | 0.9255 | 0.8788 | 94 | 0.8598 | 0.8443 | 0.8520 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9007 | 0.9121 | 0.9064 | 0.9840 |
| 0.0322 | 77.0 | 7392 | 0.0455 | 0.8165 | 0.9468 | 0.8768 | 94 | 0.8671 | 0.8204 | 0.8431 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.8980 | 0.9070 | 0.9025 | 0.9823 |
| 0.0313 | 78.0 | 7488 | 0.0442 | 0.8302 | 0.9362 | 0.88 | 94 | 0.8712 | 0.8503 | 0.8606 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9035 | 0.9171 | 0.9102 | 0.9840 |
| 0.0313 | 79.0 | 7584 | 0.0435 | 0.8302 | 0.9362 | 0.88 | 94 | 0.8659 | 0.8503 | 0.8580 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9012 | 0.9171 | 0.9091 | 0.9845 |
| 0.0321 | 80.0 | 7680 | 0.0450 | 0.8165 | 0.9468 | 0.8768 | 94 | 0.8812 | 0.8443 | 0.8624 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9035 | 0.9171 | 0.9102 | 0.9831 |
| 0.0303 | 81.0 | 7776 | 0.0441 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.8765 | 0.8503 | 0.8632 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9059 | 0.9196 | 0.9127 | 0.9843 |
| 0.0322 | 82.0 | 7872 | 0.0442 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.8797 | 0.8323 | 0.8554 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9075 | 0.9121 | 0.9098 | 0.9834 |
| 0.0313 | 83.0 | 7968 | 0.0447 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.8812 | 0.8443 | 0.8624 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9080 | 0.9171 | 0.9125 | 0.9834 |
| 0.0292 | 84.0 | 8064 | 0.0448 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.8765 | 0.8503 | 0.8632 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9059 | 0.9196 | 0.9127 | 0.9840 |
| 0.03 | 85.0 | 8160 | 0.0465 | 0.8396 | 0.9468 | 0.89 | 94 | 0.8734 | 0.8263 | 0.8492 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9073 | 0.9095 | 0.9084 | 0.9831 |
| 0.0311 | 86.0 | 8256 | 0.0455 | 0.8302 | 0.9362 | 0.88 | 94 | 0.8773 | 0.8563 | 0.8667 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9059 | 0.9196 | 0.9127 | 0.9837 |
| 0.0302 | 87.0 | 8352 | 0.0458 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.8696 | 0.8383 | 0.8537 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9032 | 0.9146 | 0.9089 | 0.9834 |
| 0.0311 | 88.0 | 8448 | 0.0445 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.8758 | 0.8443 | 0.8598 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9057 | 0.9171 | 0.9114 | 0.9834 |
| 0.0306 | 89.0 | 8544 | 0.0432 | 0.8365 | 0.9255 | 0.8788 | 94 | 0.8727 | 0.8623 | 0.8675 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9059 | 0.9196 | 0.9127 | 0.9843 |
| 0.0292 | 90.0 | 8640 | 0.0444 | 0.8396 | 0.9468 | 0.89 | 94 | 0.8827 | 0.8563 | 0.8693 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9107 | 0.9221 | 0.9164 | 0.9837 |
| 0.0302 | 91.0 | 8736 | 0.0451 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.875 | 0.8383 | 0.8563 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9055 | 0.9146 | 0.91 | 0.9829 |
| 0.0288 | 92.0 | 8832 | 0.0445 | 0.8396 | 0.9468 | 0.89 | 94 | 0.8773 | 0.8563 | 0.8667 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9084 | 0.9221 | 0.9152 | 0.9845 |
| 0.0313 | 93.0 | 8928 | 0.0444 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.8827 | 0.8563 | 0.8693 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9084 | 0.9221 | 0.9152 | 0.9845 |
| 0.0293 | 94.0 | 9024 | 0.0441 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.8944 | 0.8623 | 0.8780 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9132 | 0.9246 | 0.9189 | 0.9848 |
| 0.03 | 95.0 | 9120 | 0.0450 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.8812 | 0.8443 | 0.8624 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9080 | 0.9171 | 0.9125 | 0.9837 |
| 0.0313 | 96.0 | 9216 | 0.0443 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.8827 | 0.8563 | 0.8693 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9084 | 0.9221 | 0.9152 | 0.9845 |
| 0.0299 | 97.0 | 9312 | 0.0445 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.875 | 0.8383 | 0.8563 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9055 | 0.9146 | 0.91 | 0.9837 |
| 0.0316 | 98.0 | 9408 | 0.0442 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.8827 | 0.8563 | 0.8693 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9084 | 0.9221 | 0.9152 | 0.9845 |
| 0.0301 | 99.0 | 9504 | 0.0439 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.8827 | 0.8563 | 0.8693 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9084 | 0.9221 | 0.9152 | 0.9845 |
| 0.0308 | 100.0 | 9600 | 0.0440 | 0.8318 | 0.9468 | 0.8856 | 94 | 0.8827 | 0.8563 | 0.8693 | 167 | 1.0 | 0.9854 | 0.9926 | 137 | 0.9084 | 0.9221 | 0.9152 | 0.9845 |
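The per-type precision/recall/F1 columns above are entity-level (span) metrics of the kind computed by the seqeval library. The card does not state which library was used, so the snippet below is only a sketch of how such numbers are derived from IOB tag sequences.

```python
# Sketch: entity-level precision/recall/F1 over IOB tag sequences,
# as produced by seqeval (assumed, not confirmed by the card).
from seqeval.metrics import classification_report

references  = [["B-PER", "I-PER", "O", "B-LOC", "O", "B-ORG"]]
predictions = [["B-PER", "I-PER", "O", "B-ORG", "O", "B-ORG"]]

# A predicted entity counts as correct only if both its span and its
# type match the reference exactly.
print(classification_report(references, predictions, digits=4))
```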

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2