
nerui-lora-r16-4

This model is a LoRA fine-tuned version of indolem/indobert-base-uncased for named-entity recognition (Location, Organization, and Person entities) on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0460
  • Location Precision: 0.8596
  • Location Recall: 0.9515
  • Location F1: 0.9032
  • Location Number: 103
  • Organization Precision: 0.9018
  • Organization Recall: 0.8596
  • Organization F1: 0.8802
  • Organization Number: 171
  • Person Precision: 0.9621
  • Person Recall: 0.9695
  • Person F1: 0.9658
  • Person Number: 131
  • Overall Precision: 0.9095
  • Overall Recall: 0.9185
  • Overall F1: 0.9140
  • Overall Accuracy: 0.9843
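
The per-entity scores above are span-level precision/recall/F1, with "Number" giving the entity count (support) in the evaluation set. Metrics of this shape are typically produced with seqeval; below is a minimal sketch, assuming IOB2-tagged sequences (the label sequences are illustrative, not drawn from the actual evaluation data):

```python
# Minimal sketch: span-level NER metrics with seqeval. The label sequences
# below are illustrative; they are not taken from this model's evaluation set.
from seqeval.metrics import classification_report, f1_score

# One list of IOB2 tags per sentence, gold vs. predicted.
y_true = [["B-PERSON", "I-PERSON", "O", "B-LOCATION"],
          ["B-ORGANIZATION", "I-ORGANIZATION", "O"]]
y_pred = [["B-PERSON", "I-PERSON", "O", "B-LOCATION"],
          ["B-ORGANIZATION", "O", "O"]]

# Per-entity precision/recall/F1 plus support (the "Number" column above).
print(classification_report(y_true, y_pred, digits=4))
print("overall F1:", f1_score(y_true, y_pred))
```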

Model description

More information needed

Intended uses & limitations

More information needed
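
The repository name suggests this is a LoRA adapter (r=16) rather than a full checkpoint, so inference presumably requires attaching the adapter to the base model with PEFT. A minimal, unverified sketch; the num_labels value and the example sentence are assumptions, not published with this card:

```python
# Minimal sketch: attach the LoRA adapter to the base model for token
# classification. num_labels=7 (O + B-/I- tags for Location, Organization,
# Person) is an assumption; check the adapter config for the real mapping.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer
from peft import PeftModel

base = AutoModelForTokenClassification.from_pretrained(
    "indolem/indobert-base-uncased", num_labels=7
)
model = PeftModel.from_pretrained(base, "apwic/nerui-lora-r16-4")
tokenizer = AutoTokenizer.from_pretrained("indolem/indobert-base-uncased")

inputs = tokenizer("Joko Widodo lahir di Surakarta.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1))  # map ids to tags via the model's id2label, if set
```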

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
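
For reference, these values map directly onto the standard transformers TrainingArguments, as sketched below; the LoRA configuration itself is not recorded in this card, so r=16 is inferred from the model name and the remaining PEFT settings are assumptions:

```python
# Minimal sketch: the listed hyperparameters expressed as TrainingArguments.
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the library default, so
# it needs no explicit flag. The LoraConfig below is assumed except for r=16,
# which is inferred from the model name.
from transformers import TrainingArguments
from peft import LoraConfig, TaskType

training_args = TrainingArguments(
    output_dir="nerui-lora-r16-4",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)

lora_config = LoraConfig(task_type=TaskType.TOKEN_CLS, r=16)
```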

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.0754 | 1.0 | 96 | 0.6777 | 0.0 | 0.0 | 0.0 | 103 | 0.0 | 0.0 | 0.0 | 171 | 0.0 | 0.0 | 0.0 | 131 | 0.0 | 0.0 | 0.0 | 0.8373 |
| 0.6397 | 2.0 | 192 | 0.5210 | 0.0 | 0.0 | 0.0 | 103 | 0.0 | 0.0 | 0.0 | 171 | 0.0 | 0.0 | 0.0 | 131 | 0.0 | 0.0 | 0.0 | 0.8382 |
| 0.4876 | 3.0 | 288 | 0.3655 | 0.2333 | 0.0680 | 0.1053 | 103 | 0.3084 | 0.1930 | 0.2374 | 171 | 0.2721 | 0.2824 | 0.2772 | 131 | 0.2821 | 0.1901 | 0.2271 | 0.8732 |
| 0.3447 | 4.0 | 384 | 0.2576 | 0.4167 | 0.3398 | 0.3743 | 103 | 0.4977 | 0.6199 | 0.5521 | 171 | 0.4751 | 0.6565 | 0.5513 | 131 | 0.4749 | 0.5605 | 0.5142 | 0.9227 |
| 0.2489 | 5.0 | 480 | 0.1880 | 0.5652 | 0.5049 | 0.5333 | 103 | 0.6291 | 0.7836 | 0.6979 | 171 | 0.7197 | 0.8626 | 0.7847 | 131 | 0.6472 | 0.7383 | 0.6897 | 0.9484 |
| 0.1922 | 6.0 | 576 | 0.1418 | 0.7379 | 0.7379 | 0.7379 | 103 | 0.75 | 0.8421 | 0.7934 | 171 | 0.8865 | 0.9542 | 0.9191 | 131 | 0.7913 | 0.8519 | 0.8205 | 0.9635 |
| 0.1585 | 7.0 | 672 | 0.1196 | 0.7364 | 0.7864 | 0.7606 | 103 | 0.7526 | 0.8538 | 0.8 | 171 | 0.9 | 0.9618 | 0.9299 | 131 | 0.7950 | 0.8716 | 0.8316 | 0.9638 |
| 0.1378 | 8.0 | 768 | 0.1014 | 0.8058 | 0.8058 | 0.8058 | 103 | 0.7760 | 0.8713 | 0.8209 | 171 | 0.9270 | 0.9695 | 0.9478 | 131 | 0.8310 | 0.8864 | 0.8578 | 0.9696 |
| 0.1249 | 9.0 | 864 | 0.0929 | 0.8113 | 0.8350 | 0.8230 | 103 | 0.7696 | 0.8596 | 0.8122 | 171 | 0.9203 | 0.9695 | 0.9442 | 131 | 0.8276 | 0.8889 | 0.8571 | 0.9713 |
| 0.1157 | 10.0 | 960 | 0.0842 | 0.8416 | 0.8252 | 0.8333 | 103 | 0.7824 | 0.8830 | 0.8297 | 171 | 0.9065 | 0.9618 | 0.9333 | 131 | 0.8360 | 0.8938 | 0.8640 | 0.9718 |
| 0.106 | 11.0 | 1056 | 0.0774 | 0.8462 | 0.8544 | 0.8502 | 103 | 0.8022 | 0.8538 | 0.8272 | 171 | 0.9403 | 0.9618 | 0.9509 | 131 | 0.8571 | 0.8889 | 0.8727 | 0.9751 |
| 0.1021 | 12.0 | 1152 | 0.0777 | 0.8505 | 0.8835 | 0.8667 | 103 | 0.8118 | 0.8830 | 0.8459 | 171 | 0.9065 | 0.9618 | 0.9333 | 131 | 0.8519 | 0.9086 | 0.8793 | 0.9754 |
| 0.0959 | 13.0 | 1248 | 0.0699 | 0.8654 | 0.8738 | 0.8696 | 103 | 0.8678 | 0.8830 | 0.8754 | 171 | 0.9474 | 0.9618 | 0.9545 | 131 | 0.8929 | 0.9062 | 0.8995 | 0.9790 |
| 0.0915 | 14.0 | 1344 | 0.0697 | 0.8667 | 0.8835 | 0.8750 | 103 | 0.875 | 0.8596 | 0.8673 | 171 | 0.9403 | 0.9618 | 0.9509 | 131 | 0.8943 | 0.8988 | 0.8966 | 0.9787 |
| 0.0875 | 15.0 | 1440 | 0.0640 | 0.875 | 0.8835 | 0.8792 | 103 | 0.8263 | 0.9181 | 0.8698 | 171 | 0.9333 | 0.9618 | 0.9474 | 131 | 0.8718 | 0.9235 | 0.8969 | 0.9785 |
| 0.0837 | 16.0 | 1536 | 0.0616 | 0.8624 | 0.9126 | 0.8868 | 103 | 0.8280 | 0.9006 | 0.8627 | 171 | 0.9478 | 0.9695 | 0.9585 | 131 | 0.8741 | 0.9259 | 0.8993 | 0.9796 |
| 0.0799 | 17.0 | 1632 | 0.0570 | 0.8942 | 0.9029 | 0.8986 | 103 | 0.8387 | 0.9123 | 0.8739 | 171 | 0.9549 | 0.9695 | 0.9621 | 131 | 0.8889 | 0.9284 | 0.9082 | 0.9804 |
| 0.0763 | 18.0 | 1728 | 0.0549 | 0.8785 | 0.9126 | 0.8952 | 103 | 0.8603 | 0.9006 | 0.8800 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.8993 | 0.9259 | 0.9124 | 0.9807 |
| 0.0732 | 19.0 | 1824 | 0.0561 | 0.8857 | 0.9029 | 0.8942 | 103 | 0.8837 | 0.8889 | 0.8863 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9118 | 0.9185 | 0.9151 | 0.9818 |
| 0.072 | 20.0 | 1920 | 0.0518 | 0.8868 | 0.9126 | 0.8995 | 103 | 0.8626 | 0.9181 | 0.8895 | 171 | 0.9846 | 0.9771 | 0.9808 | 131 | 0.9067 | 0.9358 | 0.9210 | 0.9820 |
| 0.069 | 21.0 | 2016 | 0.0508 | 0.8598 | 0.8932 | 0.8762 | 103 | 0.9107 | 0.8947 | 0.9027 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9163 | 0.9185 | 0.9174 | 0.9832 |
| 0.0681 | 22.0 | 2112 | 0.0510 | 0.8704 | 0.9126 | 0.8910 | 103 | 0.8495 | 0.9240 | 0.8852 | 171 | 0.9846 | 0.9771 | 0.9808 | 131 | 0.8962 | 0.9383 | 0.9168 | 0.9829 |
| 0.068 | 23.0 | 2208 | 0.0487 | 0.8704 | 0.9126 | 0.8910 | 103 | 0.8960 | 0.9064 | 0.9012 | 171 | 0.9846 | 0.9771 | 0.9808 | 131 | 0.9173 | 0.9309 | 0.9240 | 0.9845 |
| 0.0646 | 24.0 | 2304 | 0.0502 | 0.8333 | 0.9223 | 0.8756 | 103 | 0.9080 | 0.8655 | 0.8862 | 171 | 0.9545 | 0.9618 | 0.9582 | 131 | 0.9022 | 0.9111 | 0.9066 | 0.9820 |
| 0.0641 | 25.0 | 2400 | 0.0478 | 0.8532 | 0.9029 | 0.8774 | 103 | 0.8902 | 0.9006 | 0.8953 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9056 | 0.9235 | 0.9144 | 0.9845 |
| 0.0632 | 26.0 | 2496 | 0.0462 | 0.8846 | 0.8932 | 0.8889 | 103 | 0.8736 | 0.9298 | 0.9008 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9065 | 0.9333 | 0.9197 | 0.9851 |
| 0.0591 | 27.0 | 2592 | 0.0447 | 0.8774 | 0.9029 | 0.8900 | 103 | 0.8908 | 0.9064 | 0.8986 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9124 | 0.9259 | 0.9191 | 0.9854 |
| 0.0586 | 28.0 | 2688 | 0.0448 | 0.8857 | 0.9029 | 0.8942 | 103 | 0.8851 | 0.9006 | 0.8928 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9122 | 0.9235 | 0.9178 | 0.9859 |
| 0.0558 | 29.0 | 2784 | 0.0440 | 0.9208 | 0.9029 | 0.9118 | 103 | 0.8764 | 0.9123 | 0.8940 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9148 | 0.9284 | 0.9216 | 0.9862 |
| 0.0568 | 30.0 | 2880 | 0.0453 | 0.8942 | 0.9029 | 0.8986 | 103 | 0.9006 | 0.9006 | 0.9006 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9212 | 0.9235 | 0.9223 | 0.9854 |
| 0.0547 | 31.0 | 2976 | 0.0436 | 0.8868 | 0.9126 | 0.8995 | 103 | 0.8715 | 0.9123 | 0.8914 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9062 | 0.9309 | 0.9184 | 0.9851 |
| 0.0544 | 32.0 | 3072 | 0.0434 | 0.8796 | 0.9223 | 0.9005 | 103 | 0.9036 | 0.8772 | 0.8902 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9185 | 0.9185 | 0.9185 | 0.9859 |
| 0.0509 | 33.0 | 3168 | 0.0450 | 0.8545 | 0.9126 | 0.8826 | 103 | 0.9187 | 0.8596 | 0.8882 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9177 | 0.9086 | 0.9132 | 0.9848 |
| 0.05 | 34.0 | 3264 | 0.0419 | 0.8846 | 0.8932 | 0.8889 | 103 | 0.9112 | 0.9006 | 0.9059 | 171 | 0.9846 | 0.9771 | 0.9808 | 131 | 0.9280 | 0.9235 | 0.9257 | 0.9865 |
| 0.0496 | 35.0 | 3360 | 0.0427 | 0.8482 | 0.9223 | 0.8837 | 103 | 0.8994 | 0.8889 | 0.8941 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9078 | 0.9235 | 0.9155 | 0.9848 |
| 0.0504 | 36.0 | 3456 | 0.0433 | 0.8468 | 0.9126 | 0.8785 | 103 | 0.9198 | 0.8713 | 0.8949 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9158 | 0.9136 | 0.9147 | 0.9856 |
| 0.0485 | 37.0 | 3552 | 0.0429 | 0.9038 | 0.9126 | 0.9082 | 103 | 0.8764 | 0.9123 | 0.8940 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9128 | 0.9309 | 0.9218 | 0.9854 |
| 0.0466 | 38.0 | 3648 | 0.0439 | 0.8692 | 0.9029 | 0.8857 | 103 | 0.8988 | 0.8830 | 0.8909 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9138 | 0.9160 | 0.9149 | 0.9848 |
| 0.0464 | 39.0 | 3744 | 0.0425 | 0.8649 | 0.9320 | 0.8972 | 103 | 0.9096 | 0.8830 | 0.8961 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9167 | 0.9235 | 0.9200 | 0.9856 |
| 0.0457 | 40.0 | 3840 | 0.0429 | 0.8952 | 0.9126 | 0.9038 | 103 | 0.8941 | 0.8889 | 0.8915 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9187 | 0.9210 | 0.9199 | 0.9856 |
| 0.0454 | 41.0 | 3936 | 0.0413 | 0.9029 | 0.9029 | 0.9029 | 103 | 0.8764 | 0.9123 | 0.8940 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9126 | 0.9284 | 0.9204 | 0.9856 |
| 0.0446 | 42.0 | 4032 | 0.0433 | 0.8962 | 0.9223 | 0.9091 | 103 | 0.8941 | 0.8889 | 0.8915 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9189 | 0.9235 | 0.9212 | 0.9859 |
| 0.0461 | 43.0 | 4128 | 0.0432 | 0.8636 | 0.9223 | 0.8920 | 103 | 0.9 | 0.8947 | 0.8974 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9124 | 0.9259 | 0.9191 | 0.9851 |
| 0.0445 | 44.0 | 4224 | 0.0451 | 0.8571 | 0.9320 | 0.8930 | 103 | 0.8817 | 0.8713 | 0.8765 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9029 | 0.9185 | 0.9106 | 0.9843 |
| 0.0439 | 45.0 | 4320 | 0.0427 | 0.8796 | 0.9223 | 0.9005 | 103 | 0.9096 | 0.8830 | 0.8961 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9210 | 0.9210 | 0.9210 | 0.9859 |
| 0.0436 | 46.0 | 4416 | 0.0450 | 0.8348 | 0.9320 | 0.8807 | 103 | 0.9136 | 0.8655 | 0.8889 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9093 | 0.9160 | 0.9127 | 0.9851 |
| 0.0436 | 47.0 | 4512 | 0.0410 | 0.9029 | 0.9029 | 0.9029 | 103 | 0.8807 | 0.9064 | 0.8934 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9146 | 0.9259 | 0.9202 | 0.9859 |
| 0.0418 | 48.0 | 4608 | 0.0459 | 0.8348 | 0.9320 | 0.8807 | 103 | 0.9198 | 0.8713 | 0.8949 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9118 | 0.9185 | 0.9151 | 0.9851 |
| 0.0393 | 49.0 | 4704 | 0.0438 | 0.8348 | 0.9320 | 0.8807 | 103 | 0.9024 | 0.8655 | 0.8836 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9049 | 0.9160 | 0.9104 | 0.9843 |
| 0.0403 | 50.0 | 4800 | 0.0465 | 0.8407 | 0.9223 | 0.8796 | 103 | 0.8916 | 0.8655 | 0.8783 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9024 | 0.9136 | 0.9080 | 0.9840 |
| 0.0397 | 51.0 | 4896 | 0.0455 | 0.8496 | 0.9320 | 0.8889 | 103 | 0.9193 | 0.8655 | 0.8916 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9160 | 0.9160 | 0.9160 | 0.9848 |
| 0.0398 | 52.0 | 4992 | 0.0427 | 0.8952 | 0.9126 | 0.9038 | 103 | 0.88 | 0.9006 | 0.8902 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9124 | 0.9259 | 0.9191 | 0.9859 |
| 0.0394 | 53.0 | 5088 | 0.0449 | 0.8584 | 0.9417 | 0.8981 | 103 | 0.9080 | 0.8655 | 0.8862 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9140 | 0.9185 | 0.9163 | 0.9851 |
| 0.0378 | 54.0 | 5184 | 0.0420 | 0.8785 | 0.9126 | 0.8952 | 103 | 0.8743 | 0.8947 | 0.8844 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9056 | 0.9235 | 0.9144 | 0.9859 |
| 0.0387 | 55.0 | 5280 | 0.0446 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.8922 | 0.8713 | 0.8817 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9118 | 0.9185 | 0.9151 | 0.9851 |
| 0.0369 | 56.0 | 5376 | 0.0435 | 0.8559 | 0.9223 | 0.8879 | 103 | 0.8765 | 0.8713 | 0.8739 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9005 | 0.9160 | 0.9082 | 0.9845 |
| 0.0383 | 57.0 | 5472 | 0.0433 | 0.8661 | 0.9417 | 0.9023 | 103 | 0.8929 | 0.8772 | 0.8850 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9100 | 0.9235 | 0.9167 | 0.9854 |
| 0.0368 | 58.0 | 5568 | 0.0444 | 0.8807 | 0.9320 | 0.9057 | 103 | 0.8889 | 0.8889 | 0.8889 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9124 | 0.9259 | 0.9191 | 0.9859 |
| 0.0349 | 59.0 | 5664 | 0.0451 | 0.8624 | 0.9126 | 0.8868 | 103 | 0.8757 | 0.8655 | 0.8706 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9022 | 0.9111 | 0.9066 | 0.9843 |
| 0.0357 | 60.0 | 5760 | 0.0455 | 0.8716 | 0.9223 | 0.8962 | 103 | 0.8855 | 0.8596 | 0.8724 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9089 | 0.9111 | 0.9100 | 0.9843 |
| 0.0365 | 61.0 | 5856 | 0.0456 | 0.8673 | 0.9515 | 0.9074 | 103 | 0.9018 | 0.8596 | 0.8802 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9140 | 0.9185 | 0.9163 | 0.9851 |
| 0.0365 | 62.0 | 5952 | 0.0455 | 0.8462 | 0.9612 | 0.9 | 103 | 0.9080 | 0.8655 | 0.8862 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9100 | 0.9235 | 0.9167 | 0.9851 |
| 0.034 | 63.0 | 6048 | 0.0424 | 0.8584 | 0.9417 | 0.8981 | 103 | 0.8817 | 0.8713 | 0.8765 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9031 | 0.9210 | 0.9120 | 0.9848 |
| 0.0347 | 64.0 | 6144 | 0.0465 | 0.8596 | 0.9515 | 0.9032 | 103 | 0.9074 | 0.8596 | 0.8829 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9140 | 0.9185 | 0.9163 | 0.9848 |
| 0.0342 | 65.0 | 6240 | 0.0446 | 0.8522 | 0.9515 | 0.8991 | 103 | 0.9085 | 0.8713 | 0.8896 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9122 | 0.9235 | 0.9178 | 0.9851 |
| 0.034 | 66.0 | 6336 | 0.0450 | 0.8684 | 0.9612 | 0.9124 | 103 | 0.9080 | 0.8655 | 0.8862 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9167 | 0.9235 | 0.9200 | 0.9856 |
| 0.0338 | 67.0 | 6432 | 0.0458 | 0.8596 | 0.9515 | 0.9032 | 103 | 0.9024 | 0.8655 | 0.8836 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9120 | 0.9210 | 0.9165 | 0.9851 |
| 0.0368 | 68.0 | 6528 | 0.0461 | 0.8571 | 0.9320 | 0.8930 | 103 | 0.8909 | 0.8596 | 0.875 | 171 | 0.9846 | 0.9771 | 0.9808 | 131 | 0.9115 | 0.9160 | 0.9138 | 0.9843 |
| 0.036 | 69.0 | 6624 | 0.0454 | 0.8571 | 0.9320 | 0.8930 | 103 | 0.8909 | 0.8596 | 0.875 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9046 | 0.9136 | 0.9091 | 0.9840 |
| 0.0318 | 70.0 | 6720 | 0.0479 | 0.8584 | 0.9417 | 0.8981 | 103 | 0.9080 | 0.8655 | 0.8862 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9140 | 0.9185 | 0.9163 | 0.9843 |
| 0.0327 | 71.0 | 6816 | 0.0449 | 0.8522 | 0.9515 | 0.8991 | 103 | 0.9018 | 0.8596 | 0.8802 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9073 | 0.9185 | 0.9129 | 0.9845 |
| 0.0328 | 72.0 | 6912 | 0.0490 | 0.8596 | 0.9515 | 0.9032 | 103 | 0.8957 | 0.8538 | 0.8743 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9071 | 0.9160 | 0.9115 | 0.9843 |
| 0.0325 | 73.0 | 7008 | 0.0465 | 0.8673 | 0.9515 | 0.9074 | 103 | 0.9080 | 0.8655 | 0.8862 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9142 | 0.9210 | 0.9176 | 0.9848 |
| 0.0337 | 74.0 | 7104 | 0.0479 | 0.8522 | 0.9515 | 0.8991 | 103 | 0.9018 | 0.8596 | 0.8802 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9073 | 0.9185 | 0.9129 | 0.9837 |
| 0.0329 | 75.0 | 7200 | 0.0460 | 0.8727 | 0.9320 | 0.9014 | 103 | 0.8963 | 0.8596 | 0.8776 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9136 | 0.9136 | 0.9136 | 0.9845 |
| 0.0308 | 76.0 | 7296 | 0.0472 | 0.8522 | 0.9515 | 0.8991 | 103 | 0.9130 | 0.8596 | 0.8855 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9140 | 0.9185 | 0.9163 | 0.9845 |
| 0.0319 | 77.0 | 7392 | 0.0474 | 0.8522 | 0.9515 | 0.8991 | 103 | 0.9074 | 0.8596 | 0.8829 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9095 | 0.9185 | 0.9140 | 0.9840 |
| 0.0317 | 78.0 | 7488 | 0.0454 | 0.8496 | 0.9320 | 0.8889 | 103 | 0.8963 | 0.8596 | 0.8776 | 171 | 0.9771 | 0.9771 | 0.9771 | 131 | 0.9093 | 0.9160 | 0.9127 | 0.9837 |
| 0.0305 | 79.0 | 7584 | 0.0459 | 0.8496 | 0.9320 | 0.8889 | 103 | 0.8909 | 0.8596 | 0.875 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9024 | 0.9136 | 0.9080 | 0.9837 |
| 0.0302 | 80.0 | 7680 | 0.0458 | 0.8496 | 0.9320 | 0.8889 | 103 | 0.8855 | 0.8596 | 0.8724 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9002 | 0.9136 | 0.9069 | 0.9837 |
| 0.0315 | 81.0 | 7776 | 0.0453 | 0.8496 | 0.9320 | 0.8889 | 103 | 0.8855 | 0.8596 | 0.8724 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9002 | 0.9136 | 0.9069 | 0.9837 |
| 0.031 | 82.0 | 7872 | 0.0448 | 0.8509 | 0.9417 | 0.8940 | 103 | 0.8963 | 0.8596 | 0.8776 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9049 | 0.9160 | 0.9104 | 0.9843 |
| 0.0322 | 83.0 | 7968 | 0.0446 | 0.8596 | 0.9515 | 0.9032 | 103 | 0.9074 | 0.8596 | 0.8829 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9140 | 0.9185 | 0.9163 | 0.9848 |
| 0.0292 | 84.0 | 8064 | 0.0446 | 0.8496 | 0.9320 | 0.8889 | 103 | 0.8855 | 0.8596 | 0.8724 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9002 | 0.9136 | 0.9069 | 0.9837 |
| 0.0302 | 85.0 | 8160 | 0.0443 | 0.8522 | 0.9515 | 0.8991 | 103 | 0.9024 | 0.8655 | 0.8836 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9075 | 0.9210 | 0.9142 | 0.9848 |
| 0.0298 | 86.0 | 8256 | 0.0455 | 0.8522 | 0.9515 | 0.8991 | 103 | 0.9074 | 0.8596 | 0.8829 | 171 | 0.9695 | 0.9695 | 0.9695 | 131 | 0.9118 | 0.9185 | 0.9151 | 0.9845 |
| 0.0311 | 87.0 | 8352 | 0.0451 | 0.8522 | 0.9515 | 0.8991 | 103 | 0.9018 | 0.8596 | 0.8802 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9073 | 0.9185 | 0.9129 | 0.9845 |
| 0.0302 | 88.0 | 8448 | 0.0441 | 0.8522 | 0.9515 | 0.8991 | 103 | 0.9018 | 0.8596 | 0.8802 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9073 | 0.9185 | 0.9129 | 0.9848 |
| 0.0295 | 89.0 | 8544 | 0.0448 | 0.8522 | 0.9515 | 0.8991 | 103 | 0.9018 | 0.8596 | 0.8802 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9073 | 0.9185 | 0.9129 | 0.9843 |
| 0.0299 | 90.0 | 8640 | 0.0448 | 0.8509 | 0.9417 | 0.8940 | 103 | 0.8963 | 0.8596 | 0.8776 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9049 | 0.9160 | 0.9104 | 0.9840 |
| 0.0301 | 91.0 | 8736 | 0.0455 | 0.8596 | 0.9515 | 0.9032 | 103 | 0.8963 | 0.8596 | 0.8776 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9073 | 0.9185 | 0.9129 | 0.9843 |
| 0.03 | 92.0 | 8832 | 0.0463 | 0.8596 | 0.9515 | 0.9032 | 103 | 0.9018 | 0.8596 | 0.8802 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9095 | 0.9185 | 0.9140 | 0.9843 |
| 0.0275 | 93.0 | 8928 | 0.0454 | 0.8596 | 0.9515 | 0.9032 | 103 | 0.8963 | 0.8596 | 0.8776 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9073 | 0.9185 | 0.9129 | 0.9843 |
| 0.0311 | 94.0 | 9024 | 0.0462 | 0.8596 | 0.9515 | 0.9032 | 103 | 0.9018 | 0.8596 | 0.8802 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9095 | 0.9185 | 0.9140 | 0.9840 |
| 0.0285 | 95.0 | 9120 | 0.0458 | 0.8596 | 0.9515 | 0.9032 | 103 | 0.9018 | 0.8596 | 0.8802 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9095 | 0.9185 | 0.9140 | 0.9843 |
| 0.0304 | 96.0 | 9216 | 0.0459 | 0.8596 | 0.9515 | 0.9032 | 103 | 0.9018 | 0.8596 | 0.8802 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9095 | 0.9185 | 0.9140 | 0.9843 |
| 0.0285 | 97.0 | 9312 | 0.0460 | 0.8522 | 0.9515 | 0.8991 | 103 | 0.9018 | 0.8596 | 0.8802 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9073 | 0.9185 | 0.9129 | 0.9840 |
| 0.0299 | 98.0 | 9408 | 0.0462 | 0.8522 | 0.9515 | 0.8991 | 103 | 0.9018 | 0.8596 | 0.8802 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9073 | 0.9185 | 0.9129 | 0.9840 |
| 0.0296 | 99.0 | 9504 | 0.0461 | 0.8596 | 0.9515 | 0.9032 | 103 | 0.9018 | 0.8596 | 0.8802 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9095 | 0.9185 | 0.9140 | 0.9843 |
| 0.0279 | 100.0 | 9600 | 0.0460 | 0.8596 | 0.9515 | 0.9032 | 103 | 0.9018 | 0.8596 | 0.8802 | 171 | 0.9621 | 0.9695 | 0.9658 | 131 | 0.9095 | 0.9185 | 0.9140 | 0.9843 |
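
The headline metrics at the top of this card correspond to the final checkpoint (epoch 100); validation loss reaches its minimum of 0.0410 earlier, at epoch 47, and the entity-level scores are essentially flat over the last fifteen or so epochs.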

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2