nerui-lora-r8-0

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0463
  • Location Precision: 0.8462
  • Location Recall: 0.9362
  • Location F1: 0.8889
  • Location Number: 94
  • Organization Precision: 0.8667
  • Organization Recall: 0.8563
  • Organization F1: 0.8614
  • Organization Number: 167
  • Person Precision: 1.0
  • Person Recall: 0.9854
  • Person F1: 0.9926
  • Person Number: 137
  • Overall Precision: 0.9059
  • Overall Recall: 0.9196
  • Overall F1: 0.9127
  • Overall Accuracy: 0.9848
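Each F1 value above is the harmonic mean of the corresponding precision and recall, so the reported scores can be checked directly. A minimal check in plain Python, using the overall and Person scores from the list above:

```python
def f1(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall (the F1 score)."""
    return 2 * precision * recall / (precision + recall)

# Overall: precision 0.9059, recall 0.9196
print(round(f1(0.9059, 0.9196), 4))  # 0.9127

# Person: precision 1.0, recall 0.9854
print(round(f1(1.0, 0.9854), 4))     # 0.9926
```

The "Number" fields (94 + 167 + 137 = 398) are the entity counts per type in the evaluation set; the overall scores are computed across all 398 entities.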

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
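The training script itself is not published with this card, but the hyperparameters above map directly onto Hugging Face `TrainingArguments`. A sketch (the `output_dir` name is a placeholder; the Adam betas and epsilon shown are also the library defaults):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder,
# since the actual training script for this card is not published.
args = TrainingArguments(
    output_dir="nerui-lora-r8-0",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100.0,
)
```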

Training results

(Columns: training loss, epoch, step, validation loss; then precision/recall/F1/count for Location, Organization, and Person; then overall precision/recall/F1/accuracy.)
Train-Loss Epoch Step Val-Loss Loc-Prec Loc-Rec Loc-F1 Loc-N Org-Prec Org-Rec Org-F1 Org-N Per-Prec Per-Rec Per-F1 Per-N Overall-Prec Overall-Rec Overall-F1 Overall-Acc
1.1434 1.0 96 0.7069 0.0 0.0 0.0 94 0.0 0.0 0.0 167 0.0 0.0 0.0 137 0.0 0.0 0.0 0.8343
0.6699 2.0 192 0.5760 0.0 0.0 0.0 94 1.0 0.0060 0.0119 167 0.0 0.0 0.0 137 0.25 0.0025 0.0050 0.8348
0.5654 3.0 288 0.4641 0.0 0.0 0.0 94 0.4118 0.0419 0.0761 167 0.2414 0.0511 0.0843 137 0.3043 0.0352 0.0631 0.8420
0.4481 4.0 384 0.3466 0.2353 0.0426 0.0721 94 0.3578 0.2335 0.2826 167 0.3774 0.4380 0.4054 137 0.3614 0.2588 0.3016 0.8793
0.3376 5.0 480 0.2613 0.4058 0.2979 0.3436 94 0.5105 0.5808 0.5434 167 0.5081 0.6861 0.5839 137 0.4932 0.5503 0.5202 0.9202
0.2611 6.0 576 0.2025 0.5909 0.5532 0.5714 94 0.5588 0.6826 0.6146 167 0.6905 0.8467 0.7607 137 0.6130 0.7085 0.6573 0.9406
0.2071 7.0 672 0.1615 0.7021 0.7021 0.7021 94 0.6649 0.7605 0.7095 167 0.8224 0.9124 0.8651 137 0.7277 0.7990 0.7617 0.9555
0.1767 8.0 768 0.1337 0.7872 0.7872 0.7872 94 0.7120 0.7844 0.7464 167 0.9306 0.9781 0.9537 137 0.8033 0.8518 0.8268 0.9644
0.1601 9.0 864 0.1165 0.7980 0.8404 0.8187 94 0.7351 0.8144 0.7727 167 0.9306 0.9781 0.9537 137 0.8154 0.8769 0.8450 0.9671
0.1406 10.0 960 0.1041 0.7573 0.8298 0.7919 94 0.7816 0.8144 0.7977 167 0.9371 0.9781 0.9571 137 0.8286 0.8744 0.8509 0.9693
0.1283 11.0 1056 0.0951 0.8021 0.8191 0.8105 94 0.7865 0.8383 0.8116 167 0.9371 0.9781 0.9571 137 0.8417 0.8819 0.8613 0.9704
0.1229 12.0 1152 0.0895 0.8019 0.9043 0.8500 94 0.8 0.8383 0.8187 167 0.9375 0.9854 0.9609 137 0.8471 0.9045 0.8748 0.9715
0.1116 13.0 1248 0.0831 0.83 0.8830 0.8557 94 0.8314 0.8563 0.8437 167 0.9371 0.9781 0.9571 137 0.8675 0.9045 0.8856 0.9743
0.1077 14.0 1344 0.0769 0.8571 0.8936 0.875 94 0.8409 0.8862 0.8630 167 0.9504 0.9781 0.9640 137 0.8819 0.9196 0.9004 0.9760
0.1045 15.0 1440 0.0758 0.8333 0.9043 0.8673 94 0.8430 0.8683 0.8555 167 0.9371 0.9781 0.9571 137 0.8729 0.9146 0.8933 0.9760
0.1 16.0 1536 0.0753 0.8365 0.9255 0.8788 94 0.8111 0.8743 0.8415 167 0.9437 0.9781 0.9606 137 0.8615 0.9221 0.8908 0.9746
0.0961 17.0 1632 0.0690 0.8586 0.9043 0.8808 94 0.8563 0.8922 0.8739 167 0.9571 0.9781 0.9675 137 0.8910 0.9246 0.9075 0.9785
0.0981 18.0 1728 0.0676 0.86 0.9149 0.8866 94 0.8523 0.8982 0.8746 167 0.9504 0.9781 0.9640 137 0.8873 0.9296 0.9080 0.9782
0.0916 19.0 1824 0.0653 0.8333 0.9043 0.8673 94 0.8647 0.8802 0.8724 167 0.9640 0.9781 0.9710 137 0.8905 0.9196 0.9048 0.9790
0.0899 20.0 1920 0.0637 0.8586 0.9043 0.8808 94 0.8563 0.8922 0.8739 167 0.9640 0.9781 0.9710 137 0.8932 0.9246 0.9086 0.9790
0.0856 21.0 2016 0.0656 0.8113 0.9149 0.8600 94 0.8580 0.8683 0.8631 167 0.9571 0.9781 0.9675 137 0.8795 0.9171 0.8979 0.9773
0.0844 22.0 2112 0.0621 0.8416 0.9043 0.8718 94 0.8563 0.8922 0.8739 167 0.9571 0.9781 0.9675 137 0.8867 0.9246 0.9053 0.9782
0.0816 23.0 2208 0.0608 0.85 0.9043 0.8763 94 0.8647 0.8802 0.8724 167 0.9571 0.9781 0.9675 137 0.8927 0.9196 0.9059 0.9798
0.0803 24.0 2304 0.0591 0.8586 0.9043 0.8808 94 0.8671 0.8982 0.8824 167 0.9571 0.9781 0.9675 137 0.8956 0.9271 0.9111 0.9796
0.0793 25.0 2400 0.0577 0.85 0.9043 0.8763 94 0.8824 0.8982 0.8902 167 0.9710 0.9781 0.9745 137 0.9044 0.9271 0.9156 0.9818
0.0744 26.0 2496 0.0576 0.8529 0.9255 0.8878 94 0.8706 0.8862 0.8783 167 0.9710 0.9781 0.9745 137 0.9 0.9271 0.9134 0.9818
0.0761 27.0 2592 0.0571 0.8416 0.9043 0.8718 94 0.8757 0.8862 0.8810 167 0.9640 0.9781 0.9710 137 0.8973 0.9221 0.9095 0.9807
0.0724 28.0 2688 0.0559 0.8586 0.9043 0.8808 94 0.8655 0.8862 0.8757 167 0.9710 0.9781 0.9745 137 0.8995 0.9221 0.9107 0.9809
0.071 29.0 2784 0.0542 0.8687 0.9149 0.8912 94 0.8655 0.8862 0.8757 167 0.9783 0.9854 0.9818 137 0.9044 0.9271 0.9156 0.9818
0.0705 30.0 2880 0.0549 0.8462 0.9362 0.8889 94 0.8690 0.8743 0.8716 167 0.9854 0.9854 0.9854 137 0.9022 0.9271 0.9145 0.9818
0.0702 31.0 2976 0.0517 0.8687 0.9149 0.8912 94 0.8817 0.8922 0.8869 167 1.0 0.9854 0.9926 137 0.9181 0.9296 0.9238 0.9834
0.065 32.0 3072 0.0532 0.8396 0.9468 0.89 94 0.8951 0.8683 0.8815 167 0.9926 0.9854 0.9890 137 0.9134 0.9271 0.9202 0.9826
0.0639 33.0 3168 0.0533 0.8286 0.9255 0.8744 94 0.8780 0.8623 0.8701 167 0.9926 0.9854 0.9890 137 0.9037 0.9196 0.9116 0.9815
0.0642 34.0 3264 0.0520 0.8529 0.9255 0.8878 94 0.875 0.8802 0.8776 167 0.9926 0.9854 0.9890 137 0.9089 0.9271 0.9179 0.9820
0.0652 35.0 3360 0.0518 0.8515 0.9149 0.8821 94 0.8690 0.8743 0.8716 167 0.9926 0.9854 0.9890 137 0.9062 0.9221 0.9141 0.9815
0.0627 36.0 3456 0.0533 0.87 0.9255 0.8969 94 0.8655 0.8862 0.8757 167 0.9854 0.9854 0.9854 137 0.9069 0.9296 0.9181 0.9818
0.0606 37.0 3552 0.0503 0.8878 0.9255 0.9062 94 0.8698 0.8802 0.8750 167 0.9926 0.9854 0.9890 137 0.9156 0.9271 0.9213 0.9826
0.0611 38.0 3648 0.0497 0.87 0.9255 0.8969 94 0.8848 0.8743 0.8795 167 0.9854 0.9854 0.9854 137 0.9154 0.9246 0.92 0.9829
0.0645 39.0 3744 0.0511 0.8431 0.9149 0.8776 94 0.8780 0.8623 0.8701 167 0.9926 0.9854 0.9890 137 0.9080 0.9171 0.9125 0.9823
0.061 40.0 3840 0.0487 0.8687 0.9149 0.8912 94 0.8765 0.8922 0.8843 167 1.0 0.9854 0.9926 137 0.9158 0.9296 0.9227 0.9840
0.0591 41.0 3936 0.0491 0.8515 0.9149 0.8821 94 0.8802 0.8802 0.8802 167 1.0 0.9854 0.9926 137 0.9132 0.9246 0.9189 0.9834
0.058 42.0 4032 0.0480 0.8687 0.9149 0.8912 94 0.8757 0.8862 0.8810 167 1.0 0.9854 0.9926 137 0.9156 0.9271 0.9213 0.9840
0.0587 43.0 4128 0.0494 0.8350 0.9149 0.8731 94 0.8720 0.8563 0.8640 167 1.0 0.9854 0.9926 137 0.9055 0.9146 0.91 0.9820
0.0562 44.0 4224 0.0482 0.8515 0.9149 0.8821 94 0.8788 0.8683 0.8735 167 1.0 0.9854 0.9926 137 0.9127 0.9196 0.9161 0.9829
0.0565 45.0 4320 0.0471 0.8529 0.9255 0.8878 94 0.8795 0.8743 0.8769 167 1.0 0.9854 0.9926 137 0.9132 0.9246 0.9189 0.9837
0.0541 46.0 4416 0.0482 0.8365 0.9255 0.8788 94 0.8795 0.8743 0.8769 167 1.0 0.9854 0.9926 137 0.9086 0.9246 0.9166 0.9831
0.0547 47.0 4512 0.0487 0.8350 0.9149 0.8731 94 0.8720 0.8563 0.8640 167 1.0 0.9854 0.9926 137 0.9055 0.9146 0.91 0.9823
0.0537 48.0 4608 0.0480 0.8269 0.9149 0.8687 94 0.8659 0.8503 0.8580 167 1.0 0.9854 0.9926 137 0.9007 0.9121 0.9064 0.9829
0.0525 49.0 4704 0.0477 0.8416 0.9043 0.8718 94 0.8882 0.8563 0.8720 167 1.0 0.9854 0.9926 137 0.9144 0.9121 0.9132 0.9826
0.0513 50.0 4800 0.0472 0.86 0.9149 0.8866 94 0.8596 0.8802 0.8698 167 1.0 0.9854 0.9926 137 0.9064 0.9246 0.9154 0.9845
0.0507 51.0 4896 0.0481 0.8286 0.9255 0.8744 94 0.875 0.8383 0.8563 167 1.0 0.9854 0.9926 137 0.905 0.9095 0.9073 0.9820
0.0499 52.0 4992 0.0472 0.87 0.9255 0.8969 94 0.8757 0.8862 0.8810 167 1.0 0.9854 0.9926 137 0.9158 0.9296 0.9227 0.9837
0.0519 53.0 5088 0.0471 0.8614 0.9255 0.8923 94 0.8743 0.8743 0.8743 167 1.0 0.9854 0.9926 137 0.9132 0.9246 0.9189 0.9840
0.0523 54.0 5184 0.0483 0.8286 0.9255 0.8744 94 0.8545 0.8443 0.8494 167 1.0 0.9854 0.9926 137 0.8963 0.9121 0.9041 0.9826
0.0507 55.0 5280 0.0465 0.8447 0.9255 0.8832 94 0.8614 0.8563 0.8589 167 1.0 0.9854 0.9926 137 0.9035 0.9171 0.9102 0.9831
0.0506 56.0 5376 0.0465 0.8447 0.9255 0.8832 94 0.8614 0.8563 0.8589 167 1.0 0.9854 0.9926 137 0.9035 0.9171 0.9102 0.9831
0.0504 57.0 5472 0.0475 0.8208 0.9255 0.8700 94 0.8452 0.8503 0.8478 167 1.0 0.9854 0.9926 137 0.8900 0.9146 0.9021 0.9831
0.0484 58.0 5568 0.0462 0.8302 0.9362 0.88 94 0.8659 0.8503 0.8580 167 1.0 0.9854 0.9926 137 0.9012 0.9171 0.9091 0.9837
0.0487 59.0 5664 0.0457 0.8447 0.9255 0.8832 94 0.8727 0.8623 0.8675 167 1.0 0.9854 0.9926 137 0.9082 0.9196 0.9139 0.9837
0.0463 60.0 5760 0.0475 0.8365 0.9255 0.8788 94 0.8623 0.8623 0.8623 167 1.0 0.9854 0.9926 137 0.9015 0.9196 0.9104 0.9848
0.0462 61.0 5856 0.0469 0.8529 0.9255 0.8878 94 0.8655 0.8862 0.8757 167 1.0 0.9854 0.9926 137 0.9069 0.9296 0.9181 0.9848
0.0497 62.0 5952 0.0469 0.8544 0.9362 0.8934 94 0.8521 0.8623 0.8571 167 1.0 0.9854 0.9926 137 0.9017 0.9221 0.9118 0.9845
0.0465 63.0 6048 0.0469 0.8515 0.9149 0.8821 94 0.8683 0.8683 0.8683 167 1.0 0.9854 0.9926 137 0.9082 0.9196 0.9139 0.9848
0.0468 64.0 6144 0.0470 0.86 0.9149 0.8866 94 0.8841 0.8683 0.8761 167 1.0 0.9854 0.9926 137 0.9173 0.9196 0.9184 0.9843
0.0455 65.0 6240 0.0467 0.8462 0.9362 0.8889 94 0.8675 0.8623 0.8649 167 1.0 0.9854 0.9926 137 0.9062 0.9221 0.9141 0.9845
0.0456 66.0 6336 0.0463 0.8431 0.9149 0.8776 94 0.8712 0.8503 0.8606 167 1.0 0.9854 0.9926 137 0.9075 0.9121 0.9098 0.9834
0.0436 67.0 6432 0.0457 0.8365 0.9255 0.8788 94 0.8773 0.8563 0.8667 167 1.0 0.9854 0.9926 137 0.9080 0.9171 0.9125 0.9837
0.0442 68.0 6528 0.0464 0.8365 0.9255 0.8788 94 0.8720 0.8563 0.8640 167 1.0 0.9854 0.9926 137 0.9057 0.9171 0.9114 0.9837
0.0463 69.0 6624 0.0463 0.8447 0.9255 0.8832 94 0.8720 0.8563 0.8640 167 1.0 0.9854 0.9926 137 0.9080 0.9171 0.9125 0.9840
0.0445 70.0 6720 0.0457 0.8529 0.9255 0.8878 94 0.8720 0.8563 0.8640 167 1.0 0.9854 0.9926 137 0.9102 0.9171 0.9136 0.9840
0.0456 71.0 6816 0.0474 0.8462 0.9362 0.8889 94 0.8788 0.8683 0.8735 167 1.0 0.9854 0.9926 137 0.9109 0.9246 0.9177 0.9851
0.0473 72.0 6912 0.0479 0.8381 0.9362 0.8844 94 0.8659 0.8503 0.8580 167 1.0 0.9854 0.9926 137 0.9035 0.9171 0.9102 0.9837
0.0434 73.0 7008 0.0475 0.8381 0.9362 0.8844 94 0.8712 0.8503 0.8606 167 1.0 0.9854 0.9926 137 0.9057 0.9171 0.9114 0.9840
0.042 74.0 7104 0.0463 0.8462 0.9362 0.8889 94 0.8765 0.8503 0.8632 167 1.0 0.9854 0.9926 137 0.9102 0.9171 0.9136 0.9837
0.0438 75.0 7200 0.0463 0.8462 0.9362 0.8889 94 0.8765 0.8503 0.8632 167 1.0 0.9854 0.9926 137 0.9102 0.9171 0.9136 0.9837
0.0437 76.0 7296 0.0459 0.8462 0.9362 0.8889 94 0.8623 0.8623 0.8623 167 1.0 0.9854 0.9926 137 0.9039 0.9221 0.9129 0.9843
0.0455 77.0 7392 0.0469 0.8381 0.9362 0.8844 94 0.8827 0.8563 0.8693 167 1.0 0.9854 0.9926 137 0.9104 0.9196 0.9150 0.9840
0.0426 78.0 7488 0.0467 0.8381 0.9362 0.8844 94 0.8727 0.8623 0.8675 167 1.0 0.9854 0.9926 137 0.9062 0.9221 0.9141 0.9848
0.043 79.0 7584 0.0457 0.8381 0.9362 0.8844 94 0.8735 0.8683 0.8709 167 1.0 0.9854 0.9926 137 0.9064 0.9246 0.9154 0.9854
0.0435 80.0 7680 0.0462 0.8381 0.9362 0.8844 94 0.8727 0.8623 0.8675 167 1.0 0.9854 0.9926 137 0.9062 0.9221 0.9141 0.9851
0.0411 81.0 7776 0.0461 0.8381 0.9362 0.8844 94 0.8606 0.8503 0.8554 167 1.0 0.9854 0.9926 137 0.9012 0.9171 0.9091 0.9843
0.0421 82.0 7872 0.0458 0.8544 0.9362 0.8934 94 0.8720 0.8563 0.8640 167 1.0 0.9854 0.9926 137 0.9104 0.9196 0.9150 0.9843
0.0416 83.0 7968 0.0462 0.8381 0.9362 0.8844 94 0.8773 0.8563 0.8667 167 1.0 0.9854 0.9926 137 0.9082 0.9196 0.9139 0.9843
0.0412 84.0 8064 0.0461 0.8462 0.9362 0.8889 94 0.8788 0.8683 0.8735 167 1.0 0.9854 0.9926 137 0.9109 0.9246 0.9177 0.9851
0.0428 85.0 8160 0.0465 0.8462 0.9362 0.8889 94 0.8773 0.8563 0.8667 167 1.0 0.9854 0.9926 137 0.9104 0.9196 0.9150 0.9845
0.0434 86.0 8256 0.0467 0.8381 0.9362 0.8844 94 0.8720 0.8563 0.8640 167 1.0 0.9854 0.9926 137 0.9059 0.9196 0.9127 0.9840
0.0411 87.0 8352 0.0466 0.8381 0.9362 0.8844 94 0.8720 0.8563 0.8640 167 1.0 0.9854 0.9926 137 0.9059 0.9196 0.9127 0.9840
0.0436 88.0 8448 0.0467 0.8381 0.9362 0.8844 94 0.8780 0.8623 0.8701 167 1.0 0.9854 0.9926 137 0.9084 0.9221 0.9152 0.9848
0.0413 89.0 8544 0.0460 0.8544 0.9362 0.8934 94 0.8795 0.8743 0.8769 167 1.0 0.9854 0.9926 137 0.9134 0.9271 0.9202 0.9854
0.0401 90.0 8640 0.0467 0.8462 0.9362 0.8889 94 0.8675 0.8623 0.8649 167 1.0 0.9854 0.9926 137 0.9062 0.9221 0.9141 0.9848
0.0421 91.0 8736 0.0467 0.8462 0.9362 0.8889 94 0.8780 0.8623 0.8701 167 1.0 0.9854 0.9926 137 0.9107 0.9221 0.9164 0.9845
0.0407 92.0 8832 0.0462 0.8462 0.9362 0.8889 94 0.8773 0.8563 0.8667 167 1.0 0.9854 0.9926 137 0.9104 0.9196 0.9150 0.9845
0.0449 93.0 8928 0.0463 0.8462 0.9362 0.8889 94 0.8773 0.8563 0.8667 167 1.0 0.9854 0.9926 137 0.9104 0.9196 0.9150 0.9845
0.0397 94.0 9024 0.0462 0.8381 0.9362 0.8844 94 0.8667 0.8563 0.8614 167 1.0 0.9854 0.9926 137 0.9037 0.9196 0.9116 0.9845
0.0417 95.0 9120 0.0463 0.8381 0.9362 0.8844 94 0.8667 0.8563 0.8614 167 1.0 0.9854 0.9926 137 0.9037 0.9196 0.9116 0.9845
0.0402 96.0 9216 0.0465 0.8381 0.9362 0.8844 94 0.8780 0.8623 0.8701 167 1.0 0.9854 0.9926 137 0.9084 0.9221 0.9152 0.9848
0.0422 97.0 9312 0.0464 0.8462 0.9362 0.8889 94 0.8720 0.8563 0.8640 167 1.0 0.9854 0.9926 137 0.9082 0.9196 0.9139 0.9851
0.0417 98.0 9408 0.0463 0.8462 0.9362 0.8889 94 0.8720 0.8563 0.8640 167 1.0 0.9854 0.9926 137 0.9082 0.9196 0.9139 0.9851
0.0409 99.0 9504 0.0463 0.8462 0.9362 0.8889 94 0.8667 0.8563 0.8614 167 1.0 0.9854 0.9926 137 0.9059 0.9196 0.9127 0.9848
0.0404 100.0 9600 0.0463 0.8462 0.9362 0.8889 94 0.8667 0.8563 0.8614 167 1.0 0.9854 0.9926 137 0.9059 0.9196 0.9127 0.9848

Framework versions

  • Transformers 4.39.3
  • PyTorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2
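To reproduce this environment, the listed versions can be pinned at install time. A sketch; note that the `+cu121` PyTorch build is served from the pytorch.org wheel index, so a plain PyPI install may resolve a different build:

```shell
# Pin the library versions listed above.
pip install "transformers==4.39.3" "datasets==2.19.1" "tokenizers==0.15.2"
# The CUDA 12.1 build of torch 2.3.0 comes from the PyTorch wheel index.
pip install "torch==2.3.0" --index-url https://download.pytorch.org/whl/cu121
```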

Model tree for apwic/nerui-lora-r8-0
