nerugm-lora-r8-0

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.1291
  • Location Precision: 0.7975
  • Location Recall: 0.9130
  • Location F1: 0.8514
  • Location Number: 69
  • Organization Precision: 0.6338
  • Organization Recall: 0.7759
  • Organization F1: 0.6977
  • Organization Number: 58
  • Person Precision: 0.8415
  • Person Recall: 0.9079
  • Person F1: 0.8734
  • Person Number: 152
  • Quantity Precision: 0.6216
  • Quantity Recall: 0.7667
  • Quantity F1: 0.6866
  • Quantity Number: 30
  • Time Precision: 0.7353
  • Time Recall: 0.8621
  • Time F1: 0.7937
  • Time Number: 29
  • Overall Precision: 0.7636
  • Overall Recall: 0.8698
  • Overall F1: 0.8133
  • Overall Accuracy: 0.9585
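
The card does not document how to load or run this checkpoint. The snippet below is a minimal, hedged sketch of NER inference with it, assuming the repository holds a PEFT LoRA adapter on top of indolem/indobert-base-uncased for token classification; the repository id (apwic/nerugm-lora-r8-0) and the label count (11, i.e. B-/I- tags for the five entity types above plus O) are inferred, not confirmed by the card.

```python
# Hedged usage sketch: assumes this repo is a PEFT LoRA adapter for token
# classification on indolem/indobert-base-uncased. The label count and the
# id-to-tag mapping are assumptions; adjust them to the adapter's actual config.
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer
from peft import PeftModel

BASE_ID = "indolem/indobert-base-uncased"
ADAPTER_ID = "apwic/nerugm-lora-r8-0"  # this repository (assumed id)

tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
base = AutoModelForTokenClassification.from_pretrained(
    BASE_ID, num_labels=11  # assumed: B-/I- for 5 entity types + O
)
model = PeftModel.from_pretrained(base, ADAPTER_ID)
model.eval()

text = "Joko Widodo mengunjungi Yogyakarta pada hari Senin."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred_ids = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
# The printed label ids are only meaningful once the training run's
# id-to-tag mapping is known.
print(list(zip(tokens, pred_ids)))
```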

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a hedged sketch of the corresponding training setup follows the list:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 64
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100.0
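
The snippet below is a minimal sketch, not the author's actual training script: it wires the hyperparameters listed above into transformers' TrainingArguments and pairs them with a LoRA rank-8 configuration suggested by the model name. The target modules, LoRA alpha, label count, evaluation strategy, and dataset handling are assumptions; the card does not document them.

```python
# Minimal training-setup sketch under stated assumptions; dataset loading and
# label handling are placeholders because the card does not document them.
from transformers import (AutoModelForTokenClassification, AutoTokenizer,
                          Trainer, TrainingArguments)
from peft import LoraConfig, get_peft_model

BASE_ID = "indolem/indobert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
model = AutoModelForTokenClassification.from_pretrained(BASE_ID, num_labels=11)  # assumed label count

# r=8 is read off the "lora-r8" part of the model name; alpha and target
# modules are common BERT-style defaults, not confirmed by the card.
peft_config = LoraConfig(
    task_type="TOKEN_CLS",
    r=8,
    lora_alpha=16,
    target_modules=["query", "value"],
    modules_to_save=["classifier"],
)
model = get_peft_model(model, peft_config)

args = TrainingArguments(
    output_dir="nerugm-lora-r8-0",
    learning_rate=5e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=64,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",   # Adam defaults betas=(0.9, 0.999), eps=1e-08 apply implicitly
    evaluation_strategy="epoch",  # assumption: the card reports per-epoch validation metrics
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=None,  # placeholder: the training data is not documented
    eval_dataset=None,   # placeholder
    tokenizer=tokenizer,
)
# trainer.train()  # uncomment once real datasets are supplied
```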

Training results

| Training Loss | Epoch | Step | Validation Loss | Location Precision | Location Recall | Location F1 | Location Number | Organization Precision | Organization Recall | Organization F1 | Organization Number | Person Precision | Person Recall | Person F1 | Person Number | Quantity Precision | Quantity Recall | Quantity F1 | Quantity Number | Time Precision | Time Recall | Time F1 | Time Number | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.2502 | 1.0 | 106 | 0.7255 | 0.0 | 0.0 | 0.0 | 69 | 0.0 | 0.0 | 0.0 | 58 | 0.0 | 0.0 | 0.0 | 152 | 0.0 | 0.0 | 0.0 | 30 | 0.0 | 0.0 | 0.0 | 29 | 0.0 | 0.0 | 0.0 | 0.8397 |
| 0.7059 | 2.0 | 212 | 0.6194 | 0.0 | 0.0 | 0.0 | 69 | 0.0 | 0.0 | 0.0 | 58 | 0.2 | 0.0066 | 0.0127 | 152 | 0.0 | 0.0 | 0.0 | 30 | 0.0 | 0.0 | 0.0 | 29 | 0.1667 | 0.0030 | 0.0058 | 0.8405 |
| 0.6143 | 3.0 | 318 | 0.5261 | 0.5 | 0.0145 | 0.0282 | 69 | 0.0 | 0.0 | 0.0 | 58 | 0.25 | 0.0395 | 0.0682 | 152 | 0.0 | 0.0 | 0.0 | 30 | 0.0 | 0.0 | 0.0 | 29 | 0.2414 | 0.0207 | 0.0381 | 0.8456 |
| 0.5231 | 4.0 | 424 | 0.4244 | 0.375 | 0.0435 | 0.0779 | 69 | 0.0 | 0.0 | 0.0 | 58 | 0.3962 | 0.2763 | 0.3256 | 152 | 0.0 | 0.0 | 0.0 | 30 | 0.3846 | 0.1724 | 0.2381 | 29 | 0.3846 | 0.1479 | 0.2137 | 0.8725 |
| 0.4203 | 5.0 | 530 | 0.3370 | 0.4634 | 0.2754 | 0.3455 | 69 | 0.2308 | 0.1034 | 0.1429 | 58 | 0.6341 | 0.6842 | 0.6582 | 152 | 0.0 | 0.0 | 0.0 | 30 | 0.6 | 0.5172 | 0.5556 | 29 | 0.5393 | 0.4260 | 0.4760 | 0.9055 |
| 0.3379 | 6.0 | 636 | 0.2737 | 0.5128 | 0.5797 | 0.5442 | 69 | 0.3673 | 0.3103 | 0.3364 | 58 | 0.6895 | 0.8618 | 0.7661 | 152 | 0.3636 | 0.4 | 0.3810 | 30 | 0.625 | 0.6897 | 0.6557 | 29 | 0.5785 | 0.6538 | 0.6139 | 0.9245 |
| 0.2785 | 7.0 | 742 | 0.2344 | 0.6235 | 0.7681 | 0.6883 | 69 | 0.4237 | 0.4310 | 0.4274 | 58 | 0.7166 | 0.8816 | 0.7906 | 152 | 0.3611 | 0.4333 | 0.3939 | 30 | 0.6774 | 0.7241 | 0.7 | 29 | 0.6181 | 0.7278 | 0.6685 | 0.9306 |
| 0.243 | 8.0 | 848 | 0.2089 | 0.6790 | 0.7971 | 0.7333 | 69 | 0.44 | 0.5690 | 0.4962 | 58 | 0.7068 | 0.8882 | 0.7872 | 152 | 0.3784 | 0.4667 | 0.4179 | 30 | 0.7 | 0.7241 | 0.7119 | 29 | 0.6232 | 0.7633 | 0.6862 | 0.9350 |
| 0.2201 | 9.0 | 954 | 0.1920 | 0.6824 | 0.8406 | 0.7532 | 69 | 0.4730 | 0.6034 | 0.5303 | 58 | 0.7143 | 0.8882 | 0.7918 | 152 | 0.4359 | 0.5667 | 0.4928 | 30 | 0.6774 | 0.7241 | 0.7 | 29 | 0.6364 | 0.7870 | 0.7037 | 0.9375 |
| 0.2013 | 10.0 | 1060 | 0.1749 | 0.7317 | 0.8696 | 0.7947 | 69 | 0.4730 | 0.6034 | 0.5303 | 58 | 0.7514 | 0.8947 | 0.8168 | 152 | 0.5 | 0.6 | 0.5455 | 30 | 0.6875 | 0.7586 | 0.7213 | 29 | 0.6691 | 0.8018 | 0.7295 | 0.9426 |
| 0.1916 | 11.0 | 1166 | 0.1710 | 0.7143 | 0.8696 | 0.7843 | 69 | 0.5270 | 0.6724 | 0.5909 | 58 | 0.7273 | 0.8947 | 0.8024 | 152 | 0.5135 | 0.6333 | 0.5672 | 30 | 0.6970 | 0.7931 | 0.7419 | 29 | 0.6675 | 0.8195 | 0.7357 | 0.9437 |
| 0.1808 | 12.0 | 1272 | 0.1595 | 0.7692 | 0.8696 | 0.8163 | 69 | 0.5479 | 0.6897 | 0.6107 | 58 | 0.7391 | 0.8947 | 0.8095 | 152 | 0.5429 | 0.6333 | 0.5846 | 30 | 0.7273 | 0.8276 | 0.7742 | 29 | 0.6923 | 0.8254 | 0.7530 | 0.9472 |
| 0.1696 | 13.0 | 1378 | 0.1539 | 0.7792 | 0.8696 | 0.8219 | 69 | 0.5417 | 0.6724 | 0.6 | 58 | 0.7459 | 0.8882 | 0.8108 | 152 | 0.5833 | 0.7 | 0.6364 | 30 | 0.7812 | 0.8621 | 0.8197 | 29 | 0.7035 | 0.8284 | 0.7609 | 0.9480 |
| 0.1658 | 14.0 | 1484 | 0.1547 | 0.7561 | 0.8986 | 0.8212 | 69 | 0.5412 | 0.7931 | 0.6434 | 58 | 0.7670 | 0.8882 | 0.8232 | 152 | 0.5405 | 0.6667 | 0.5970 | 30 | 0.7273 | 0.8276 | 0.7742 | 29 | 0.6949 | 0.8491 | 0.7643 | 0.9496 |
| 0.159 | 15.0 | 1590 | 0.1460 | 0.7848 | 0.8986 | 0.8378 | 69 | 0.5455 | 0.7241 | 0.6222 | 58 | 0.7803 | 0.8882 | 0.8308 | 152 | 0.5556 | 0.6667 | 0.6061 | 30 | 0.7273 | 0.8276 | 0.7742 | 29 | 0.7111 | 0.8373 | 0.7690 | 0.9524 |
| 0.1532 | 16.0 | 1696 | 0.1406 | 0.7922 | 0.8841 | 0.8356 | 69 | 0.5405 | 0.6897 | 0.6061 | 58 | 0.7849 | 0.8882 | 0.8333 | 152 | 0.5455 | 0.6 | 0.5714 | 30 | 0.6970 | 0.7931 | 0.7419 | 29 | 0.7121 | 0.8195 | 0.7620 | 0.9516 |
| 0.1509 | 17.0 | 1802 | 0.1486 | 0.7470 | 0.8986 | 0.8158 | 69 | 0.55 | 0.7586 | 0.6377 | 58 | 0.7714 | 0.8882 | 0.8257 | 152 | 0.6486 | 0.8 | 0.7164 | 30 | 0.6765 | 0.7931 | 0.7302 | 29 | 0.7042 | 0.8521 | 0.7711 | 0.9506 |
| 0.149 | 18.0 | 1908 | 0.1416 | 0.7654 | 0.8986 | 0.8267 | 69 | 0.5541 | 0.7069 | 0.6212 | 58 | 0.7861 | 0.8947 | 0.8369 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7193 | 0.8491 | 0.7788 | 0.9529 |
| 0.1443 | 19.0 | 2014 | 0.1380 | 0.7949 | 0.8986 | 0.8435 | 69 | 0.5833 | 0.7241 | 0.6462 | 58 | 0.7907 | 0.8947 | 0.8395 | 152 | 0.6486 | 0.8 | 0.7164 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7372 | 0.8550 | 0.7918 | 0.9547 |
| 0.1394 | 20.0 | 2120 | 0.1444 | 0.7326 | 0.9130 | 0.8129 | 69 | 0.5301 | 0.7586 | 0.6241 | 58 | 0.7771 | 0.8947 | 0.8318 | 152 | 0.6053 | 0.7667 | 0.6765 | 30 | 0.7879 | 0.8966 | 0.8387 | 29 | 0.7036 | 0.8639 | 0.7756 | 0.9511 |
| 0.1375 | 21.0 | 2226 | 0.1398 | 0.7561 | 0.8986 | 0.8212 | 69 | 0.5556 | 0.7759 | 0.6475 | 58 | 0.8 | 0.8947 | 0.8447 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7879 | 0.8966 | 0.8387 | 29 | 0.7264 | 0.8639 | 0.7892 | 0.9547 |
| 0.1335 | 22.0 | 2332 | 0.1366 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.5769 | 0.7759 | 0.6618 | 58 | 0.8 | 0.8947 | 0.8447 | 152 | 0.6111 | 0.7333 | 0.6667 | 30 | 0.8125 | 0.8966 | 0.8525 | 29 | 0.7392 | 0.8639 | 0.7967 | 0.9565 |
| 0.1346 | 23.0 | 2438 | 0.1391 | 0.7875 | 0.9130 | 0.8456 | 69 | 0.5867 | 0.7586 | 0.6617 | 58 | 0.7829 | 0.9013 | 0.8379 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.8125 | 0.8966 | 0.8525 | 29 | 0.7362 | 0.8669 | 0.7962 | 0.9547 |
| 0.1338 | 24.0 | 2544 | 0.1434 | 0.7529 | 0.9275 | 0.8312 | 69 | 0.6 | 0.7759 | 0.6767 | 58 | 0.7784 | 0.9013 | 0.8354 | 152 | 0.6486 | 0.8 | 0.7164 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7248 | 0.8728 | 0.7919 | 0.9534 |
| 0.1302 | 25.0 | 2650 | 0.1354 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.5844 | 0.7759 | 0.6667 | 58 | 0.8 | 0.8947 | 0.8447 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7374 | 0.8639 | 0.7956 | 0.9554 |
| 0.1262 | 26.0 | 2756 | 0.1411 | 0.7529 | 0.9275 | 0.8312 | 69 | 0.5844 | 0.7759 | 0.6667 | 58 | 0.7874 | 0.9013 | 0.8405 | 152 | 0.6486 | 0.8 | 0.7164 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7248 | 0.8728 | 0.7919 | 0.9542 |
| 0.1258 | 27.0 | 2862 | 0.1355 | 0.7683 | 0.9130 | 0.8344 | 69 | 0.6418 | 0.7414 | 0.688 | 58 | 0.7965 | 0.9013 | 0.8457 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7423 | 0.8609 | 0.7973 | 0.9552 |
| 0.1271 | 28.0 | 2968 | 0.1359 | 0.7590 | 0.9130 | 0.8289 | 69 | 0.5811 | 0.7414 | 0.6515 | 58 | 0.7953 | 0.8947 | 0.8421 | 152 | 0.6667 | 0.8 | 0.7273 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7312 | 0.8609 | 0.7908 | 0.9552 |
| 0.1223 | 29.0 | 3074 | 0.1336 | 0.7683 | 0.9130 | 0.8344 | 69 | 0.6324 | 0.7414 | 0.6825 | 58 | 0.8047 | 0.8947 | 0.8474 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7879 | 0.8966 | 0.8387 | 29 | 0.75 | 0.8609 | 0.8017 | 0.9572 |
| 0.1198 | 30.0 | 3180 | 0.1322 | 0.7875 | 0.9130 | 0.8456 | 69 | 0.5972 | 0.7414 | 0.6615 | 58 | 0.8095 | 0.8947 | 0.8500 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7647 | 0.8966 | 0.8254 | 29 | 0.7462 | 0.8609 | 0.7995 | 0.9570 |
| 0.1188 | 31.0 | 3286 | 0.1313 | 0.7805 | 0.9275 | 0.8477 | 69 | 0.6143 | 0.7414 | 0.6719 | 58 | 0.8 | 0.8947 | 0.8447 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7647 | 0.8966 | 0.8254 | 29 | 0.7449 | 0.8639 | 0.8 | 0.9570 |
| 0.1196 | 32.0 | 3392 | 0.1309 | 0.7778 | 0.9130 | 0.84 | 69 | 0.5890 | 0.7414 | 0.6565 | 58 | 0.8095 | 0.8947 | 0.8500 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7647 | 0.8966 | 0.8254 | 29 | 0.7423 | 0.8609 | 0.7973 | 0.9562 |
| 0.1191 | 33.0 | 3498 | 0.1288 | 0.7778 | 0.9130 | 0.84 | 69 | 0.6 | 0.7241 | 0.6562 | 58 | 0.8144 | 0.8947 | 0.8527 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7941 | 0.9310 | 0.8571 | 29 | 0.75 | 0.8609 | 0.8017 | 0.9570 |
| 0.1138 | 34.0 | 3604 | 0.1353 | 0.7805 | 0.9275 | 0.8477 | 69 | 0.5455 | 0.7241 | 0.6222 | 58 | 0.8047 | 0.8947 | 0.8474 | 152 | 0.6111 | 0.7333 | 0.6667 | 30 | 0.7647 | 0.8966 | 0.8254 | 29 | 0.7286 | 0.8580 | 0.7880 | 0.9554 |
| 0.1137 | 35.0 | 3710 | 0.1366 | 0.7711 | 0.9275 | 0.8421 | 69 | 0.5775 | 0.7069 | 0.6357 | 58 | 0.7965 | 0.9013 | 0.8457 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7647 | 0.8966 | 0.8254 | 29 | 0.7330 | 0.8609 | 0.7918 | 0.9552 |
| 0.1138 | 36.0 | 3816 | 0.1324 | 0.8 | 0.9275 | 0.8591 | 69 | 0.5811 | 0.7414 | 0.6515 | 58 | 0.8059 | 0.9013 | 0.8509 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7647 | 0.8966 | 0.8254 | 29 | 0.7418 | 0.8669 | 0.7995 | 0.9570 |
| 0.1151 | 37.0 | 3922 | 0.1325 | 0.7805 | 0.9275 | 0.8477 | 69 | 0.5972 | 0.7414 | 0.6615 | 58 | 0.8144 | 0.8947 | 0.8527 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7647 | 0.8966 | 0.8254 | 29 | 0.7449 | 0.8639 | 0.8 | 0.9577 |
| 0.1083 | 38.0 | 4028 | 0.1363 | 0.7619 | 0.9275 | 0.8366 | 69 | 0.5513 | 0.7414 | 0.6324 | 58 | 0.8 | 0.8947 | 0.8447 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7647 | 0.8966 | 0.8254 | 29 | 0.7246 | 0.8639 | 0.7881 | 0.9536 |
| 0.1098 | 39.0 | 4134 | 0.1288 | 0.8101 | 0.9275 | 0.8649 | 69 | 0.6471 | 0.7586 | 0.6984 | 58 | 0.8193 | 0.8947 | 0.8553 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7647 | 0.8966 | 0.8254 | 29 | 0.7650 | 0.8669 | 0.8128 | 0.9590 |
| 0.1087 | 40.0 | 4240 | 0.1278 | 0.8101 | 0.9275 | 0.8649 | 69 | 0.6377 | 0.7586 | 0.6929 | 58 | 0.8059 | 0.9013 | 0.8509 | 152 | 0.6111 | 0.7333 | 0.6667 | 30 | 0.7714 | 0.9310 | 0.8438 | 29 | 0.7558 | 0.8698 | 0.8088 | 0.9585 |
| 0.1075 | 41.0 | 4346 | 0.1270 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6567 | 0.7586 | 0.7040 | 58 | 0.8059 | 0.9013 | 0.8509 | 152 | 0.6111 | 0.7333 | 0.6667 | 30 | 0.7714 | 0.9310 | 0.8438 | 29 | 0.7571 | 0.8669 | 0.8083 | 0.9583 |
| 0.106 | 42.0 | 4452 | 0.1259 | 0.8 | 0.9275 | 0.8591 | 69 | 0.6615 | 0.7414 | 0.6992 | 58 | 0.8059 | 0.9013 | 0.8509 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7714 | 0.9310 | 0.8438 | 29 | 0.7617 | 0.8698 | 0.8122 | 0.9588 |
| 0.1079 | 43.0 | 4558 | 0.1328 | 0.7805 | 0.9275 | 0.8477 | 69 | 0.6301 | 0.7931 | 0.7023 | 58 | 0.8012 | 0.9013 | 0.8483 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7273 | 0.8276 | 0.7742 | 29 | 0.7443 | 0.8698 | 0.8022 | 0.9565 |
| 0.1072 | 44.0 | 4664 | 0.1249 | 0.8 | 0.9275 | 0.8591 | 69 | 0.6119 | 0.7069 | 0.6560 | 58 | 0.8012 | 0.9013 | 0.8483 | 152 | 0.6111 | 0.7333 | 0.6667 | 30 | 0.7714 | 0.9310 | 0.8438 | 29 | 0.7481 | 0.8609 | 0.8006 | 0.9577 |
| 0.1049 | 45.0 | 4770 | 0.1304 | 0.7778 | 0.9130 | 0.84 | 69 | 0.6338 | 0.7759 | 0.6977 | 58 | 0.8059 | 0.9013 | 0.8509 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7647 | 0.8966 | 0.8254 | 29 | 0.7481 | 0.8698 | 0.8044 | 0.9567 |
| 0.1033 | 46.0 | 4876 | 0.1285 | 0.8 | 0.9275 | 0.8591 | 69 | 0.6216 | 0.7931 | 0.6970 | 58 | 0.8059 | 0.9013 | 0.8509 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7714 | 0.9310 | 0.8438 | 29 | 0.7519 | 0.8787 | 0.8104 | 0.9585 |
| 0.1039 | 47.0 | 4982 | 0.1308 | 0.7901 | 0.9275 | 0.8533 | 69 | 0.625 | 0.7759 | 0.6923 | 58 | 0.8084 | 0.8882 | 0.8464 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7468 | 0.8639 | 0.8011 | 0.9572 |
| 0.0992 | 48.0 | 5088 | 0.1281 | 0.8077 | 0.9130 | 0.8571 | 69 | 0.6338 | 0.7759 | 0.6977 | 58 | 0.8059 | 0.9013 | 0.8509 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7532 | 0.8669 | 0.8061 | 0.9572 |
| 0.1023 | 49.0 | 5194 | 0.1259 | 0.8077 | 0.9130 | 0.8571 | 69 | 0.6301 | 0.7931 | 0.7023 | 58 | 0.8012 | 0.9013 | 0.8483 | 152 | 0.6667 | 0.8 | 0.7273 | 30 | 0.7714 | 0.9310 | 0.8438 | 29 | 0.7557 | 0.8787 | 0.8126 | 0.9577 |
| 0.103 | 50.0 | 5300 | 0.1216 | 0.8243 | 0.8841 | 0.8531 | 69 | 0.6308 | 0.7069 | 0.6667 | 58 | 0.8193 | 0.8947 | 0.8553 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7714 | 0.9310 | 0.8438 | 29 | 0.7660 | 0.8521 | 0.8067 | 0.9585 |
| 0.0995 | 51.0 | 5406 | 0.1284 | 0.7901 | 0.9275 | 0.8533 | 69 | 0.6111 | 0.7586 | 0.6769 | 58 | 0.8204 | 0.9013 | 0.8589 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7513 | 0.8669 | 0.8049 | 0.9575 |
| 0.0967 | 52.0 | 5512 | 0.1298 | 0.8 | 0.9275 | 0.8591 | 69 | 0.6301 | 0.7931 | 0.7023 | 58 | 0.8383 | 0.9211 | 0.8777 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7273 | 0.8276 | 0.7742 | 29 | 0.7635 | 0.8787 | 0.8171 | 0.9583 |
| 0.0999 | 53.0 | 5618 | 0.1234 | 0.8077 | 0.9130 | 0.8571 | 69 | 0.6269 | 0.7241 | 0.672 | 58 | 0.8036 | 0.8882 | 0.8437 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7714 | 0.9310 | 0.8438 | 29 | 0.7552 | 0.8580 | 0.8033 | 0.9583 |
| 0.0994 | 54.0 | 5724 | 0.1243 | 0.8077 | 0.9130 | 0.8571 | 69 | 0.6364 | 0.7241 | 0.6774 | 58 | 0.8144 | 0.8947 | 0.8527 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7714 | 0.9310 | 0.8438 | 29 | 0.7618 | 0.8609 | 0.8083 | 0.9583 |
| 0.0993 | 55.0 | 5830 | 0.1302 | 0.7901 | 0.9275 | 0.8533 | 69 | 0.6216 | 0.7931 | 0.6970 | 58 | 0.8225 | 0.9145 | 0.8660 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7538 | 0.8787 | 0.8115 | 0.9577 |
| 0.0968 | 56.0 | 5936 | 0.1253 | 0.7901 | 0.9275 | 0.8533 | 69 | 0.6462 | 0.7241 | 0.6829 | 58 | 0.8036 | 0.8882 | 0.8437 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7526 | 0.8550 | 0.8006 | 0.9572 |
| 0.0955 | 57.0 | 6042 | 0.1264 | 0.8 | 0.9275 | 0.8591 | 69 | 0.6377 | 0.7586 | 0.6929 | 58 | 0.8274 | 0.9145 | 0.8687 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7429 | 0.8966 | 0.8125 | 29 | 0.7609 | 0.8757 | 0.8143 | 0.9583 |
| 0.0939 | 58.0 | 6148 | 0.1252 | 0.8101 | 0.9275 | 0.8649 | 69 | 0.6389 | 0.7931 | 0.7077 | 58 | 0.8204 | 0.9013 | 0.8589 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7429 | 0.8966 | 0.8125 | 29 | 0.7609 | 0.8757 | 0.8143 | 0.9590 |
| 0.0939 | 59.0 | 6254 | 0.1213 | 0.8101 | 0.9275 | 0.8649 | 69 | 0.6418 | 0.7414 | 0.688 | 58 | 0.8036 | 0.8882 | 0.8437 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7714 | 0.9310 | 0.8438 | 29 | 0.7584 | 0.8639 | 0.8077 | 0.9580 |
| 0.0955 | 60.0 | 6360 | 0.1300 | 0.8 | 0.9275 | 0.8591 | 69 | 0.6324 | 0.7414 | 0.6825 | 58 | 0.8204 | 0.9013 | 0.8589 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7565 | 0.8639 | 0.8066 | 0.9580 |
| 0.0941 | 61.0 | 6466 | 0.1269 | 0.8 | 0.9275 | 0.8591 | 69 | 0.6522 | 0.7759 | 0.7087 | 58 | 0.8144 | 0.8947 | 0.8527 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7647 | 0.8966 | 0.8254 | 29 | 0.7617 | 0.8698 | 0.8122 | 0.9588 |
| 0.0948 | 62.0 | 6572 | 0.1268 | 0.8101 | 0.9275 | 0.8649 | 69 | 0.6471 | 0.7586 | 0.6984 | 58 | 0.8204 | 0.9013 | 0.8589 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7630 | 0.8669 | 0.8116 | 0.9585 |
| 0.0932 | 63.0 | 6678 | 0.1251 | 0.8101 | 0.9275 | 0.8649 | 69 | 0.6567 | 0.7586 | 0.7040 | 58 | 0.8214 | 0.9079 | 0.8625 | 152 | 0.6389 | 0.7667 | 0.6970 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7656 | 0.8698 | 0.8144 | 0.9593 |
| 0.0918 | 64.0 | 6784 | 0.1311 | 0.8 | 0.9275 | 0.8591 | 69 | 0.6377 | 0.7586 | 0.6929 | 58 | 0.8263 | 0.9079 | 0.8652 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7597 | 0.8698 | 0.8110 | 0.9583 |
| 0.0924 | 65.0 | 6890 | 0.1265 | 0.8 | 0.9275 | 0.8591 | 69 | 0.6429 | 0.7759 | 0.7031 | 58 | 0.8303 | 0.9013 | 0.8644 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7617 | 0.8698 | 0.8122 | 0.9588 |
| 0.0951 | 66.0 | 6996 | 0.1274 | 0.8 | 0.9275 | 0.8591 | 69 | 0.6429 | 0.7759 | 0.7031 | 58 | 0.8204 | 0.9013 | 0.8589 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7577 | 0.8698 | 0.8099 | 0.9583 |
| 0.0905 | 67.0 | 7102 | 0.1302 | 0.8 | 0.9275 | 0.8591 | 69 | 0.625 | 0.7759 | 0.6923 | 58 | 0.8313 | 0.9079 | 0.8679 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7584 | 0.8728 | 0.8116 | 0.9580 |
| 0.0907 | 68.0 | 7208 | 0.1307 | 0.8 | 0.9275 | 0.8591 | 69 | 0.6197 | 0.7586 | 0.6822 | 58 | 0.8373 | 0.9145 | 0.8742 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7603 | 0.8728 | 0.8127 | 0.9575 |
| 0.0922 | 69.0 | 7314 | 0.1293 | 0.8 | 0.9275 | 0.8591 | 69 | 0.6301 | 0.7931 | 0.7023 | 58 | 0.8313 | 0.9079 | 0.8679 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7590 | 0.8757 | 0.8132 | 0.9580 |
| 0.0895 | 70.0 | 7420 | 0.1219 | 0.8052 | 0.8986 | 0.8493 | 69 | 0.6667 | 0.7241 | 0.6942 | 58 | 0.7976 | 0.8816 | 0.8375 | 152 | 0.5946 | 0.7333 | 0.6567 | 30 | 0.7576 | 0.8621 | 0.8065 | 29 | 0.7540 | 0.8432 | 0.7961 | 0.9580 |
| 0.0896 | 71.0 | 7526 | 0.1266 | 0.8077 | 0.9130 | 0.8571 | 69 | 0.6338 | 0.7759 | 0.6977 | 58 | 0.8155 | 0.9013 | 0.8562 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7552 | 0.8669 | 0.8072 | 0.9580 |
| 0.0905 | 72.0 | 7632 | 0.1289 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.625 | 0.7759 | 0.6923 | 58 | 0.8263 | 0.9079 | 0.8652 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7558 | 0.8698 | 0.8088 | 0.9575 |
| 0.0879 | 73.0 | 7738 | 0.1252 | 0.8077 | 0.9130 | 0.8571 | 69 | 0.6471 | 0.7586 | 0.6984 | 58 | 0.8204 | 0.9013 | 0.8589 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7604 | 0.8639 | 0.8089 | 0.9580 |
| 0.0902 | 74.0 | 7844 | 0.1328 | 0.8 | 0.9275 | 0.8591 | 69 | 0.6164 | 0.7759 | 0.6870 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7609 | 0.8757 | 0.8143 | 0.9577 |
| 0.088 | 75.0 | 7950 | 0.1279 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6197 | 0.7586 | 0.6822 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7617 | 0.8698 | 0.8122 | 0.9583 |
| 0.0881 | 76.0 | 8056 | 0.1255 | 0.8077 | 0.9130 | 0.8571 | 69 | 0.6377 | 0.7586 | 0.6929 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7676 | 0.8698 | 0.8155 | 0.9593 |
| 0.0869 | 77.0 | 8162 | 0.1273 | 0.8077 | 0.9130 | 0.8571 | 69 | 0.6338 | 0.7759 | 0.6977 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7662 | 0.8728 | 0.8160 | 0.9588 |
| 0.09 | 78.0 | 8268 | 0.1268 | 0.8077 | 0.9130 | 0.8571 | 69 | 0.6286 | 0.7586 | 0.6875 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7647 | 0.8966 | 0.8254 | 29 | 0.7682 | 0.8728 | 0.8172 | 0.9590 |
| 0.0875 | 79.0 | 8374 | 0.1275 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6111 | 0.7586 | 0.6769 | 58 | 0.8364 | 0.9079 | 0.8707 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7571 | 0.8669 | 0.8083 | 0.9572 |
| 0.0879 | 80.0 | 8480 | 0.1292 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6111 | 0.7586 | 0.6769 | 58 | 0.8364 | 0.9079 | 0.8707 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7571 | 0.8669 | 0.8083 | 0.9575 |
| 0.0877 | 81.0 | 8586 | 0.1294 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.625 | 0.7759 | 0.6923 | 58 | 0.8364 | 0.9079 | 0.8707 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7597 | 0.8698 | 0.8110 | 0.9572 |
| 0.0845 | 82.0 | 8692 | 0.1288 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6286 | 0.7586 | 0.6875 | 58 | 0.8364 | 0.9079 | 0.8707 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7610 | 0.8669 | 0.8105 | 0.9583 |
| 0.0867 | 83.0 | 8798 | 0.1259 | 0.7949 | 0.8986 | 0.8435 | 69 | 0.6286 | 0.7586 | 0.6875 | 58 | 0.8415 | 0.9079 | 0.8734 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7647 | 0.8966 | 0.8254 | 29 | 0.7650 | 0.8669 | 0.8128 | 0.9588 |
| 0.0861 | 84.0 | 8904 | 0.1289 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6286 | 0.7586 | 0.6875 | 58 | 0.8415 | 0.9079 | 0.8734 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7630 | 0.8669 | 0.8116 | 0.9585 |
| 0.0896 | 85.0 | 9010 | 0.1306 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6338 | 0.7759 | 0.6977 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7642 | 0.8728 | 0.8149 | 0.9585 |
| 0.0863 | 86.0 | 9116 | 0.1287 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6286 | 0.7586 | 0.6875 | 58 | 0.8364 | 0.9079 | 0.8707 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7610 | 0.8669 | 0.8105 | 0.9583 |
| 0.0861 | 87.0 | 9222 | 0.1303 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.625 | 0.7759 | 0.6923 | 58 | 0.8364 | 0.9079 | 0.8707 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7597 | 0.8698 | 0.8110 | 0.9577 |
| 0.0844 | 88.0 | 9328 | 0.1293 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6429 | 0.7759 | 0.7031 | 58 | 0.8424 | 0.9145 | 0.8770 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7662 | 0.8728 | 0.8160 | 0.9593 |
| 0.0858 | 89.0 | 9434 | 0.1283 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6429 | 0.7759 | 0.7031 | 58 | 0.8364 | 0.9079 | 0.8707 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7636 | 0.8698 | 0.8133 | 0.9590 |
| 0.0847 | 90.0 | 9540 | 0.1291 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6338 | 0.7759 | 0.6977 | 58 | 0.8364 | 0.9079 | 0.8707 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7617 | 0.8698 | 0.8122 | 0.9588 |
| 0.0854 | 91.0 | 9646 | 0.1283 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6429 | 0.7759 | 0.7031 | 58 | 0.8364 | 0.9079 | 0.8707 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7647 | 0.8966 | 0.8254 | 29 | 0.7662 | 0.8728 | 0.8160 | 0.9590 |
| 0.0832 | 92.0 | 9752 | 0.1289 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6429 | 0.7759 | 0.7031 | 58 | 0.8364 | 0.9079 | 0.8707 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7647 | 0.8966 | 0.8254 | 29 | 0.7662 | 0.8728 | 0.8160 | 0.9593 |
| 0.0849 | 93.0 | 9858 | 0.1296 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6338 | 0.7759 | 0.6977 | 58 | 0.8364 | 0.9079 | 0.8707 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7617 | 0.8698 | 0.8122 | 0.9588 |
| 0.0853 | 94.0 | 9964 | 0.1278 | 0.8077 | 0.9130 | 0.8571 | 69 | 0.6429 | 0.7759 | 0.7031 | 58 | 0.8415 | 0.9079 | 0.8734 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7676 | 0.8698 | 0.8155 | 0.9588 |
| 0.0858 | 95.0 | 10070 | 0.1280 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6429 | 0.7759 | 0.7031 | 58 | 0.8415 | 0.9079 | 0.8734 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7656 | 0.8698 | 0.8144 | 0.9588 |
| 0.0838 | 96.0 | 10176 | 0.1294 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6338 | 0.7759 | 0.6977 | 58 | 0.8364 | 0.9079 | 0.8707 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7617 | 0.8698 | 0.8122 | 0.9588 |
| 0.0868 | 97.0 | 10282 | 0.1290 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6429 | 0.7759 | 0.7031 | 58 | 0.8415 | 0.9079 | 0.8734 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7656 | 0.8698 | 0.8144 | 0.9588 |
| 0.081 | 98.0 | 10388 | 0.1292 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6338 | 0.7759 | 0.6977 | 58 | 0.8415 | 0.9079 | 0.8734 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7636 | 0.8698 | 0.8133 | 0.9585 |
| 0.0834 | 99.0 | 10494 | 0.1290 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6429 | 0.7759 | 0.7031 | 58 | 0.8415 | 0.9079 | 0.8734 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7656 | 0.8698 | 0.8144 | 0.9588 |
| 0.0835 | 100.0 | 10600 | 0.1291 | 0.7975 | 0.9130 | 0.8514 | 69 | 0.6338 | 0.7759 | 0.6977 | 58 | 0.8415 | 0.9079 | 0.8734 | 152 | 0.6216 | 0.7667 | 0.6866 | 30 | 0.7353 | 0.8621 | 0.7937 | 29 | 0.7636 | 0.8698 | 0.8133 | 0.9585 |
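
The per-entity precision, recall, F1, and support ("Number") figures in the table above are the kind of span-level metrics produced by seqeval. The exact evaluation code is not included in the card, so the snippet below is only an illustrative sketch using hypothetical gold and predicted IOB2 tag sequences.

```python
# Illustration of span-level NER metrics of the kind reported above, computed
# with seqeval on hypothetical gold/predicted IOB2 sequences.
from seqeval.metrics import accuracy_score, classification_report

y_true = [["B-PERSON", "I-PERSON", "O", "B-LOCATION", "O", "B-TIME"]]
y_pred = [["B-PERSON", "I-PERSON", "O", "B-LOCATION", "O", "O"]]

# Per-entity precision, recall, F1, and support ("Number" in the table above).
print(classification_report(y_true, y_pred, digits=4))
# Token-level accuracy, analogous to "Overall Accuracy".
print("accuracy:", accuracy_score(y_true, y_pred))
```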

Framework versions

  • Transformers 4.39.3
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.1
  • Tokenizers 0.15.2