---
license: mit
base_model: BAAI/bge-base-en-v1.5
tags:
- generated_from_trainer
model-index:
- name: IKI-Category-multilabel_bge
  results: []
---

# IKI-Category-multilabel_bge

This model is a fine-tuned version of [BAAI/bge-base-en-v1.5](https://huggingface.co./BAAI/bge-base-en-v1.5) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4541
- Precision-micro: 0.75
- Precision-samples: 0.7708
- Precision-weighted: 0.7517
- Recall-micro: 0.7880
- Recall-samples: 0.7858
- Recall-weighted: 0.7880
- F1-micro: 0.7685
- F1-samples: 0.7537
- F1-weighted: 0.7615

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 4.5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- num_epochs: 15

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision-micro | Precision-samples | Precision-weighted | Recall-micro | Recall-samples | Recall-weighted | F1-micro | F1-samples | F1-weighted |
|:-------------:|:-----:|:----:|:---------------:|:---------------:|:-----------------:|:------------------:|:------------:|:--------------:|:---------------:|:--------:|:----------:|:-----------:|
| 0.8999 | 0.99 | 94 | 0.8742 | 0.3889 | 0.0272 | 0.1308 | 0.0169 | 0.0188 | 0.0169 | 0.0323 | 0.0202 | 0.0280 |
| 0.7377 | 2.0 | 189 | 0.6770 | 0.4727 | 0.4996 | 0.5333 | 0.5639 | 0.5782 | 0.5639 | 0.5143 | 0.4883 | 0.4998 |
| 0.5582 | 2.99 | 283 | 0.5552 | 0.5111 | 0.5585 | 0.5685 | 0.7229 | 0.7357 | 0.7229 | 0.5988 | 0.5959 | 0.6175 |
| 0.3943 | 4.0 | 378 | 0.4713 | 0.5616 | 0.6397 | 0.5869 | 0.7904 | 0.8071 | 0.7904 | 0.6567 | 0.6761 | 0.6611 |
| 0.2883 | 4.99 | 472 | 0.4555 | 0.6384 | 0.6969 | 0.6444 | 0.7446 | 0.7641 | 0.7446 | 0.6874 | 0.6901 | 0.6854 |
| 0.2112 | 6.0 | 567 | 0.4459 | 0.6443 | 0.6968 | 0.6637 | 0.7855 | 0.7942 | 0.7855 | 0.7079 | 0.7123 | 0.7068 |
| 0.1608 | 6.99 | 661 | 0.4212 | 0.6508 | 0.7071 | 0.6586 | 0.7904 | 0.7931 | 0.7904 | 0.7138 | 0.7161 | 0.7116 |
| 0.1247 | 8.0 | 756 | 0.4177 | 0.6633 | 0.7145 | 0.6650 | 0.7976 | 0.8006 | 0.7976 | 0.7243 | 0.7193 | 0.7195 |
| 0.1031 | 8.99 | 850 | 0.4435 | 0.7277 | 0.7523 | 0.7306 | 0.7855 | 0.7875 | 0.7855 | 0.7555 | 0.7425 | 0.7487 |
| 0.0851 | 10.0 | 945 | 0.4522 | 0.7380 | 0.7623 | 0.7465 | 0.7807 | 0.7795 | 0.7807 | 0.7588 | 0.7432 | 0.7516 |
| 0.074 | 10.99 | 1039 | 0.4548 | 0.7359 | 0.7663 | 0.7368 | 0.7855 | 0.7910 | 0.7855 | 0.7599 | 0.7490 | 0.7521 |
| 0.0648 | 12.0 | 1134 | 0.4430 | 0.7425 | 0.7676 | 0.7437 | 0.7783 | 0.7781 | 0.7783 | 0.76 | 0.7461 | 0.7540 |
| 0.0605 | 12.99 | 1228 | 0.4478 | 0.7366 | 0.7651 | 0.7379 | 0.7952 | 0.7948 | 0.7952 | 0.7648 | 0.7545 | 0.7579 |
| 0.0566 | 14.0 | 1323 | 0.4574 | 0.7506 | 0.7708 | 0.7519 | 0.7904 | 0.7893 | 0.7904 | 0.7700 | 0.7546 | 0.7625 |
| 0.0546 | 14.92 | 1410 | 0.4541 | 0.75 | 0.7708 | 0.7517 | 0.7880 | 0.7858 | 0.7880 | 0.7685 | 0.7537 | 0.7615 |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.1
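
### Inference sketch

Since the card reports samples-averaged metrics, this is a multi-label classifier: each text can receive several categories, decided independently per class by thresholding a sigmoid over the logits. The sketch below shows one plausible way to load such a checkpoint and apply that decision rule. The model id, the example text, and the 0.5 threshold are assumptions, not part of this card; only the base model (`BAAI/bge-base-en-v1.5`) and the multi-label setup come from the card itself.

```python
import torch


def load_pipeline(model_id: str):
    """Load tokenizer and model for a multi-label checkpoint.

    `model_id` is a placeholder: substitute the actual Hub repo id or a
    local path to this fine-tuned model. Imported lazily so the decision
    rule below stays runnable without the transformers package.
    """
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(
        model_id, problem_type="multi_label_classification"
    )
    return tokenizer, model


def predict_labels(logits: torch.Tensor, threshold: float = 0.5) -> torch.Tensor:
    """Multi-label decision rule: independent sigmoid per class, then threshold.

    Unlike softmax + argmax, this can mark zero, one, or many classes active.
    """
    return (torch.sigmoid(logits) >= threshold).long()


# Decision rule demonstrated on dummy logits (no model download needed):
# sigmoid(2.0) ≈ 0.88, sigmoid(-1.0) ≈ 0.27, sigmoid(0.3) ≈ 0.57
demo = predict_labels(torch.tensor([[2.0, -1.0, 0.3]]))
print(demo.tolist())  # → [[1, 0, 1]]
```

Typical usage would then be `tokenizer, model = load_pipeline("<repo-id>")`, followed by tokenizing a text with `return_tensors="pt"`, running the model under `torch.no_grad()`, and passing `outputs.logits` to `predict_labels`. The threshold can be tuned per class on a validation set rather than fixed at 0.5.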