---
base_model: csebuetnlp/banglabert
tags:
- generated_from_trainer
metrics:
- f1
- accuracy
model-index:
- name: banglabert-MLTC-BB1
  results: []
---

# banglabert-MLTC-BB1

This model is a fine-tuned version of [csebuetnlp/banglabert](https://huggingface.co./csebuetnlp/banglabert) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3599
- F1: 0.8582
- F1 Weighted: 0.8565
- Roc Auc: 0.8547
- Accuracy: 0.5835
- Hamming Loss: 0.1452
- Jaccard Score: 0.7516
- Zero One Loss: 0.4165

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1     | F1 Weighted | Roc Auc | Accuracy | Hamming Loss | Jaccard Score | Zero One Loss |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-----------:|:-------:|:--------:|:------------:|:-------------:|:-------------:|
| 0.5692        | 1.0   | 49   | 0.5109          | 0.7781 | 0.7194      | 0.7685  | 0.4216   | 0.2314       | 0.6367        | 0.5784        |
| 0.4149        | 2.0   | 98   | 0.4230          | 0.8469 | 0.8467      | 0.8405  | 0.5604   | 0.1594       | 0.7345        | 0.4396        |
| 0.3732        | 3.0   | 147  | 0.3856          | 0.8479 | 0.8474      | 0.8425  | 0.5527   | 0.1575       | 0.7360        | 0.4473        |
| 0.3321        | 4.0   | 196  | 0.3750          | 0.8542 | 0.8522      | 0.8476  | 0.5578   | 0.1523       | 0.7454        | 0.4422        |
| 0.2817        | 5.0   | 245  | 0.3721          | 0.8545 | 0.8514      | 0.8482  | 0.5630   | 0.1517       | 0.7460        | 0.4370        |
| 0.2781        | 6.0   | 294  | 0.3553          | 0.8561 | 0.8547      | 0.8528  | 0.5656   | 0.1472       | 0.7484        | 0.4344        |
| 0.2264        | 7.0   | 343  | 0.3576          | 0.8566 | 0.8550      | 0.8534  | 0.5733   | 0.1465       | 0.7492        | 0.4267        |
| 0.2441        | 8.0   | 392  | 0.3595          | 0.8575 | 0.8560      | 0.8534  | 0.5733   | 0.1465       | 0.7505        | 0.4267        |
| 0.2547        | 9.0   | 441  | 0.3608          | 0.8561 | 0.8548      | 0.8528  | 0.5784   | 0.1472       | 0.7484        | 0.4216        |
| 0.2211        | 10.0  | 490  | 0.3599          | 0.8582 | 0.8565      | 0.8547  | 0.5835   | 0.1452       | 0.7516        | 0.4165        |

### Framework versions

- Transformers 4.41.1
- PyTorch 2.1.2
- Datasets 2.19.1
- Tokenizers 0.19.1
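
## How to use

The card does not include a usage snippet or document the label set. The sketch below shows one plausible way to run multi-label inference with this checkpoint, assuming it was fine-tuned with `problem_type="multi_label_classification"` (an independent sigmoid per label, which the metrics above suggest) and that a 0.5 decision threshold matches the evaluation; the checkpoint path is a placeholder.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Placeholder: replace with the actual Hub id or local path of this checkpoint.
model_id = "banglabert-MLTC-BB1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "এখানে একটি বাংলা বাক্য লিখুন"  # any Bangla input text
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Multi-label decoding: sigmoid each label independently, threshold at 0.5.
probs = torch.sigmoid(logits)[0]
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p >= 0.5]
print(predicted)
```

Note that the upstream BanglaBERT card recommends normalizing Bangla text with the csebuetnlp `normalizer` package before tokenization, which may also apply here.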
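
## Reproducing the training configuration

A minimal `TrainingArguments` sketch mirroring the hyperparameters listed above. Values not listed on the card are left at library defaults, and the per-epoch evaluation cadence is inferred from the results table rather than documented.

```python
from transformers import TrainingArguments

# Adam betas=(0.9, 0.999) and epsilon=1e-08 from the card are the
# library defaults, so they need no explicit arguments here.
training_args = TrainingArguments(
    output_dir="banglabert-MLTC-BB1",
    learning_rate=2e-5,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    evaluation_strategy="epoch",  # assumption: the table reports one eval per epoch
)
```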
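
## Metric definitions

In the results table, Accuracy and Zero One Loss sum to 1.0 at every epoch, and Jaccard Score equals F1 / (2 - F1), which is consistent with scikit-learn's subset (exact-match) accuracy and micro-averaged scores. A sketch of a compatible metrics function follows; the exact averaging used is not documented, so `average="micro"` is an assumption.

```python
import numpy as np
from sklearn.metrics import (
    accuracy_score, f1_score, hamming_loss,
    jaccard_score, roc_auc_score, zero_one_loss,
)

def multilabel_metrics(y_true: np.ndarray, y_pred: np.ndarray) -> dict:
    """y_true, y_pred: binary indicator arrays of shape (n_samples, n_labels)."""
    return {
        "f1": f1_score(y_true, y_pred, average="micro"),
        "f1_weighted": f1_score(y_true, y_pred, average="weighted"),
        "roc_auc": roc_auc_score(y_true, y_pred, average="micro"),
        "accuracy": accuracy_score(y_true, y_pred),      # exact-match (subset) accuracy
        "hamming_loss": hamming_loss(y_true, y_pred),
        "jaccard_score": jaccard_score(y_true, y_pred, average="micro"),
        "zero_one_loss": zero_one_loss(y_true, y_pred),  # 1 - subset accuracy
    }
```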