
PhoBert_70KURL_bo_vn

This model is a fine-tuned version of vinai/phobert-base-v2 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0050
  • Accuracy: 0.9980
  • F1: 0.9980
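The card does not state which metric implementation produced these numbers. As a minimal sketch, assuming Accuracy and weighted F1 (a common choice for fine-tuned classifiers, and consistent with the two scores being nearly identical), they can be computed from the label lists as follows; the example labels are hypothetical:

```python
# Sketch of the reported metrics (assumption: Accuracy and weighted F1
# over the evaluation set; the card does not show the metric script).
from collections import Counter

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the reference labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def f1_weighted(y_true, y_pred):
    """Per-class F1, averaged weighted by each class's support."""
    labels = set(y_true) | set(y_pred)
    support = Counter(y_true)
    total = len(y_true)
    score = 0.0
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        score += f1 * support[c] / total
    return score

# Toy example labels (not from the actual evaluation set):
y_true = [0, 0, 1, 1, 1]
y_pred = [0, 1, 1, 1, 1]
print(accuracy(y_true, y_pred))  # 0.8
```

With scikit-learn installed, `f1_score(y_true, y_pred, average="weighted")` gives the same value as `f1_weighted`.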

Model description

More information needed

Intended uses & limitations

More information needed
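Pending that information, here is a minimal inference sketch. It assumes the checkpoint is a sequence classifier (the Accuracy/F1 metrics above suggest classification) published under the repo id `gechim/PhoBert_70KURL_bo_vn` shown on this card; the input string and the meaning of the predicted label index are placeholders:

```python
# Hedged sketch: load the fine-tuned checkpoint and classify one input.
# The label set is not documented on the card, so `pred` is just an index.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "gechim/PhoBert_70KURL_bo_vn"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

text = "http://example.com/some-url"  # placeholder input
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()
print(pred)
```

Note that PhoBERT tokenizers expect word-segmented input for Vietnamese prose; whether the training data for this checkpoint was segmented is not stated on the card.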

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 128
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 2150
  • num_epochs: 20
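These settings imply a linear warmup-then-decay learning-rate schedule and an effective batch of 128 (64 per device × 2 accumulation steps). A sketch, assuming 8,600 total optimizer steps as given by the final row of the results table below:

```python
# Sketch of the linear schedule with warmup implied by the hyperparameters
# (lr_scheduler_type: linear, warmup_steps: 2150, peak lr 2e-05).
# Assumption: total optimizer steps = 8600 (final step in the results table).
def linear_schedule_lr(step, peak_lr=2e-5, warmup_steps=2150, total_steps=8600):
    if step < warmup_steps:
        # Linear warmup from 0 up to the peak learning rate.
        return peak_lr * step / warmup_steps
    # Linear decay from the peak down to 0 at the final step.
    return peak_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

# Effective batch size: per-device batch * gradient_accumulation_steps.
effective_batch = 64 * 2  # matches total_train_batch_size: 128

print(linear_schedule_lr(0))     # 0.0
print(linear_schedule_lr(2150))  # 2e-05 (peak, end of warmup)
print(linear_schedule_lr(8600))  # 0.0 (fully decayed)
```

This mirrors the behavior of `transformers.get_linear_schedule_with_warmup`, which is what the Trainer uses for `lr_scheduler_type: linear`.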

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:-------:|:----:|:---------------:|:--------:|:------:|
| No log        | 0.4651  | 200  | 0.4818          | 0.8017   | 0.7745 |
| No log        | 0.9302  | 400  | 0.2638          | 0.9006   | 0.9018 |
| No log        | 1.3953  | 600  | 0.2097          | 0.9222   | 0.9232 |
| No log        | 1.8605  | 800  | 0.1810          | 0.9335   | 0.9343 |
| 0.3614        | 2.3256  | 1000 | 0.1468          | 0.9476   | 0.9478 |
| 0.3614        | 2.7907  | 1200 | 0.1471          | 0.9451   | 0.9458 |
| 0.3614        | 3.2558  | 1400 | 0.1196          | 0.9569   | 0.9570 |
| 0.3614        | 3.7209  | 1600 | 0.1065          | 0.9617   | 0.9618 |
| 0.1608        | 4.1860  | 1800 | 0.1014          | 0.9641   | 0.9641 |
| 0.1608        | 4.6512  | 2000 | 0.0953          | 0.9666   | 0.9668 |
| 0.1608        | 5.1163  | 2200 | 0.0843          | 0.9695   | 0.9696 |
| 0.1608        | 5.5814  | 2400 | 0.0761          | 0.9729   | 0.9730 |
| 0.1156        | 6.0465  | 2600 | 0.0675          | 0.9786   | 0.9787 |
| 0.1156        | 6.5116  | 2800 | 0.0547          | 0.9819   | 0.9820 |
| 0.1156        | 6.9767  | 3000 | 0.0487          | 0.9843   | 0.9843 |
| 0.1156        | 7.4419  | 3200 | 0.0419          | 0.9864   | 0.9865 |
| 0.1156        | 7.9070  | 3400 | 0.0460          | 0.9840   | 0.9841 |
| 0.0814        | 8.3721  | 3600 | 0.0361          | 0.9884   | 0.9884 |
| 0.0814        | 8.8372  | 3800 | 0.0334          | 0.9896   | 0.9896 |
| 0.0814        | 9.3023  | 4000 | 0.0327          | 0.9885   | 0.9885 |
| 0.0814        | 9.7674  | 4200 | 0.0326          | 0.9890   | 0.9890 |
| 0.0584        | 10.2326 | 4400 | 0.0282          | 0.9911   | 0.9911 |
| 0.0584        | 10.6977 | 4600 | 0.0222          | 0.9930   | 0.9930 |
| 0.0584        | 11.1628 | 4800 | 0.0185          | 0.9942   | 0.9942 |
| 0.0584        | 11.6279 | 5000 | 0.0163          | 0.9951   | 0.9951 |
| 0.0412        | 12.0930 | 5200 | 0.0235          | 0.9921   | 0.9921 |
| 0.0412        | 12.5581 | 5400 | 0.0134          | 0.9956   | 0.9956 |
| 0.0412        | 13.0233 | 5600 | 0.0123          | 0.9960   | 0.9960 |
| 0.0412        | 13.4884 | 5800 | 0.0111          | 0.9963   | 0.9963 |
| 0.0412        | 13.9535 | 6000 | 0.0096          | 0.9968   | 0.9968 |
| 0.0316        | 14.4186 | 6200 | 0.0143          | 0.9953   | 0.9953 |
| 0.0316        | 14.8837 | 6400 | 0.0088          | 0.9971   | 0.9971 |
| 0.0316        | 15.3488 | 6600 | 0.0077          | 0.9973   | 0.9973 |
| 0.0316        | 15.8140 | 6800 | 0.0073          | 0.9975   | 0.9975 |
| 0.0237        | 16.2791 | 7000 | 0.0066          | 0.9977   | 0.9977 |
| 0.0237        | 16.7442 | 7200 | 0.0065          | 0.9977   | 0.9977 |
| 0.0237        | 17.2093 | 7400 | 0.0057          | 0.9978   | 0.9978 |
| 0.0237        | 17.6744 | 7600 | 0.0072          | 0.9976   | 0.9976 |
| 0.0188        | 18.1395 | 7800 | 0.0055          | 0.9979   | 0.9979 |
| 0.0188        | 18.6047 | 8000 | 0.0052          | 0.9981   | 0.9981 |
| 0.0188        | 19.0698 | 8200 | 0.0052          | 0.9980   | 0.9980 |
| 0.0188        | 19.5349 | 8400 | 0.0050          | 0.9980   | 0.9980 |
| 0.0146        | 20.0    | 8600 | 0.0050          | 0.9980   | 0.9980 |

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.1.2
  • Datasets 2.19.2
  • Tokenizers 0.19.1
Model size: 135M params (F32, safetensors)

Model tree for gechim/PhoBert_70KURL_bo_vn

Finetuned from vinai/phobert-base-v2