Labira/LabiraPJOK_2_50

This model is a fine-tuned version of indolem/indobert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Train Loss: 0.0612
  • Validation Loss: 5.0368
  • Epoch: 49

Model description

More information needed

Intended uses & limitations

More information needed
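
Pending fuller documentation, here is a minimal loading sketch. It assumes the checkpoint is compatible with the generic TFAutoModel class; the task-specific head is not documented on this card, so the snippet only inspects the base encoder's output.

```python
from transformers import AutoTokenizer, TFAutoModel

# Minimal sketch (assumption: the generic TFAutoModel class suffices for
# inspection; the actual task head is not documented on this card).
tokenizer = AutoTokenizer.from_pretrained("Labira/LabiraPJOK_2_50")
model = TFAutoModel.from_pretrained("Labira/LabiraPJOK_2_50")

# Example Indonesian input (IndoBERT is an Indonesian-language encoder).
inputs = tokenizer("Apa itu kebugaran jasmani?", return_tensors="tf")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, sequence_length, hidden_size)
```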

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': False, 'is_legacy_optimizer': False, 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 250, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}
  • training_precision: float32
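
For readability, the serialized optimizer config above corresponds to approximately the following Keras construction (a sketch assembled from the recorded values, not the original training script):

```python
import tensorflow as tf

# Linear (power=1.0) decay from 2e-05 to 0.0 over 250 steps, as recorded above.
lr_schedule = tf.keras.optimizers.schedules.PolynomialDecay(
    initial_learning_rate=2e-05,
    decay_steps=250,
    end_learning_rate=0.0,
    power=1.0,
    cycle=False,
)

# Adam with the logged hyperparameters; EMA and gradient clipping were disabled.
optimizer = tf.keras.optimizers.Adam(
    learning_rate=lr_schedule,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-08,
    amsgrad=False,
)
```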

Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 5.8505     | 5.6669          | 0     |
| 5.2925     | 5.1030          | 1     |
| 4.5442     | 4.7484          | 2     |
| 4.0958     | 4.7040          | 3     |
| 3.7810     | 4.5713          | 4     |
| 3.5676     | 4.4824          | 5     |
| 3.1885     | 4.3205          | 6     |
| 2.7673     | 4.2241          | 7     |
| 2.5267     | 4.2636          | 8     |
| 2.1790     | 4.3948          | 9     |
| 1.8900     | 4.4249          | 10    |
| 1.6497     | 4.3953          | 11    |
| 1.4075     | 4.6399          | 12    |
| 1.1854     | 4.7024          | 13    |
| 0.9754     | 4.9350          | 14    |
| 0.9994     | 5.3112          | 15    |
| 0.7262     | 5.0277          | 16    |
| 0.5385     | 5.6396          | 17    |
| 0.5031     | 5.0280          | 18    |
| 0.4707     | 5.4408          | 19    |
| 0.3623     | 5.2230          | 20    |
| 0.3844     | 5.0132          | 21    |
| 0.3438     | 5.1672          | 22    |
| 0.2012     | 5.2035          | 23    |
| 0.2089     | 5.1718          | 24    |
| 0.1978     | 5.0590          | 25    |
| 0.2140     | 5.1029          | 26    |
| 0.1903     | 4.9778          | 27    |
| 0.1750     | 4.9790          | 28    |
| 0.1228     | 5.0673          | 29    |
| 0.0892     | 5.0525          | 30    |
| 0.1576     | 4.9680          | 31    |
| 0.1337     | 4.9172          | 32    |
| 0.0976     | 4.8575          | 33    |
| 0.0649     | 4.7732          | 34    |
| 0.1050     | 4.8566          | 35    |
| 0.0885     | 5.0122          | 36    |
| 0.0725     | 5.0716          | 37    |
| 0.1004     | 5.0808          | 38    |
| 0.0443     | 5.0632          | 39    |
| 0.0514     | 5.0632          | 40    |
| 0.0632     | 5.0526          | 41    |
| 0.1997     | 5.0193          | 42    |
| 0.0600     | 5.0489          | 43    |
| 0.0482     | 5.0666          | 44    |
| 0.0862     | 5.0719          | 45    |
| 0.1512     | 5.0631          | 46    |
| 0.0815     | 5.0498          | 47    |
| 0.0462     | 5.0410          | 48    |
| 0.0612     | 5.0368          | 49    |

Framework versions

  • Transformers 4.44.2
  • TensorFlow 2.17.0
  • Datasets 3.0.1
  • Tokenizers 0.19.1