---
license: apache-2.0
base_model: t5-small
tags:
  - generated_from_keras_callback
model-index:
  - name: pijarcandra22/t5Indo2Jawa
    results: []
---

# pijarcandra22/t5Indo2Jawa

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset. It achieves the following results on the evaluation set:

- Train Loss: 2.3351
- Validation Loss: 2.1110
- Epoch: 32

## Model description

More information needed

## Intended uses & limitations

More information needed
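
The card does not document how to run inference, but since the checkpoint was trained with TensorFlow/Keras, a minimal usage sketch with the `transformers` TF classes might look like the following. The example sentence, the absence of a task prefix, and the generation settings are all assumptions, not part of the original card:

```python
# Hypothetical inference sketch for pijarcandra22/t5Indo2Jawa.
# The input sentence and generation parameters are assumptions;
# the card does not document preprocessing or a task prefix.
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "pijarcandra22/t5Indo2Jawa"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

# Indonesian input sentence ("I am learning Javanese.")
text = "Saya sedang belajar bahasa Jawa."
inputs = tokenizer(text, return_tensors="tf")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```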

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- optimizer: `{'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}`
- training_precision: `float32`
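
The optimizer dict above corresponds to the `AdamWeightDecay` class shipped with `transformers`' TensorFlow utilities. A sketch of reconstructing it, mirroring the reported values (illustrative only, not the exact training script):

```python
# Reconstruct the reported optimizer configuration with
# transformers' TF AdamWeightDecay (requires TensorFlow installed).
# Values mirror the hyperparameter dict in this card.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=2e-5,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-7,
    amsgrad=False,
    weight_decay_rate=0.01,
)
```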

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 3.5149     | 3.1567          | 0     |
| 3.3816     | 3.0397          | 1     |
| 3.2812     | 2.9518          | 2     |
| 3.1977     | 2.8751          | 3     |
| 3.1223     | 2.8078          | 4     |
| 3.0599     | 2.7507          | 5     |
| 3.0019     | 2.6979          | 6     |
| 2.9517     | 2.6513          | 7     |
| 2.9034     | 2.6121          | 8     |
| 2.8638     | 2.5756          | 9     |
| 2.8232     | 2.5391          | 10    |
| 2.7856     | 2.5089          | 11    |
| 2.7541     | 2.4786          | 12    |
| 2.7219     | 2.4499          | 13    |
| 2.6935     | 2.4256          | 14    |
| 2.6658     | 2.4010          | 15    |
| 2.6389     | 2.3762          | 16    |
| 2.6143     | 2.3550          | 17    |
| 2.5899     | 2.3313          | 18    |
| 2.5665     | 2.3156          | 19    |
| 2.5445     | 2.2939          | 20    |
| 2.5224     | 2.2750          | 21    |
| 2.5022     | 2.2569          | 22    |
| 2.4834     | 2.2410          | 23    |
| 2.4641     | 2.2220          | 24    |
| 2.4443     | 2.2091          | 25    |
| 2.4267     | 2.1948          | 26    |
| 2.4129     | 2.1796          | 27    |
| 2.3937     | 2.1657          | 28    |
| 2.3782     | 2.1523          | 29    |
| 2.3616     | 2.1385          | 30    |
| 2.3471     | 2.1267          | 31    |
| 2.3351     | 2.1110          | 32    |

### Framework versions

- Transformers 4.35.2
- TensorFlow 2.14.0
- Datasets 2.15.0
- Tokenizers 0.15.0