---
license: apache-2.0
base_model: t5-small
tags:
  - generated_from_keras_callback
model-index:
  - name: pijarcandra22/t5Indo2Jawa
    results: []
---

# pijarcandra22/t5Indo2Jawa

This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset. It achieves the following results on the evaluation set (see the usage sketch below):

- Train Loss: 1.4387
- Validation Loss: 1.4507
- Epoch: 177
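Judging by the model name (t5Indo2Jawa: Indonesian to Javanese), the checkpoint can be loaded for translation with the TensorFlow Transformers classes. This is a minimal inference sketch, not documented by the original author; the example sentence and the absence of a task prefix are assumptions.

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("pijarcandra22/t5Indo2Jawa")
model = TFAutoModelForSeq2SeqLM.from_pretrained("pijarcandra22/t5Indo2Jawa")

# Assumed Indonesian input; whether fine-tuning used a T5-style task prefix
# (e.g. "translate Indonesian to Javanese: ") is not documented in this card.
inputs = tokenizer("Selamat pagi, apa kabar?", return_tensors="tf")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```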

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (see the optimizer sketch after this list):

- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
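The optimizer dictionary above corresponds to Transformers' `AdamWeightDecay` Keras optimizer. A minimal sketch of reconstructing it; the `model.compile` call is an assumption about how training was set up, and a constant learning rate is assumed since `decay` is 0.0 and no schedule is listed.

```python
from transformers import AdamWeightDecay, TFAutoModelForSeq2SeqLM

# Mirrors the hyperparameter dictionary above.
optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    weight_decay_rate=0.01,
)

model = TFAutoModelForSeq2SeqLM.from_pretrained("t5-small")
model.compile(optimizer=optimizer)  # Transformers TF models supply a default loss
```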

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 3.5149 | 3.1567 | 0 |
| 3.3816 | 3.0397 | 1 |
| 3.2812 | 2.9518 | 2 |
| 3.1977 | 2.8751 | 3 |
| 3.1223 | 2.8078 | 4 |
| 3.0599 | 2.7507 | 5 |
| 3.0019 | 2.6979 | 6 |
| 2.9517 | 2.6513 | 7 |
| 2.9034 | 2.6121 | 8 |
| 2.8638 | 2.5756 | 9 |
| 2.8232 | 2.5391 | 10 |
| 2.7856 | 2.5089 | 11 |
| 2.7541 | 2.4786 | 12 |
| 2.7219 | 2.4499 | 13 |
| 2.6935 | 2.4256 | 14 |
| 2.6658 | 2.4010 | 15 |
| 2.6389 | 2.3762 | 16 |
| 2.6143 | 2.3550 | 17 |
| 2.5899 | 2.3313 | 18 |
| 2.5665 | 2.3156 | 19 |
| 2.5445 | 2.2939 | 20 |
| 2.5224 | 2.2750 | 21 |
| 2.5022 | 2.2569 | 22 |
| 2.4834 | 2.2410 | 23 |
| 2.4641 | 2.2220 | 24 |
| 2.4443 | 2.2091 | 25 |
| 2.4267 | 2.1948 | 26 |
| 2.4129 | 2.1796 | 27 |
| 2.3937 | 2.1657 | 28 |
| 2.3782 | 2.1523 | 29 |
| 2.3616 | 2.1385 | 30 |
| 2.3471 | 2.1267 | 31 |
| 2.3351 | 2.1110 | 32 |
| 2.3184 | 2.0988 | 33 |
| 2.3047 | 2.0871 | 34 |
| 2.2920 | 2.0768 | 35 |
| 2.2767 | 2.0649 | 36 |
| 2.2651 | 2.0546 | 37 |
| 2.2526 | 2.0445 | 38 |
| 2.2388 | 2.0333 | 39 |
| 2.2264 | 2.0234 | 40 |
| 2.2157 | 2.0165 | 41 |
| 2.2050 | 2.0049 | 42 |
| 2.1906 | 1.9946 | 43 |
| 2.1824 | 1.9845 | 44 |
| 2.1673 | 1.9762 | 45 |
| 2.1559 | 1.9679 | 46 |
| 2.1455 | 1.9608 | 47 |
| 2.1377 | 1.9528 | 48 |
| 2.1279 | 1.9429 | 49 |
| 2.1176 | 1.9356 | 50 |
| 2.1056 | 1.9267 | 51 |
| 2.0979 | 1.9174 | 52 |
| 2.0882 | 1.9087 | 53 |
| 2.0802 | 1.8995 | 54 |
| 2.0668 | 1.8947 | 55 |
| 2.0597 | 1.8880 | 56 |
| 2.0484 | 1.8779 | 57 |
| 2.0405 | 1.8735 | 58 |
| 2.0335 | 1.8676 | 59 |
| 2.0254 | 1.8603 | 60 |
| 2.0147 | 1.8530 | 61 |
| 2.0078 | 1.8459 | 62 |
| 1.9984 | 1.8403 | 63 |
| 1.9902 | 1.8338 | 64 |
| 1.9824 | 1.8264 | 65 |
| 1.9768 | 1.8231 | 66 |
| 1.9679 | 1.8158 | 67 |
| 1.9597 | 1.8104 | 68 |
| 1.9531 | 1.8026 | 69 |
| 1.9460 | 1.7987 | 70 |
| 1.9416 | 1.7929 | 71 |
| 1.9291 | 1.7876 | 72 |
| 1.9245 | 1.7807 | 73 |
| 1.9143 | 1.7788 | 74 |
| 1.9088 | 1.7717 | 75 |
| 1.9006 | 1.7643 | 76 |
| 1.8960 | 1.7587 | 77 |
| 1.8901 | 1.7528 | 78 |
| 1.8808 | 1.7477 | 79 |
| 1.8740 | 1.7436 | 80 |
| 1.8689 | 1.7376 | 81 |
| 1.8628 | 1.7320 | 82 |
| 1.8533 | 1.7312 | 83 |
| 1.8486 | 1.7240 | 84 |
| 1.8428 | 1.7186 | 85 |
| 1.8351 | 1.7141 | 86 |
| 1.8316 | 1.7106 | 87 |
| 1.8234 | 1.7045 | 88 |
| 1.8173 | 1.6976 | 89 |
| 1.8109 | 1.6959 | 90 |
| 1.8059 | 1.6924 | 91 |
| 1.8016 | 1.6860 | 92 |
| 1.7922 | 1.6802 | 93 |
| 1.7887 | 1.6778 | 94 |
| 1.7832 | 1.6716 | 95 |
| 1.7761 | 1.6688 | 96 |
| 1.7724 | 1.6653 | 97 |
| 1.7662 | 1.6582 | 98 |
| 1.7607 | 1.6571 | 99 |
| 1.7549 | 1.6542 | 100 |
| 1.7483 | 1.6497 | 101 |
| 1.7454 | 1.6435 | 102 |
| 1.7400 | 1.6407 | 103 |
| 1.7318 | 1.6363 | 104 |
| 1.7266 | 1.6327 | 105 |
| 1.7234 | 1.6286 | 106 |
| 1.7210 | 1.6267 | 107 |
| 1.7109 | 1.6207 | 108 |
| 1.7079 | 1.6183 | 109 |
| 1.7026 | 1.6162 | 110 |
| 1.6989 | 1.6137 | 111 |
| 1.6925 | 1.6074 | 112 |
| 1.6880 | 1.6051 | 113 |
| 1.6823 | 1.6021 | 114 |
| 1.6780 | 1.5969 | 115 |
| 1.6737 | 1.5960 | 116 |
| 1.6659 | 1.5937 | 117 |
| 1.6603 | 1.5872 | 118 |
| 1.6586 | 1.5870 | 119 |
| 1.6550 | 1.5813 | 120 |
| 1.6506 | 1.5788 | 121 |
| 1.6432 | 1.5771 | 122 |
| 1.6408 | 1.5721 | 123 |
| 1.6377 | 1.5729 | 124 |
| 1.6307 | 1.5693 | 125 |
| 1.6268 | 1.5650 | 126 |
| 1.6227 | 1.5607 | 127 |
| 1.6180 | 1.5618 | 128 |
| 1.6151 | 1.5590 | 129 |
| 1.6101 | 1.5534 | 130 |
| 1.6056 | 1.5505 | 131 |
| 1.6034 | 1.5470 | 132 |
| 1.5971 | 1.5443 | 133 |
| 1.5926 | 1.5431 | 134 |
| 1.5873 | 1.5421 | 135 |
| 1.5850 | 1.5378 | 136 |
| 1.5807 | 1.5334 | 137 |
| 1.5771 | 1.5335 | 138 |
| 1.5734 | 1.5309 | 139 |
| 1.5694 | 1.5288 | 140 |
| 1.5642 | 1.5273 | 141 |
| 1.5610 | 1.5215 | 142 |
| 1.5568 | 1.5217 | 143 |
| 1.5555 | 1.5171 | 144 |
| 1.5517 | 1.5170 | 145 |
| 1.5471 | 1.5148 | 146 |
| 1.5426 | 1.5120 | 147 |
| 1.5376 | 1.5102 | 148 |
| 1.5370 | 1.5081 | 149 |
| 1.5317 | 1.5070 | 150 |
| 1.5272 | 1.5029 | 151 |
| 1.5257 | 1.5025 | 152 |
| 1.5205 | 1.4997 | 153 |
| 1.5180 | 1.4954 | 154 |
| 1.5112 | 1.4932 | 155 |
| 1.5117 | 1.4920 | 156 |
| 1.5070 | 1.4890 | 157 |
| 1.5050 | 1.4881 | 158 |
| 1.4984 | 1.4870 | 159 |
| 1.4964 | 1.4843 | 160 |
| 1.4920 | 1.4833 | 161 |
| 1.4879 | 1.4808 | 162 |
| 1.4838 | 1.4768 | 163 |
| 1.4854 | 1.4756 | 164 |
| 1.4784 | 1.4733 | 165 |
| 1.4757 | 1.4724 | 166 |
| 1.4733 | 1.4697 | 167 |
| 1.4704 | 1.4678 | 168 |
| 1.4660 | 1.4648 | 169 |
| 1.4618 | 1.4660 | 170 |
| 1.4591 | 1.4606 | 171 |
| 1.4554 | 1.4626 | 172 |
| 1.4533 | 1.4595 | 173 |
| 1.4492 | 1.4583 | 174 |
| 1.4471 | 1.4539 | 175 |
| 1.4410 | 1.4548 | 176 |
| 1.4387 | 1.4507 | 177 |
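The `generated_from_keras_callback` tag indicates this card was produced automatically during `model.fit` by a Transformers Keras callback. Below is a hedged sketch of such a loop; it reuses the `optimizer` from the previous section, and `tf_train_set` / `tf_val_set` are hypothetical `tf.data.Dataset` placeholders, since the actual training data is not documented.

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM
from transformers.keras_callbacks import PushToHubCallback

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = TFAutoModelForSeq2SeqLM.from_pretrained("t5-small")
model.compile(optimizer=optimizer)  # the AdamWeightDecay instance sketched above

# PushToHubCallback uploads a checkpoint and regenerates this card after
# every epoch, which is how the per-epoch rows above were recorded.
push_cb = PushToHubCallback(output_dir="t5Indo2Jawa", tokenizer=tokenizer)

# tf_train_set / tf_val_set: hypothetical tokenized tf.data datasets.
model.fit(tf_train_set, validation_data=tf_val_set, epochs=178, callbacks=[push_cb])
```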

## Framework versions

- Transformers 4.35.2
- TensorFlow 2.14.0
- Datasets 2.15.0
- Tokenizers 0.15.0