
pijarcandra22/t5Indo2Bali

This model is a fine-tuned version of google-t5/t5-small on an unknown dataset. At the end of training it achieves the following results:

  • Train Loss: 1.1069
  • Validation Loss: 1.5697
  • Epoch: 199 (the 200th and final epoch, counting from 0)

Model description

More information needed

Intended uses & limitations

More information needed
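
The card does not yet document usage, but the model name suggests Indonesian-to-Balinese translation. Below is a minimal inference sketch, assuming the standard Transformers TensorFlow API (matching the framework versions listed at the end of this card); the example sentence and the absence of a T5 task prefix are assumptions, since the expected input format is not documented.

```python
# Minimal inference sketch (assumptions noted above): load the checkpoint with
# the TensorFlow seq2seq classes and generate a translation.
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("pijarcandra22/t5Indo2Bali")
model = TFAutoModelForSeq2SeqLM.from_pretrained("pijarcandra22/t5Indo2Bali")

# Hypothetical Indonesian input; no task prefix is added, as none is documented.
inputs = tokenizer("Selamat pagi, apa kabar?", return_tensors="tf")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```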

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • optimizer: AdamWeightDecay (learning_rate=2e-05, weight_decay_rate=0.01, beta_1=0.9, beta_2=0.999, epsilon=1e-07, amsgrad=False, decay=0.0)
  • training_precision: float32
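
These settings correspond to the AdamWeightDecay optimizer shipped with Transformers' TensorFlow utilities. As a sketch, the optimizer could be reconstructed as follows; the Keras compile/fit workflow and the 200-epoch count are inferred from this card, while the dataset objects are placeholders, since the training script is not published.

```python
# Sketch: rebuild the optimizer from the hyperparameters above using
# transformers' TensorFlow AdamWeightDecay. Only the optimizer values come
# from this card; the model loading and commented-out fit call are assumptions.
from transformers import AdamWeightDecay, TFAutoModelForSeq2SeqLM

optimizer = AdamWeightDecay(
    learning_rate=2e-05,
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
)

model = TFAutoModelForSeq2SeqLM.from_pretrained("t5-small")
model.compile(optimizer=optimizer)  # Transformers TF models fall back to their internal loss
# model.fit(train_dataset, validation_data=val_dataset, epochs=200)  # placeholder datasets
```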

Training results

Train Loss Validation Loss Epoch
1.8126 1.8072 0
1.8017 1.8049 1
1.7984 1.8042 2
1.7845 1.7983 3
1.7843 1.7968 4
1.7812 1.7930 5
1.7593 1.7946 6
1.7647 1.7865 7
1.7710 1.7826 8
1.7464 1.7852 9
1.7467 1.7751 10
1.7380 1.7758 11
1.7417 1.7761 12
1.7433 1.7652 13
1.7306 1.7678 14
1.7226 1.7649 15
1.7102 1.7608 16
1.7125 1.7589 17
1.7005 1.7572 18
1.6927 1.7522 19
1.6859 1.7467 20
1.6895 1.7454 21
1.6780 1.7415 22
1.6807 1.7408 23
1.6730 1.7376 24
1.6673 1.7353 25
1.6648 1.7340 26
1.6566 1.7328 27
1.6537 1.7278 28
1.6508 1.7304 29
1.6445 1.7267 30
1.6437 1.7268 31
1.6345 1.7249 32
1.6228 1.7192 33
1.6310 1.7136 34
1.6236 1.7140 35
1.6151 1.7158 36
1.6114 1.7107 37
1.6076 1.7115 38
1.6047 1.7066 39
1.5923 1.7089 40
1.5897 1.7024 41
1.5822 1.7000 42
1.5815 1.6983 43
1.5854 1.6984 44
1.5728 1.6962 45
1.5672 1.6971 46
1.5735 1.6899 47
1.5576 1.6894 48
1.5649 1.6853 49
1.5572 1.6839 50
1.5534 1.6813 51
1.5491 1.6811 52
1.5487 1.6807 53
1.5376 1.6763 54
1.5367 1.6772 55
1.5242 1.6744 56
1.5251 1.6687 57
1.5246 1.6708 58
1.5260 1.6673 59
1.5073 1.6699 60
1.5063 1.6658 61
1.5238 1.6630 62
1.4991 1.6612 63
1.4994 1.6610 64
1.5001 1.6599 65
1.4911 1.6612 66
1.4926 1.6554 67
1.4852 1.6527 68
1.4720 1.6554 69
1.4740 1.6533 70
1.4759 1.6487 71
1.4692 1.6450 72
1.4632 1.6480 73
1.4664 1.6424 74
1.4591 1.6436 75
1.4606 1.6384 76
1.4487 1.6382 77
1.4558 1.6375 78
1.4455 1.6389 79
1.4396 1.6427 80
1.4441 1.6363 81
1.4333 1.6357 82
1.4414 1.6348 83
1.4260 1.6319 84
1.4249 1.6317 85
1.4166 1.6268 86
1.4167 1.6301 87
1.4150 1.6244 88
1.4061 1.6273 89
1.4134 1.6260 90
1.4022 1.6237 91
1.3949 1.6246 92
1.4007 1.6231 93
1.3987 1.6184 94
1.3919 1.6178 95
1.3889 1.6178 96
1.3883 1.6209 97
1.3756 1.6175 98
1.3818 1.6139 99
1.3772 1.6129 100
1.3726 1.6159 101
1.3695 1.6158 102
1.3707 1.6110 103
1.3555 1.6132 104
1.3592 1.6085 105
1.3562 1.6111 106
1.3475 1.6095 107
1.3460 1.6100 108
1.3446 1.6093 109
1.3436 1.6095 110
1.3452 1.6068 111
1.3401 1.6054 112
1.3378 1.6085 113
1.3288 1.6056 114
1.3294 1.6057 115
1.3227 1.6018 116
1.3270 1.5989 117
1.3214 1.5956 118
1.3187 1.5986 119
1.3150 1.5986 120
1.3145 1.5958 121
1.3159 1.5980 122
1.3050 1.5978 123
1.3113 1.5956 124
1.2932 1.5972 125
1.3008 1.5927 126
1.2963 1.5960 127
1.2799 1.5950 128
1.2879 1.5918 129
1.2873 1.5891 130
1.2868 1.5884 131
1.2789 1.5922 132
1.2751 1.5887 133
1.2780 1.5888 134
1.2632 1.5913 135
1.2617 1.5835 136
1.2681 1.5910 137
1.2630 1.5893 138
1.2631 1.5877 139
1.2540 1.5892 140
1.2518 1.5812 141
1.2611 1.5812 142
1.2561 1.5808 143
1.2483 1.5757 144
1.2424 1.5785 145
1.2366 1.5806 146
1.2393 1.5801 147
1.2359 1.5781 148
1.2256 1.5796 149
1.2261 1.5818 150
1.2179 1.5782 151
1.2321 1.5720 152
1.2198 1.5744 153
1.2189 1.5780 154
1.2301 1.5747 155
1.2131 1.5776 156
1.2095 1.5746 157
1.2134 1.5756 158
1.2061 1.5761 159
1.2068 1.5727 160
1.2061 1.5714 161
1.1998 1.5756 162
1.1967 1.5780 163
1.1940 1.5747 164
1.1869 1.5713 165
1.1980 1.5700 166
1.1958 1.5714 167
1.1867 1.5666 168
1.1779 1.5715 169
1.1789 1.5765 170
1.1763 1.5742 171
1.1767 1.5708 172
1.1840 1.5682 173
1.1688 1.5747 174
1.1701 1.5696 175
1.1739 1.5658 176
1.1572 1.5688 177
1.1593 1.5659 178
1.1591 1.5684 179
1.1581 1.5644 180
1.1635 1.5655 181
1.1538 1.5662 182
1.1455 1.5666 183
1.1442 1.5650 184
1.1398 1.5668 185
1.1408 1.5712 186
1.1364 1.5679 187
1.1348 1.5653 188
1.1358 1.5704 189
1.1355 1.5652 190
1.1278 1.5650 191
1.1260 1.5697 192
1.1257 1.5686 193
1.1265 1.5712 194
1.1248 1.5625 195
1.1220 1.5704 196
1.1202 1.5666 197
1.1114 1.5653 198
1.1069 1.5697 199

Framework versions

  • Transformers 4.35.2
  • TensorFlow 2.14.0
  • Datasets 2.15.0
  • Tokenizers 0.15.0
