---
license: apache-2.0
base_model: t5-small
tags:
- generated_from_keras_callback
model-index:
- name: pijarcandra22/t5Indo2Jawa
  results: []
---

# pijarcandra22/t5Indo2Jawa

This model is a fine-tuned version of [t5-small](https://huggingface.co./t5-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 2.0802
- Validation Loss: 1.8995
- Epoch: 54 (zero-indexed; 55 epochs in total)

## Model description

Based on the repository name, this checkpoint appears to fine-tune t5-small for Indonesian-to-Javanese machine translation. No further description has been provided by the author.

## Intended uses & limitations

Judging by the name, the model is intended to translate Indonesian text into Javanese. Its limitations are undocumented, so outputs should be reviewed by a Javanese speaker before downstream use.
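A possible way to load and run this checkpoint with the `transformers` library. This is a sketch, not an author-provided snippet: the model id comes from this card, but the function name, input text, and generation settings are illustrative assumptions.

```python
# Hypothetical usage sketch for this checkpoint (Indonesian -> Javanese).
# Requires network access on first call to download the weights.
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

MODEL_ID = "pijarcandra22/t5Indo2Jawa"

def translate(text: str, max_length: int = 128) -> str:
    """Translate one Indonesian sentence into Javanese."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = TFAutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(text, return_tensors="tf")
    outputs = model.generate(**inputs, max_length=max_length)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Since the card does not document a task prefix, none is added here; if generations look off, prepending a T5-style prefix may be worth trying.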

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
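The `AdamWeightDecay` optimizer above is the standard AdamW update with decoupled weight decay. A minimal single-parameter sketch of one update step, using the exact hyperparameters from this card (the parameter and gradient values are made up for illustration):

```python
# One AdamW step on a single scalar parameter, using this card's
# hyperparameters. Weight decay is decoupled: it is applied directly
# to the parameter, not folded into the gradient.
lr, beta1, beta2, eps, wd = 2e-5, 0.9, 0.999, 1e-07, 0.01

theta = 1.0   # parameter value (illustrative)
grad = 1.0    # gradient (illustrative)
m = v = 0.0   # first/second moment estimates
t = 1         # step counter

m = beta1 * m + (1 - beta1) * grad
v = beta2 * v + (1 - beta2) * grad ** 2
m_hat = m / (1 - beta1 ** t)   # bias correction
v_hat = v / (1 - beta2 ** t)
theta = theta - lr * (m_hat / (v_hat ** 0.5 + eps) + wd * theta)
```

With these settings a unit gradient moves the parameter by roughly `lr * 1.01`, since the decoupled decay term adds `wd * theta` on top of the Adam step.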

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 3.5149     | 3.1567          | 0     |
| 3.3816     | 3.0397          | 1     |
| 3.2812     | 2.9518          | 2     |
| 3.1977     | 2.8751          | 3     |
| 3.1223     | 2.8078          | 4     |
| 3.0599     | 2.7507          | 5     |
| 3.0019     | 2.6979          | 6     |
| 2.9517     | 2.6513          | 7     |
| 2.9034     | 2.6121          | 8     |
| 2.8638     | 2.5756          | 9     |
| 2.8232     | 2.5391          | 10    |
| 2.7856     | 2.5089          | 11    |
| 2.7541     | 2.4786          | 12    |
| 2.7219     | 2.4499          | 13    |
| 2.6935     | 2.4256          | 14    |
| 2.6658     | 2.4010          | 15    |
| 2.6389     | 2.3762          | 16    |
| 2.6143     | 2.3550          | 17    |
| 2.5899     | 2.3313          | 18    |
| 2.5665     | 2.3156          | 19    |
| 2.5445     | 2.2939          | 20    |
| 2.5224     | 2.2750          | 21    |
| 2.5022     | 2.2569          | 22    |
| 2.4834     | 2.2410          | 23    |
| 2.4641     | 2.2220          | 24    |
| 2.4443     | 2.2091          | 25    |
| 2.4267     | 2.1948          | 26    |
| 2.4129     | 2.1796          | 27    |
| 2.3937     | 2.1657          | 28    |
| 2.3782     | 2.1523          | 29    |
| 2.3616     | 2.1385          | 30    |
| 2.3471     | 2.1267          | 31    |
| 2.3351     | 2.1110          | 32    |
| 2.3184     | 2.0988          | 33    |
| 2.3047     | 2.0871          | 34    |
| 2.2920     | 2.0768          | 35    |
| 2.2767     | 2.0649          | 36    |
| 2.2651     | 2.0546          | 37    |
| 2.2526     | 2.0445          | 38    |
| 2.2388     | 2.0333          | 39    |
| 2.2264     | 2.0234          | 40    |
| 2.2157     | 2.0165          | 41    |
| 2.2050     | 2.0049          | 42    |
| 2.1906     | 1.9946          | 43    |
| 2.1824     | 1.9845          | 44    |
| 2.1673     | 1.9762          | 45    |
| 2.1559     | 1.9679          | 46    |
| 2.1455     | 1.9608          | 47    |
| 2.1377     | 1.9528          | 48    |
| 2.1279     | 1.9429          | 49    |
| 2.1176     | 1.9356          | 50    |
| 2.1056     | 1.9267          | 51    |
| 2.0979     | 1.9174          | 52    |
| 2.0882     | 1.9087          | 53    |
| 2.0802     | 1.8995          | 54    |
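Both losses decrease monotonically over all 55 epochs, and validation loss stays below train loss throughout (common when dropout is active during training but disabled at evaluation), so there is no sign of overfitting at the final epoch. A small sketch summarizing the curve from the first and last rows of the table above:

```python
# Summarize the training curve using the first (epoch 0) and last
# (epoch 54) rows of the results table.
first_train, last_train = 3.5149, 2.0802
first_val, last_val = 3.1567, 1.8995

train_reduction = (first_train - last_train) / first_train
val_reduction = (first_val - last_val) / first_val
print(f"train loss fell {train_reduction:.1%}, "
      f"validation loss fell {val_reduction:.1%}")
```

Both losses drop by roughly 40% relative to their starting values, and both are still falling at epoch 54, which suggests further training could improve the model.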


### Framework versions

- Transformers 4.35.2
- TensorFlow 2.14.0
- Datasets 2.15.0
- Tokenizers 0.15.0