---
license: apache-2.0
base_model: facebook/bart-base
tags:
- generated_from_keras_callback
model-index:
- name: pijarcandra22/NMTBaliIndoBART
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# pijarcandra22/NMTBaliIndoBART
This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co./facebook/bart-base) on an unknown dataset.
It achieves the following results as of the final training epoch:
- Train Loss: 5.4666
- Validation Loss: 6.1473
- Epoch: 136
## Model description
More information needed
## Intended uses & limitations
More information needed
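Although usage is not documented, the checkpoint was trained with Keras, so it should load like any TensorFlow BART seq2seq model via the standard `transformers` auto classes. A minimal inference sketch (the Balinese input sentence is illustrative only, not taken from the training data):

```python
# Minimal sketch: load the fine-tuned checkpoint and run greedy generation.
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

repo = "pijarcandra22/NMTBaliIndoBART"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = TFAutoModelForSeq2SeqLM.from_pretrained(repo)

# Hypothetical Balinese input; the card does not document the expected format.
inputs = tokenizer("Punapi gatra?", return_tensors="tf")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Given the loss curves below, translation quality from this checkpoint should not be assumed without evaluation.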
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 0.02, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
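The optimizer dictionary above corresponds to the `AdamWeightDecay` class that `transformers` provides for Keras training; a sketch of recreating it from the reported values:

```python
# Sketch: rebuild the reported optimizer with transformers' TF helper.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=0.02,      # as reported; unusually high for BART fine-tuning
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
)
```

Note that 0.02 is one to two orders of magnitude above typical BART fine-tuning learning rates, which may explain why the losses below plateau around 5.46 rather than decreasing.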
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 9.3368 | 5.6757 | 0 |
| 5.5627 | 5.5987 | 1 |
| 5.5311 | 5.5419 | 2 |
| 5.5152 | 5.5201 | 3 |
| 5.5005 | 5.6477 | 4 |
| 5.4704 | 5.5914 | 5 |
| 5.4610 | 6.0922 | 6 |
| 5.4584 | 5.7137 | 7 |
| 5.4528 | 5.8658 | 8 |
| 5.4820 | 5.5628 | 9 |
| 5.4874 | 5.5309 | 10 |
| 5.4917 | 5.7595 | 11 |
| 5.4898 | 5.7333 | 12 |
| 5.4833 | 5.6789 | 13 |
| 5.4767 | 5.9588 | 14 |
| 5.4883 | 5.9895 | 15 |
| 5.4694 | 6.0100 | 16 |
| 5.4663 | 6.0316 | 17 |
| 5.4602 | 5.9233 | 18 |
| 5.4576 | 6.0051 | 19 |
| 5.4559 | 5.9966 | 20 |
| 5.4651 | 6.0025 | 21 |
| 5.4660 | 6.0160 | 22 |
| 5.4626 | 5.8324 | 23 |
| 5.4647 | 5.8383 | 24 |
| 5.4695 | 6.0272 | 25 |
| 5.4614 | 6.0724 | 26 |
| 5.4623 | 5.9454 | 27 |
| 5.4678 | 6.0196 | 28 |
| 5.4860 | 5.5949 | 29 |
| 5.4851 | 5.8838 | 30 |
| 5.4666 | 5.8506 | 31 |
| 5.4715 | 6.0391 | 32 |
| 5.4630 | 6.0870 | 33 |
| 5.4646 | 6.2195 | 34 |
| 5.4574 | 5.9696 | 35 |
| 5.4564 | 5.8970 | 36 |
| 5.4570 | 5.9522 | 37 |
| 5.4559 | 6.1518 | 38 |
| 5.4584 | 6.1860 | 39 |
| 5.4732 | 6.1168 | 40 |
| 5.4625 | 6.1588 | 41 |
| 5.4601 | 5.9868 | 42 |
| 5.4645 | 5.9606 | 43 |
| 5.4664 | 6.1495 | 44 |
| 5.4698 | 6.0152 | 45 |
| 5.4666 | 6.2713 | 46 |
| 5.4557 | 6.2708 | 47 |
| 5.4557 | 6.0003 | 48 |
| 5.4693 | 5.9321 | 49 |
| 5.4928 | 5.8971 | 50 |
| 5.5032 | 6.0766 | 51 |
| 5.4749 | 5.8919 | 52 |
| 5.4689 | 5.9853 | 53 |
| 5.4665 | 5.9329 | 54 |
| 5.4574 | 5.9770 | 55 |
| 5.4686 | 6.1022 | 56 |
| 5.4727 | 5.8973 | 57 |
| 5.4692 | 5.9633 | 58 |
| 5.4608 | 6.0480 | 59 |
| 5.4613 | 5.9596 | 60 |
| 5.4607 | 6.1158 | 61 |
| 5.4531 | 6.0617 | 62 |
| 5.4610 | 6.0375 | 63 |
| 5.4631 | 6.1184 | 64 |
| 5.4627 | 6.0465 | 65 |
| 5.4685 | 6.0011 | 66 |
| 5.4642 | 6.0828 | 67 |
| 5.4577 | 6.0883 | 68 |
| 5.4615 | 5.9523 | 69 |
| 5.4673 | 5.7216 | 70 |
| 5.4724 | 6.0274 | 71 |
| 5.4601 | 6.0344 | 72 |
| 5.4640 | 5.9661 | 73 |
| 5.4590 | 6.0013 | 74 |
| 5.4622 | 6.0172 | 75 |
| 5.4666 | 5.8407 | 76 |
| 5.4669 | 6.0261 | 77 |
| 5.4859 | 5.9295 | 78 |
| 5.5042 | 6.1254 | 79 |
| 5.4845 | 5.8930 | 80 |
| 5.5001 | 5.8867 | 81 |
| 5.4923 | 5.9480 | 82 |
| 5.4909 | 6.0475 | 83 |
| 5.4780 | 5.9289 | 84 |
| 5.4867 | 5.8134 | 85 |
| 5.4877 | 6.0032 | 86 |
| 5.4806 | 6.0884 | 87 |
| 5.4784 | 6.0567 | 88 |
| 5.4830 | 5.9790 | 89 |
| 5.4894 | 5.8919 | 90 |
| 5.4890 | 5.9626 | 91 |
| 5.4774 | 6.0267 | 92 |
| 5.5033 | 6.1150 | 93 |
| 5.4765 | 5.9776 | 94 |
| 5.4657 | 6.1395 | 95 |
| 5.4720 | 5.9938 | 96 |
| 5.4748 | 5.9656 | 97 |
| 5.4701 | 6.0163 | 98 |
| 5.4718 | 6.1462 | 99 |
| 5.4672 | 6.0804 | 100 |
| 5.4775 | 6.1055 | 101 |
| 5.4775 | 6.0936 | 102 |
| 5.4673 | 5.9839 | 103 |
| 5.4691 | 5.8972 | 104 |
| 5.4694 | 5.8271 | 105 |
| 5.5106 | 5.5305 | 106 |
| 5.5135 | 5.8806 | 107 |
| 5.4786 | 6.1380 | 108 |
| 5.4770 | 5.9899 | 109 |
| 5.4709 | 6.1072 | 110 |
| 5.4701 | 5.9356 | 111 |
| 5.4636 | 5.8304 | 112 |
| 5.4670 | 6.0451 | 113 |
| 5.4598 | 6.0311 | 114 |
| 5.4731 | 5.9862 | 115 |
| 5.4798 | 5.9589 | 116 |
| 5.4674 | 5.9356 | 117 |
| 5.4634 | 6.0088 | 118 |
| 5.4709 | 5.9534 | 119 |
| 5.4891 | 5.9995 | 120 |
| 5.4737 | 5.8611 | 121 |
| 5.4725 | 6.0112 | 122 |
| 5.4835 | 5.6280 | 123 |
| 5.5217 | 5.6917 | 124 |
| 5.4821 | 5.9458 | 125 |
| 5.4898 | 5.7593 | 126 |
| 5.4866 | 5.9110 | 127 |
| 5.4744 | 5.9463 | 128 |
| 5.4673 | 6.0359 | 129 |
| 5.4838 | 6.0166 | 130 |
| 5.4864 | 6.0046 | 131 |
| 5.4896 | 5.9479 | 132 |
| 5.4722 | 6.0699 | 133 |
| 5.4627 | 6.0684 | 134 |
| 5.4690 | 6.0577 | 135 |
| 5.4666 | 6.1473 | 136 |
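As a sanity check on the table above, validation loss bottoms out very early (5.5201 at epoch 3) and drifts upward afterwards, ending at 6.1473 by epoch 136. A minimal snippet for locating the best checkpoint epoch from such a log, using an excerpt of the values above:

```python
# Excerpt of (epoch, validation_loss) pairs from the table above.
history = [
    (0, 5.6757), (1, 5.5987), (2, 5.5419), (3, 5.5201),
    (4, 5.6477), (10, 5.5309), (106, 5.5305), (136, 6.1473),
]

# The checkpoint worth keeping is the one with the lowest validation loss.
best_epoch, best_loss = min(history, key=lambda pair: pair[1])
print(best_epoch, best_loss)  # -> 3 5.5201
```

If early stopping on validation loss had been used, training would have halted near epoch 3 rather than epoch 136.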
### Framework versions
- Transformers 4.40.2
- TensorFlow 2.15.0
- Datasets 2.19.1
- Tokenizers 0.19.1