---
license: apache-2.0
base_model: facebook/bart-base
tags:
- generated_from_keras_callback
model-index:
- name: pijarcandra22/NMTBaliIndoBART
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# pijarcandra22/NMTBaliIndoBART
This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co./facebook/bart-base) on an unknown dataset.
It achieves the following results at the final training epoch:
- Train Loss: 5.5001
- Validation Loss: 5.8867
- Epoch: 81
## Model description
More information needed
## Intended uses & limitations
More information needed
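Although the intended use is not yet documented, the repository name (NMTBaliIndoBART) suggests Balinese-to-Indonesian translation. A minimal inference sketch with the Transformers TensorFlow API might look like the following; the translation direction and the example sentence are assumptions, not confirmed by the card:

```python
# Hedged sketch: assumes the model translates Balinese to Indonesian,
# as the repository name (NMTBaliIndoBART) suggests.
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "pijarcandra22/NMTBaliIndoBART"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

# Placeholder Balinese input; replace with your own sentence.
inputs = tokenizer("sing kenken", return_tensors="tf")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```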
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: AdamWeightDecay (learning_rate: 0.02, weight_decay_rate: 0.01, beta_1: 0.9, beta_2: 0.999, epsilon: 1e-07, amsgrad: False, decay: 0.0)
- training_precision: float32
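The optimizer configuration above can be reconstructed with the `AdamWeightDecay` class that Transformers provides for Keras training. This is a sketch of how such an optimizer could be built from the listed hyperparameters, not the exact training script used for this model:

```python
# Sketch: rebuilding the optimizer from the hyperparameters listed above
# using the TensorFlow AdamWeightDecay optimizer shipped with transformers.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=0.02,  # taken verbatim from the card; unusually high for fine-tuning
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)
# model.compile(optimizer=optimizer)  # transformers Keras models compute loss internally
```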
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 9.3368 | 5.6757 | 0 |
| 5.5627 | 5.5987 | 1 |
| 5.5311 | 5.5419 | 2 |
| 5.5152 | 5.5201 | 3 |
| 5.5005 | 5.6477 | 4 |
| 5.4704 | 5.5914 | 5 |
| 5.4610 | 6.0922 | 6 |
| 5.4584 | 5.7137 | 7 |
| 5.4528 | 5.8658 | 8 |
| 5.4820 | 5.5628 | 9 |
| 5.4874 | 5.5309 | 10 |
| 5.4917 | 5.7595 | 11 |
| 5.4898 | 5.7333 | 12 |
| 5.4833 | 5.6789 | 13 |
| 5.4767 | 5.9588 | 14 |
| 5.4883 | 5.9895 | 15 |
| 5.4694 | 6.0100 | 16 |
| 5.4663 | 6.0316 | 17 |
| 5.4602 | 5.9233 | 18 |
| 5.4576 | 6.0051 | 19 |
| 5.4559 | 5.9966 | 20 |
| 5.4651 | 6.0025 | 21 |
| 5.4660 | 6.0160 | 22 |
| 5.4626 | 5.8324 | 23 |
| 5.4647 | 5.8383 | 24 |
| 5.4695 | 6.0272 | 25 |
| 5.4614 | 6.0724 | 26 |
| 5.4623 | 5.9454 | 27 |
| 5.4678 | 6.0196 | 28 |
| 5.4860 | 5.5949 | 29 |
| 5.4851 | 5.8838 | 30 |
| 5.4666 | 5.8506 | 31 |
| 5.4715 | 6.0391 | 32 |
| 5.4630 | 6.0870 | 33 |
| 5.4646 | 6.2195 | 34 |
| 5.4574 | 5.9696 | 35 |
| 5.4564 | 5.8970 | 36 |
| 5.4570 | 5.9522 | 37 |
| 5.4559 | 6.1518 | 38 |
| 5.4584 | 6.1860 | 39 |
| 5.4732 | 6.1168 | 40 |
| 5.4625 | 6.1588 | 41 |
| 5.4601 | 5.9868 | 42 |
| 5.4645 | 5.9606 | 43 |
| 5.4664 | 6.1495 | 44 |
| 5.4698 | 6.0152 | 45 |
| 5.4666 | 6.2713 | 46 |
| 5.4557 | 6.2708 | 47 |
| 5.4557 | 6.0003 | 48 |
| 5.4693 | 5.9321 | 49 |
| 5.4928 | 5.8971 | 50 |
| 5.5032 | 6.0766 | 51 |
| 5.4749 | 5.8919 | 52 |
| 5.4689 | 5.9853 | 53 |
| 5.4665 | 5.9329 | 54 |
| 5.4574 | 5.9770 | 55 |
| 5.4686 | 6.1022 | 56 |
| 5.4727 | 5.8973 | 57 |
| 5.4692 | 5.9633 | 58 |
| 5.4608 | 6.0480 | 59 |
| 5.4613 | 5.9596 | 60 |
| 5.4607 | 6.1158 | 61 |
| 5.4531 | 6.0617 | 62 |
| 5.4610 | 6.0375 | 63 |
| 5.4631 | 6.1184 | 64 |
| 5.4627 | 6.0465 | 65 |
| 5.4685 | 6.0011 | 66 |
| 5.4642 | 6.0828 | 67 |
| 5.4577 | 6.0883 | 68 |
| 5.4615 | 5.9523 | 69 |
| 5.4673 | 5.7216 | 70 |
| 5.4724 | 6.0274 | 71 |
| 5.4601 | 6.0344 | 72 |
| 5.4640 | 5.9661 | 73 |
| 5.4590 | 6.0013 | 74 |
| 5.4622 | 6.0172 | 75 |
| 5.4666 | 5.8407 | 76 |
| 5.4669 | 6.0261 | 77 |
| 5.4859 | 5.9295 | 78 |
| 5.5042 | 6.1254 | 79 |
| 5.4845 | 5.8930 | 80 |
| 5.5001 | 5.8867 | 81 |
### Framework versions
- Transformers 4.40.2
- TensorFlow 2.15.0
- Datasets 2.19.1
- Tokenizers 0.19.1