---
license: mit
base_model: gogamza/kobart-base-v2
tags:
- generated_from_trainer
model-index:
- name: KoBART_base_v2-trial2
  results: []
---

# KoBART_base_v2-trial2

This model is a fine-tuned version of [gogamza/kobart-base-v2](https://huggingface.co./gogamza/kobart-base-v2) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.1820

## Model description

The base model, [gogamza/kobart-base-v2](https://huggingface.co./gogamza/kobart-base-v2), is KoBART, a Korean BART-style encoder-decoder pretrained on Korean text. This checkpoint fine-tunes it for a downstream sequence-to-sequence task; the task itself is not documented here.

## Intended uses & limitations

More information needed
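
Although the intended task is undocumented, the checkpoint should load with the standard `transformers` seq2seq classes inherited from the KoBART base model. Below is a minimal inference sketch; the repo id is a hypothetical placeholder, not a confirmed Hub path, and the generation parameters are illustrative defaults:

```python
# Minimal inference sketch. The repo id is a placeholder; substitute the
# actual Hub path of this checkpoint. Assumes the standard BART-style
# seq2seq interface inherited from gogamza/kobart-base-v2.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "your-username/KoBART_base_v2-trial2"  # hypothetical path
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

text = "모델에 입력할 한국어 문장"  # Korean input text
inputs = tokenizer(text, return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_length=128, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```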

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reconstructed configuration sketch follows the list):
- learning_rate: 0.0005
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 20
- num_epochs: 5
- mixed_precision_training: Native AMP
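
The list above maps roughly onto the following `Seq2SeqTrainingArguments`. This is a reconstruction from the listed values, not the original training script; `output_dir` and any argument not listed are assumptions:

```python
# Reconstructed from the hyperparameter list above. Adam with
# betas=(0.9, 0.999) and epsilon=1e-08 matches the library defaults,
# so no explicit optimizer arguments are needed.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="KoBART_base_v2-trial2",  # assumed, not documented
    learning_rate=5e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    lr_scheduler_type="cosine",
    warmup_steps=20,
    num_train_epochs=5,
    fp16=True,  # "Native AMP" mixed precision
)
```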

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 2.3889        | 0.11  | 50   | 0.5425          |
| 0.5339        | 0.22  | 100  | 0.4328          |
| 0.4609        | 0.32  | 150  | 0.4180          |
| 0.4631        | 0.43  | 200  | 0.4167          |
| 0.4065        | 0.54  | 250  | 0.3775          |
| 0.3898        | 0.65  | 300  | 0.3539          |
| 0.3637        | 0.76  | 350  | 0.3389          |
| 0.3347        | 0.87  | 400  | 0.3275          |
| 0.3428        | 0.97  | 450  | 0.3087          |
| 0.2871        | 1.08  | 500  | 0.3189          |
| 0.2843        | 1.19  | 550  | 0.3016          |
| 0.2685        | 1.3   | 600  | 0.2954          |
| 0.2603        | 1.41  | 650  | 0.2860          |
| 0.2636        | 1.52  | 700  | 0.2804          |
| 0.2586        | 1.62  | 750  | 0.2821          |
| 0.2485        | 1.73  | 800  | 0.2674          |
| 0.2483        | 1.84  | 850  | 0.2662          |
| 0.2322        | 1.95  | 900  | 0.2525          |
| 0.2052        | 2.06  | 950  | 0.2634          |
| 0.1838        | 2.16  | 1000 | 0.2472          |
| 0.1859        | 2.27  | 1050 | 0.2432          |
| 0.1887        | 2.38  | 1100 | 0.2392          |
| 0.1756        | 2.49  | 1150 | 0.2314          |
| 0.1697        | 2.6   | 1200 | 0.2332          |
| 0.1741        | 2.71  | 1250 | 0.2257          |
| 0.1665        | 2.81  | 1300 | 0.2204          |
| 0.1655        | 2.92  | 1350 | 0.2097          |
| 0.1539        | 3.03  | 1400 | 0.2141          |
| 0.126         | 3.14  | 1450 | 0.2129          |
| 0.1241        | 3.25  | 1500 | 0.2068          |
| 0.1266        | 3.35  | 1550 | 0.1999          |
| 0.1161        | 3.46  | 1600 | 0.1996          |
| 0.1183        | 3.57  | 1650 | 0.1943          |
| 0.1123        | 3.68  | 1700 | 0.1914          |
| 0.1096        | 3.79  | 1750 | 0.1881          |
| 0.1089        | 3.9   | 1800 | 0.1835          |
| 0.1096        | 4.0   | 1850 | 0.1803          |
| 0.0857        | 4.11  | 1900 | 0.1873          |
| 0.0833        | 4.22  | 1950 | 0.1857          |
| 0.0791        | 4.33  | 2000 | 0.1871          |
| 0.0825        | 4.44  | 2050 | 0.1852          |
| 0.0813        | 4.55  | 2100 | 0.1834          |
| 0.0806        | 4.65  | 2150 | 0.1830          |
| 0.0805        | 4.76  | 2200 | 0.1822          |
| 0.0786        | 4.87  | 2250 | 0.1820          |
| 0.0775        | 4.98  | 2300 | 0.1820          |


### Framework versions

- Transformers 4.36.0
- Pytorch 2.0.1+cu117
- Datasets 2.15.0
- Tokenizers 0.15.0