tuanio committed on
Commit
6032f90
1 Parent(s): 61da31e

End of training

Files changed (2)
  1. README.md +165 -0
  2. trainer_state.json +0 -0
README.md ADDED
---
license: apache-2.0
base_model: facebook/wav2vec2-xls-r-300m
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: 1-bpe20k-freeze_cnn-drop.1
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# 1-bpe20k-freeze_cnn-drop.1

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.0929
- Wer: 1.0076

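Below is a minimal inference sketch for loading the checkpoint with `transformers`. The hub id `tuanio/1-bpe20k-freeze_cnn-drop.1` is an assumption pieced together from the commit author and the model name, and the snippet assumes the repo ships the processor (feature extractor plus BPE tokenizer) files alongside the weights.

```python
import torch
from transformers import AutoModelForCTC, AutoProcessor

# Assumed repo id; adjust if the checkpoint lives under a different name.
model_id = "tuanio/1-bpe20k-freeze_cnn-drop.1"

processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2-xls-r expects 16 kHz mono float audio; one second of silence
# stands in here for a real utterance.
speech = torch.zeros(16000).numpy()
inputs = processor(speech, sampling_rate=16000, return_tensors="pt")

with torch.no_grad():
    logits = model(inputs.input_values).logits  # (batch, time, vocab)

# Greedy CTC decoding: argmax per frame, then collapse via the tokenizer.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))
```

Note that the evaluation WER above is roughly 1.0, so transcriptions from this checkpoint should be treated with caution.
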
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged training sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- distributed_type: multi-GPU
- num_devices: 4
- total_train_batch_size: 32
- total_eval_batch_size: 32
- optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 200000
- mixed_precision_training: Native AMP

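As a rough guide, the settings above map onto `transformers.TrainingArguments` as sketched below. The model setup is an assumption: the run name suggests the CNN feature encoder was frozen ("freeze_cnn") and a ~20k BPE vocabulary was used ("bpe20k"), so those appear here as hypothetical guesses rather than confirmed details of the original run.

```python
from transformers import TrainingArguments, Wav2Vec2ForCTC

model = Wav2Vec2ForCTC.from_pretrained("facebook/wav2vec2-xls-r-300m")
model.freeze_feature_encoder()  # assumption: "freeze_cnn" in the run name

training_args = TrainingArguments(
    output_dir="1-bpe20k-freeze_cnn-drop.1",
    learning_rate=2e-5,
    per_device_train_batch_size=8,  # 8 per device x 4 GPUs = 32 total
    per_device_eval_batch_size=8,
    seed=42,
    max_steps=200_000,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    fp16=True,                      # "Native AMP" mixed precision
    evaluation_strategy="steps",
    eval_steps=2_000,               # matches the 2000-step cadence in the table below
)
```

A real run would also need a CTC head sized to the BPE vocabulary and a padding data collator passed to `Trainer`; both are omitted here.
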
### Training results

| Training Loss | Epoch | Step   | Validation Loss | Wer    |
|:-------------:|:-----:|:------:|:---------------:|:------:|
| 91.1741       | 0.05  | 2000   | 179.2776        | 1.0    |
| 56.7966       | 0.09  | 4000   | 111.7828        | 1.0    |
| 11.4825       | 0.14  | 6000   | 16.6212         | 1.0    |
| 7.5731        | 0.19  | 8000   | 8.3634          | 1.0    |
| 7.3291        | 0.23  | 10000  | 8.3149          | 1.0    |
| 7.1995        | 0.28  | 12000  | 8.1837          | 1.0    |
| 7.1092        | 0.33  | 14000  | 8.1877          | 1.0    |
| 6.9362        | 0.38  | 16000  | 7.8934          | 1.0    |
| 6.7999        | 0.42  | 18000  | 7.7947          | 1.0    |
| 6.7773        | 0.47  | 20000  | 7.8173          | 1.0    |
| 6.7489        | 0.52  | 22000  | 7.8197          | 1.0117 |
| 6.7139        | 0.56  | 24000  | 7.6664          | 1.0078 |
| 6.6024        | 0.61  | 26000  | 7.8397          | 1.0    |
| 6.3451        | 0.66  | 28000  | 7.2039          | 1.0044 |
| 5.7315        | 0.7   | 30000  | 6.3228          | 1.0039 |
| 4.8494        | 0.75  | 32000  | 5.3767          | 1.0058 |
| 4.4103        | 0.8   | 34000  | 4.5216          | 1.0073 |
| 3.8434        | 0.85  | 36000  | 3.9239          | 1.0125 |
| 3.3268        | 0.89  | 38000  | 3.4547          | 1.0074 |
| 2.9034        | 0.94  | 40000  | 3.0617          | 1.0079 |
| 2.5747        | 0.99  | 42000  | 2.7700          | 1.0108 |
| 2.4532        | 1.03  | 44000  | 2.5497          | 1.0113 |
| 2.3169        | 1.08  | 46000  | 2.3780          | 1.0105 |
| 2.2304        | 1.13  | 48000  | 2.2581          | 1.0075 |
| 2.0592        | 1.17  | 50000  | 2.1189          | 1.0051 |
| 2.023         | 1.22  | 52000  | 2.0325          | 1.0049 |
| 1.8011        | 1.27  | 54000  | 1.9284          | 1.0081 |
| 1.817         | 1.31  | 56000  | 1.8635          | 1.0091 |
| 1.6902        | 1.36  | 58000  | 1.7967          | 1.0076 |
| 1.6398        | 1.41  | 60000  | 1.7413          | 1.0081 |
| 1.8097        | 1.46  | 62000  | 1.7066          | 1.0054 |
| 1.5504        | 1.5   | 64000  | 1.6525          | 1.0074 |
| 1.5759        | 1.55  | 66000  | 1.6402          | 1.0098 |
| 1.6652        | 1.6   | 68000  | 1.5884          | 1.0055 |
| 1.5074        | 1.64  | 70000  | 1.5423          | 1.0066 |
| 1.414         | 1.69  | 72000  | 1.5397          | 1.0095 |
| 1.4792        | 1.74  | 74000  | 1.4822          | 1.0062 |
| 1.4648        | 1.78  | 76000  | 1.5023          | 1.0083 |
| 1.4326        | 1.83  | 78000  | 1.4627          | 1.0070 |
| 1.3691        | 1.88  | 80000  | 1.4435          | 1.0073 |
| 1.3624        | 1.92  | 82000  | 1.4290          | 1.0066 |
| 1.5254        | 1.97  | 84000  | 1.4090          | 1.0104 |
| 1.3362        | 2.02  | 86000  | 1.3790          | 1.0065 |
| 1.3586        | 2.07  | 88000  | 1.3730          | 1.0061 |
| 1.29          | 2.11  | 90000  | 1.3825          | 1.0099 |
| 1.387         | 2.16  | 92000  | 1.3499          | 1.0087 |
| 1.356         | 2.21  | 94000  | 1.3591          | 1.0102 |
| 1.3145        | 2.25  | 96000  | 1.3154          | 1.0084 |
| 1.2747        | 2.3   | 98000  | 1.3095          | 1.0080 |
| 1.3444        | 2.35  | 100000 | 1.2910          | 1.0087 |
| 1.2304        | 2.39  | 102000 | 1.2850          | 1.0074 |
| 1.3306        | 2.44  | 104000 | 1.2698          | 1.0085 |
| 1.2431        | 2.49  | 106000 | 1.2645          | 1.0087 |
| 1.1938        | 2.54  | 108000 | 1.2497          | 1.0061 |
| 1.2476        | 2.58  | 110000 | 1.2416          | 1.0077 |
| 1.1743        | 2.63  | 112000 | 1.2460          | 1.0064 |
| 1.2554        | 2.68  | 114000 | 1.2369          | 1.0076 |
| 1.1518        | 2.72  | 116000 | 1.2212          | 1.0076 |
| 1.1812        | 2.77  | 118000 | 1.1954          | 1.0054 |
| 1.2447        | 2.82  | 120000 | 1.2144          | 1.0082 |
| 1.2839        | 2.86  | 122000 | 1.1976          | 1.0076 |
| 1.117         | 2.91  | 124000 | 1.1809          | 1.0060 |
| 1.1151        | 2.96  | 126000 | 1.1930          | 1.0087 |
| 1.182         | 3.0   | 128000 | 1.1750          | 1.0079 |
| 1.214         | 3.05  | 130000 | 1.1687          | 1.0047 |
| 1.1701        | 3.1   | 132000 | 1.1651          | 1.0071 |
| 1.1268        | 3.15  | 134000 | 1.1635          | 1.0068 |
| 1.193         | 3.19  | 136000 | 1.1550          | 1.0075 |
| 1.1245        | 3.24  | 138000 | 1.1436          | 1.0068 |
| 1.136         | 3.29  | 140000 | 1.1378          | 1.0059 |
| 1.1318        | 3.33  | 142000 | 1.1526          | 1.0074 |
| 1.1129        | 3.38  | 144000 | 1.1452          | 1.0074 |
| 1.1175        | 3.43  | 146000 | 1.1491          | 1.0092 |
| 1.0812        | 3.47  | 148000 | 1.1300          | 1.0075 |
| 1.0735        | 3.52  | 150000 | 1.1442          | 1.0083 |
| 1.0626        | 3.57  | 152000 | 1.1298          | 1.0083 |
| 1.0952        | 3.61  | 154000 | 1.1326          | 1.0087 |
| 1.0756        | 3.66  | 156000 | 1.1341          | 1.0089 |
| 1.0695        | 3.71  | 158000 | 1.1287          | 1.0092 |
| 1.0672        | 3.76  | 160000 | 1.1379          | 1.0085 |
| 1.1802        | 3.8   | 162000 | 1.1235          | 1.0074 |
| 1.094         | 3.85  | 164000 | 1.1155          | 1.0068 |
| 1.0811        | 3.9   | 166000 | 1.1172          | 1.0078 |
| 0.9901        | 3.94  | 168000 | 1.0964          | 1.0064 |
| 1.0907        | 3.99  | 170000 | 1.1031          | 1.0084 |
| 1.0799        | 4.04  | 172000 | 1.0924          | 1.0071 |
| 1.0271        | 4.08  | 174000 | 1.1020          | 1.0086 |
| 1.0482        | 4.13  | 176000 | 1.1067          | 1.0084 |
| 1.1078        | 4.18  | 178000 | 1.1083          | 1.0088 |
| 1.1798        | 4.23  | 180000 | 1.0964          | 1.0064 |
| 1.0933        | 4.27  | 182000 | 1.1034          | 1.0080 |
| 1.0272        | 4.32  | 184000 | 1.1036          | 1.0075 |
| 1.1125        | 4.37  | 186000 | 1.1022          | 1.0084 |
| 1.033         | 4.41  | 188000 | 1.0906          | 1.0070 |
| 1.1048        | 4.46  | 190000 | 1.0923          | 1.0079 |
| 1.1565        | 4.51  | 192000 | 1.0976          | 1.0078 |
| 1.0698        | 4.55  | 194000 | 1.0950          | 1.0073 |
| 1.0735        | 4.6   | 196000 | 1.0920          | 1.0074 |
| 1.0137        | 4.65  | 198000 | 1.0899          | 1.0073 |
| 1.0669        | 4.69  | 200000 | 1.0929          | 1.0076 |

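For reference, the Wer column is the word error rate: (substitutions + deletions + insertions) divided by the number of reference words, so values can exceed 1.0 when the hypothesis inserts extra words, as in several rows above. A minimal sketch of computing it with the `evaluate` library follows; the library choice is an assumption (any standard WER implementation behaves the same), and the strings are illustrative.

```python
import evaluate

# Word error rate = (substitutions + deletions + insertions) / reference words.
wer = evaluate.load("wer")

predictions = ["the cat sat on mat"]     # illustrative decoded hypothesis
references = ["the cat sat on the mat"]  # illustrative ground truth

# One deletion against six reference words -> WER of 1/6.
print(wer.compute(predictions=predictions, references=references))
```

A WER that stays near 1.0 for the full 200k steps, as in the table above, means the decoded hypotheses match almost none of the reference words even though the validation loss keeps falling.
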

### Framework versions

- Transformers 4.35.0
- PyTorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.14.1
trainer_state.json ADDED
The diff for this file is too large to render.