sqrk committed
Commit a2a13f8
1 Parent(s): e97bbd2

Model save

README.md ADDED
@@ -0,0 +1,107 @@
+ ---
+ license: apache-2.0
+ base_model: openai/whisper-large-v3
+ tags:
+ - generated_from_trainer
+ metrics:
+ - wer
+ model-index:
+ - name: Sep29-Mixat-whisper-lg-3-translation-0.1trainasval
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # Sep29-Mixat-whisper-lg-3-translation-0.1trainasval
+
+ This model is a fine-tuned version of [openai/whisper-large-v3](https://huggingface.co/openai/whisper-large-v3) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.8625
+ - Wer: 46.4677
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 1e-05
+ - train_batch_size: 8
+ - eval_batch_size: 8
+ - seed: 42
+ - gradient_accumulation_steps: 2
+ - total_train_batch_size: 16
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - lr_scheduler_warmup_steps: 500
+ - num_epochs: 100
+ - mixed_precision_training: Native AMP
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Wer |
+ |:-------------:|:-------:|:----:|:---------------:|:-------:|
+ | 0.8851 | 0.4762 | 100 | 0.6177 | 57.9329 |
+ | 0.5912 | 0.9524 | 200 | 0.5425 | 55.1678 |
+ | 0.4614 | 1.4286 | 300 | 0.5272 | 49.1317 |
+ | 0.466 | 1.9048 | 400 | 0.5086 | 48.0863 |
+ | 0.3466 | 2.3810 | 500 | 0.5289 | 46.4003 |
+ | 0.3488 | 2.8571 | 600 | 0.5107 | 44.7311 |
+ | 0.2382 | 3.3333 | 700 | 0.5503 | 44.7648 |
+ | 0.2208 | 3.8095 | 800 | 0.5494 | 47.0578 |
+ | 0.1624 | 4.2857 | 900 | 0.5938 | 45.3718 |
+ | 0.1237 | 4.7619 | 1000 | 0.5893 | 45.4055 |
+ | 0.0966 | 5.2381 | 1100 | 0.6492 | 45.2032 |
+ | 0.0712 | 5.7143 | 1200 | 0.6321 | 43.5003 |
+ | 0.0614 | 6.1905 | 1300 | 0.6663 | 46.0968 |
+ | 0.0422 | 6.6667 | 1400 | 0.6621 | 45.1526 |
+ | 0.0423 | 7.1429 | 1500 | 0.6943 | 44.7142 |
+ | 0.0292 | 7.6190 | 1600 | 0.6971 | 45.5572 |
+ | 0.0311 | 8.0952 | 1700 | 0.7240 | 45.3212 |
+ | 0.022 | 8.5714 | 1800 | 0.7203 | 44.8828 |
+ | 0.0252 | 9.0476 | 1900 | 0.7415 | 46.6026 |
+ | 0.0186 | 9.5238 | 2000 | 0.7361 | 45.4224 |
+ | 0.0189 | 10.0 | 2100 | 0.7539 | 46.2148 |
+ | 0.0133 | 10.4762 | 2200 | 0.7797 | 44.9671 |
+ | 0.0188 | 10.9524 | 2300 | 0.7688 | 45.4392 |
+ | 0.0138 | 11.4286 | 2400 | 0.7763 | 44.7985 |
+ | 0.013 | 11.9048 | 2500 | 0.7762 | 45.0008 |
+ | 0.0121 | 12.3810 | 2600 | 0.7999 | 43.0787 |
+ | 0.0132 | 12.8571 | 2700 | 0.7931 | 43.7194 |
+ | 0.011 | 13.3333 | 2800 | 0.8111 | 46.0293 |
+ | 0.0113 | 13.8095 | 2900 | 0.7986 | 44.2084 |
+ | 0.0111 | 14.2857 | 3000 | 0.7936 | 43.0787 |
+ | 0.0097 | 14.7619 | 3100 | 0.8021 | 45.1357 |
+ | 0.0105 | 15.2381 | 3200 | 0.8137 | 46.2991 |
+ | 0.0101 | 15.7143 | 3300 | 0.8118 | 44.2590 |
+ | 0.0095 | 16.1905 | 3400 | 0.8126 | 43.8375 |
+ | 0.007 | 16.6667 | 3500 | 0.8326 | 45.1357 |
+ | 0.0077 | 17.1429 | 3600 | 0.8108 | 43.6520 |
+ | 0.0059 | 17.6190 | 3700 | 0.8436 | 44.6805 |
+ | 0.0071 | 18.0952 | 3800 | 0.8633 | 44.8997 |
+ | 0.0064 | 18.5714 | 3900 | 0.8487 | 44.2421 |
+ | 0.007 | 19.0476 | 4000 | 0.8321 | 45.0851 |
+ | 0.0057 | 19.5238 | 4100 | 0.8478 | 45.2875 |
+ | 0.0064 | 20.0 | 4200 | 0.8485 | 45.1189 |
+ | 0.0068 | 20.4762 | 4300 | 0.8531 | 44.7479 |
+ | 0.0073 | 20.9524 | 4400 | 0.8625 | 46.4677 |
+
+
+ ### Framework versions
+
+ - Transformers 4.43.4
+ - Pytorch 2.4.1
+ - Datasets 3.0.0
+ - Tokenizers 0.19.1
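
The hyperparameter list in the card maps directly onto `Seq2SeqTrainingArguments`. A minimal sketch of that mapping, assuming a standard `transformers` fine-tuning script; the `output_dir` name is illustrative, and the model, dataset, and data collator are omitted:

```python
# Sketch: Seq2SeqTrainingArguments mirroring the hyperparameters listed in the card.
# This is a reconstruction from the card, not the author's actual training script.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="Sep29-Mixat-whisper-lg-3-translation-0.1trainasval",  # illustrative
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,  # 8 x 2 = total train batch size of 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed precision
)
```

Note that the results table stops at step 4400 (about epoch 21) even though `num_epochs` is 100, so the run appears to have ended early; validation loss rises steadily after roughly epoch 2, while the best WER in the table (43.0787) occurs around epochs 12 and 14.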
generation_config.json ADDED
@@ -0,0 +1,257 @@
+ {
+   "alignment_heads": [
+     [
+       7,
+       0
+     ],
+     [
+       10,
+       17
+     ],
+     [
+       12,
+       18
+     ],
+     [
+       13,
+       12
+     ],
+     [
+       16,
+       1
+     ],
+     [
+       17,
+       14
+     ],
+     [
+       19,
+       11
+     ],
+     [
+       21,
+       4
+     ],
+     [
+       24,
+       1
+     ],
+     [
+       25,
+       6
+     ]
+   ],
+   "begin_suppress_tokens": [
+     220,
+     50257
+   ],
+   "bos_token_id": 50257,
+   "decoder_start_token_id": 50258,
+   "eos_token_id": 50257,
+   "is_multilingual": true,
+   "lang_to_id": {
+     "<|af|>": 50327,
+     "<|am|>": 50334,
+     "<|ar|>": 50272,
+     "<|as|>": 50350,
+     "<|az|>": 50304,
+     "<|ba|>": 50355,
+     "<|be|>": 50330,
+     "<|bg|>": 50292,
+     "<|bn|>": 50302,
+     "<|bo|>": 50347,
+     "<|br|>": 50309,
+     "<|bs|>": 50315,
+     "<|ca|>": 50270,
+     "<|cs|>": 50283,
+     "<|cy|>": 50297,
+     "<|da|>": 50285,
+     "<|de|>": 50261,
+     "<|el|>": 50281,
+     "<|en|>": 50259,
+     "<|es|>": 50262,
+     "<|et|>": 50307,
+     "<|eu|>": 50310,
+     "<|fa|>": 50300,
+     "<|fi|>": 50277,
+     "<|fo|>": 50338,
+     "<|fr|>": 50265,
+     "<|gl|>": 50319,
+     "<|gu|>": 50333,
+     "<|haw|>": 50352,
+     "<|ha|>": 50354,
+     "<|he|>": 50279,
+     "<|hi|>": 50276,
+     "<|hr|>": 50291,
+     "<|ht|>": 50339,
+     "<|hu|>": 50286,
+     "<|hy|>": 50312,
+     "<|id|>": 50275,
+     "<|is|>": 50311,
+     "<|it|>": 50274,
+     "<|ja|>": 50266,
+     "<|jw|>": 50356,
+     "<|ka|>": 50329,
+     "<|kk|>": 50316,
+     "<|km|>": 50323,
+     "<|kn|>": 50306,
+     "<|ko|>": 50264,
+     "<|la|>": 50294,
+     "<|lb|>": 50345,
+     "<|ln|>": 50353,
+     "<|lo|>": 50336,
+     "<|lt|>": 50293,
+     "<|lv|>": 50301,
+     "<|mg|>": 50349,
+     "<|mi|>": 50295,
+     "<|mk|>": 50308,
+     "<|ml|>": 50296,
+     "<|mn|>": 50314,
+     "<|mr|>": 50320,
+     "<|ms|>": 50282,
+     "<|mt|>": 50343,
+     "<|my|>": 50346,
+     "<|ne|>": 50313,
+     "<|nl|>": 50271,
+     "<|nn|>": 50342,
+     "<|no|>": 50288,
+     "<|oc|>": 50328,
+     "<|pa|>": 50321,
+     "<|pl|>": 50269,
+     "<|ps|>": 50340,
+     "<|pt|>": 50267,
+     "<|ro|>": 50284,
+     "<|ru|>": 50263,
+     "<|sa|>": 50344,
+     "<|sd|>": 50332,
+     "<|si|>": 50322,
+     "<|sk|>": 50298,
+     "<|sl|>": 50305,
+     "<|sn|>": 50324,
+     "<|so|>": 50326,
+     "<|sq|>": 50317,
+     "<|sr|>": 50303,
+     "<|su|>": 50357,
+     "<|sv|>": 50273,
+     "<|sw|>": 50318,
+     "<|ta|>": 50287,
+     "<|te|>": 50299,
+     "<|tg|>": 50331,
+     "<|th|>": 50289,
+     "<|tk|>": 50341,
+     "<|tl|>": 50348,
+     "<|tr|>": 50268,
+     "<|tt|>": 50351,
+     "<|uk|>": 50280,
+     "<|ur|>": 50290,
+     "<|uz|>": 50337,
+     "<|vi|>": 50278,
+     "<|yi|>": 50335,
+     "<|yo|>": 50325,
+     "<|yue|>": 50358,
+     "<|zh|>": 50260
+   },
+   "language": "arabic",
+   "max_initial_timestamp_index": 50,
+   "max_length": 448,
+   "no_timestamps_token_id": 50364,
+   "pad_token_id": 50257,
+   "prev_sot_token_id": 50362,
+   "return_timestamps": false,
+   "suppress_tokens": [
+     1,
+     2,
+     7,
+     8,
+     9,
+     10,
+     14,
+     25,
+     26,
+     27,
+     28,
+     29,
+     31,
+     58,
+     59,
+     60,
+     61,
+     62,
+     63,
+     90,
+     91,
+     92,
+     93,
+     359,
+     503,
+     522,
+     542,
+     873,
+     893,
+     902,
+     918,
+     922,
+     931,
+     1350,
+     1853,
+     1982,
+     2460,
+     2627,
+     3246,
+     3253,
+     3268,
+     3536,
+     3846,
+     3961,
+     4183,
+     4667,
+     6585,
+     6647,
+     7273,
+     9061,
+     9383,
+     10428,
+     10929,
+     11938,
+     12033,
+     12331,
+     12562,
+     13793,
+     14157,
+     14635,
+     15265,
+     15618,
+     16553,
+     16604,
+     18362,
+     18956,
+     20075,
+     21675,
+     22520,
+     26130,
+     26161,
+     26435,
+     28279,
+     29464,
+     31650,
+     32302,
+     32470,
+     36865,
+     42863,
+     47425,
+     49870,
+     50254,
+     50258,
+     50359,
+     50360,
+     50361,
+     50362,
+     50363
+   ],
+   "task": "transcribe",
+   "task_to_id": {
+     "transcribe": 50360,
+     "translate": 50359
+   },
+   "transformers_version": "4.43.4"
+ }
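
This saved `generation_config.json` ships with the checkpoint and supplies the defaults that `model.generate` uses. A minimal sketch of how its defaults (`language="arabic"`, `task="transcribe"`) apply and can be overridden per call; the repo id is assumed from the commit author and model name, and the audio is placeholder silence:

```python
# Sketch: generation_config.json is loaded automatically with the checkpoint
# and provides generate() defaults; language/task can be overridden per call.
import numpy as np
from transformers import WhisperForConditionalGeneration, WhisperProcessor

model_id = "sqrk/Sep29-Mixat-whisper-lg-3-translation-0.1trainasval"  # assumed repo id
model = WhisperForConditionalGeneration.from_pretrained(model_id)
processor = WhisperProcessor.from_pretrained(model_id)

audio = np.zeros(16000, dtype=np.float32)  # placeholder: 1 s of silence at 16 kHz
inputs = processor(audio, sampling_rate=16000, return_tensors="pt")

# The saved default is task="transcribe"; passing task="translate" uses the
# translation task token (50359) from task_to_id instead.
ids = model.generate(inputs.input_features, language="arabic", task="translate")
print(processor.batch_decode(ids, skip_special_tokens=True)[0])
```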
model-00001-of-00002.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:758dbd56e606e5070d9a348910de88e0cf4180d993b0271e04d9fd104b3cb6f1
+ oid sha256:80999bbd8945ee105a538c7d87c9f4f3acf23ffe7d4012dfe97d893440b254b3
  size 4993448880
model-00002-of-00002.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:b42a8b464069fe155d1d6315d17179a04650503d915312295809070b48004601
+ oid sha256:3045c44fb10f9a9368e254789cbc962b08221af9cb51bd2b86cc4a3c9d6c8494
  size 1180663192
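
The safetensors entries above are Git LFS pointer files: only the `oid sha256:` checksum and size changed, while the actual weights live in LFS storage. A download can be verified against the pointer's checksum; a minimal sketch, assuming the shard has already been fetched to the working directory:

```python
# Sketch: verify a downloaded shard against the sha256 oid in its LFS pointer.
import hashlib

expected = "80999bbd8945ee105a538c7d87c9f4f3acf23ffe7d4012dfe97d893440b254b3"

h = hashlib.sha256()
with open("model-00001-of-00002.safetensors", "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # hash in 1 MiB chunks
        h.update(chunk)

assert h.hexdigest() == expected, "checksum mismatch"
print("OK:", h.hexdigest())
```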