# Whisper Fine Tuned

A collection of fine-tuned Whisper models. Code: https://github.com/JacobLinCool/wft
This model is a fine-tuned version of openai/whisper-large-v3-turbo on the JacobLinCool/mozilla-foundation-common_voice_16_1-zh-TW-preprocessed dataset. It achieves the following results on the evaluation set at the end of training (final row of the results table below):

- Loss: 0.2346
- WER: 38.5450
- CER: 10.8963
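As a minimal usage sketch, the checkpoint can be loaded with the `transformers` ASR pipeline. The repository id below is a placeholder, since this card does not state where the fine-tuned weights are published:

```python
# Minimal transcription sketch with the transformers ASR pipeline.
# "your-username/your-finetuned-whisper" is a placeholder repository id;
# this card does not name the actual fine-tuned checkpoint.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/your-finetuned-whisper",
)

# Transcribe a local audio file (the path is illustrative).
print(asr("sample.wav")["text"])
```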
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
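The dataset named above is hosted on the Hugging Face Hub and can be loaded with the `datasets` library. This is a sketch; the available splits and columns are assumptions to verify against the dataset repository:

```python
# Sketch: load the preprocessed zh-TW Common Voice dataset named in this card.
from datasets import load_dataset

ds = load_dataset(
    "JacobLinCool/mozilla-foundation-common_voice_16_1-zh-TW-preprocessed"
)
print(ds)  # inspect the actual splits and columns before training
```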
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
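The hyperparameter list itself did not survive in this card. Purely as an illustration of how such a configuration is expressed with `transformers`, a `Seq2SeqTrainingArguments` sketch might look as follows; every value is a placeholder except the epoch count and per-epoch evaluation, which match the results table below:

```python
# Illustrative placeholders only: the actual hyperparameters for this run
# are not recorded in this card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-finetuned",  # placeholder path
    per_device_train_batch_size=16,    # placeholder value
    learning_rate=1e-5,                # placeholder value
    num_train_epochs=10,               # the results table spans 10 epochs
    eval_strategy="epoch",             # the table logs validation once per epoch
    predict_with_generate=True,        # decode with generate() so WER/CER can be computed
)
```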
### Training results

| Training Loss | Epoch  | Step | Validation Loss | WER     | CER     |
|:-------------:|:------:|:----:|:---------------:|:-------:|:-------:|
| No log        | 0      | 0    | 2.7503          | 76.5675 | 20.3917 |
| 0.9352        | 0.9987 | 377  | 0.2472          | 47.9301 | 13.6656 |
| 0.7300        | 1.9980 | 754  | 0.2502          | 47.0056 | 13.5652 |
| 0.4985        | 2.9974 | 1131 | 0.2559          | 46.2018 | 13.7057 |
| 0.1928        | 3.9993 | 1509 | 0.2595          | 45.9606 | 13.0906 |
| 0.2539        | 4.9987 | 1886 | 0.2522          | 44.7950 | 13.1459 |
| 0.0607        | 5.9980 | 2263 | 0.2422          | 44.7548 | 12.5006 |
| 0.0826        | 6.9974 | 2640 | 0.2488          | 43.8907 | 12.4906 |
| 0.0151        | 7.9993 | 3018 | 0.2403          | 40.2331 | 11.4537 |
| 0.0056        | 8.9987 | 3395 | 0.2390          | 39.8312 | 11.5290 |
| 0.0056        | 9.9927 | 3770 | 0.2346          | 38.5450 | 10.8963 |
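A note on the metrics: WER and CER of this kind are typically computed with the `evaluate` library (backed by `jiwer`), and the table values appear to be scaled to percentages. For zh-TW, CER is often the more informative of the two, since WER depends on how the Chinese text is segmented into words. A minimal sketch with made-up strings:

```python
# Minimal WER/CER computation sketch with the `evaluate` library.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

predictions = ["今天 天氣 很 好"]  # hypothetical model output
references = ["今天 天氣 很好"]    # hypothetical ground truth

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```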