# Whisper Turbo ko
This model is a fine-tuned version of openai/whisper-large-v3-turbo on a custom dataset. It achieves the following results on the evaluation set:
- Loss: 0.0713
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 64
- eval_batch_size: 256
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 200
- training_steps: 1000
- mixed_precision_training: Native AMP
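
With `lr_scheduler_type: linear` and 200 warmup steps, the learning rate ramps from 0 to the peak of 0.001 over the first 200 steps, then decays linearly to 0 at step 1000. A minimal sketch of that schedule in plain Python (an approximation of what the Transformers linear scheduler computes per step):

```python
def linear_lr(step, base_lr=1e-3, warmup_steps=200, total_steps=1000):
    """Linear warmup followed by linear decay to zero."""
    if step < warmup_steps:
        # Warmup: ramp from 0 up to base_lr.
        return base_lr * step / warmup_steps
    # Decay: from base_lr at the end of warmup down to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

peak = linear_lr(200)   # peak LR right after warmup: 0.001
mid = linear_lr(600)    # halfway through the decay phase: 0.0005
```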
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
0.7868 | 0.0556 | 10 | 1.9245 |
0.6788 | 0.1111 | 20 | 1.7030 |
0.3649 | 0.1667 | 30 | 1.3972 |
0.2325 | 0.2222 | 40 | 1.2200 |
0.1942 | 0.2778 | 50 | 1.0071 |
0.1721 | 0.3333 | 60 | 0.8545 |
0.1254 | 0.3889 | 70 | 0.7825 |
0.1236 | 0.4444 | 80 | 0.7093 |
0.1112 | 0.5 | 90 | 0.6678 |
0.1171 | 0.5556 | 100 | 0.6197 |
0.126 | 0.6111 | 110 | 0.5429 |
0.1121 | 0.6667 | 120 | 0.5091 |
0.0969 | 0.7222 | 130 | 0.4473 |
0.0983 | 0.7778 | 140 | 0.4367 |
0.099 | 0.8333 | 150 | 0.4170 |
0.1149 | 0.8889 | 160 | 0.3990 |
0.0904 | 0.9444 | 170 | 0.3741 |
0.0877 | 1.0 | 180 | 0.3511 |
0.0701 | 1.0556 | 190 | 0.3545 |
0.0714 | 1.1111 | 200 | 0.3226 |
0.1064 | 1.1667 | 210 | 0.3853 |
0.0953 | 1.2222 | 220 | 0.3814 |
0.1059 | 1.2778 | 230 | 0.3946 |
0.085 | 1.3333 | 240 | 0.3276 |
0.0947 | 1.3889 | 250 | 0.3017 |
0.0832 | 1.4444 | 260 | 0.3057 |
0.0829 | 1.5 | 270 | 0.2909 |
0.0834 | 1.5556 | 280 | 0.2573 |
0.0758 | 1.6111 | 290 | 0.2458 |
0.0962 | 1.6667 | 300 | 0.2393 |
0.0766 | 1.7222 | 310 | 0.2512 |
0.0849 | 1.7778 | 320 | 0.2315 |
0.0779 | 1.8333 | 330 | 0.2261 |
0.0726 | 1.8889 | 340 | 0.2388 |
0.0708 | 1.9444 | 350 | 0.2266 |
0.0749 | 2.0 | 360 | 0.2096 |
0.0691 | 2.0556 | 370 | 0.2128 |
0.0452 | 2.1111 | 380 | 0.2102 |
0.0587 | 2.1667 | 390 | 0.1989 |
0.0538 | 2.2222 | 400 | 0.1903 |
0.0496 | 2.2778 | 410 | 0.1804 |
0.0462 | 2.3333 | 420 | 0.1787 |
0.0535 | 2.3889 | 430 | 0.1749 |
0.0582 | 2.4444 | 440 | 0.1683 |
0.0641 | 2.5 | 450 | 0.1703 |
0.0532 | 2.5556 | 460 | 0.1675 |
0.0561 | 2.6111 | 470 | 0.1664 |
0.091 | 2.6667 | 480 | 0.1475 |
0.0575 | 2.7222 | 490 | 0.1453 |
0.0483 | 2.7778 | 500 | 0.1485 |
0.0495 | 2.8333 | 510 | 0.1401 |
0.0529 | 2.8889 | 520 | 0.1368 |
0.0543 | 2.9444 | 530 | 0.1411 |
0.0508 | 3.0 | 540 | 0.1323 |
0.033 | 3.0556 | 550 | 0.1320 |
0.0447 | 3.1111 | 560 | 0.1285 |
0.0354 | 3.1667 | 570 | 0.1194 |
0.0343 | 3.2222 | 580 | 0.1177 |
0.0317 | 3.2778 | 590 | 0.1121 |
0.0314 | 3.3333 | 600 | 0.1107 |
0.0306 | 3.3889 | 610 | 0.1114 |
0.0337 | 3.4444 | 620 | 0.1087 |
0.0486 | 3.5 | 630 | 0.1150 |
0.0394 | 3.5556 | 640 | 0.1054 |
0.036 | 3.6111 | 650 | 0.1067 |
0.0338 | 3.6667 | 660 | 0.1005 |
0.0326 | 3.7222 | 670 | 0.1000 |
0.0427 | 3.7778 | 680 | 0.1017 |
0.0359 | 3.8333 | 690 | 0.0963 |
0.0419 | 3.8889 | 700 | 0.0987 |
0.0527 | 3.9444 | 710 | 0.1002 |
0.0345 | 4.0 | 720 | 0.0944 |
0.0252 | 4.0556 | 730 | 0.0927 |
0.0275 | 4.1111 | 740 | 0.0879 |
0.0426 | 4.1667 | 750 | 0.1208 |
0.0274 | 4.2222 | 760 | 0.1156 |
0.0246 | 4.2778 | 770 | 0.1141 |
0.0212 | 4.3333 | 780 | 0.1144 |
0.0232 | 4.3889 | 790 | 0.1097 |
0.0311 | 4.4444 | 800 | 0.1081 |
0.0313 | 4.5 | 810 | 0.1090 |
0.0228 | 4.5556 | 820 | 0.1064 |
0.0215 | 4.6111 | 830 | 0.1026 |
0.0241 | 4.6667 | 840 | 0.1009 |
0.0222 | 4.7222 | 850 | 0.0987 |
0.0235 | 4.7778 | 860 | 0.0982 |
0.0267 | 4.8333 | 870 | 0.0970 |
0.0273 | 4.8889 | 880 | 0.0966 |
0.0247 | 4.9444 | 890 | 0.0940 |
0.0247 | 5.0 | 900 | 0.0931 |
0.0161 | 5.0556 | 910 | 0.0931 |
0.0179 | 5.1111 | 920 | 0.0929 |
0.0511 | 5.1667 | 930 | 0.0798 |
0.0175 | 5.2222 | 940 | 0.0742 |
0.0186 | 5.2778 | 950 | 0.0724 |
0.0169 | 5.3333 | 960 | 0.0716 |
0.018 | 5.3889 | 970 | 0.0712 |
0.0177 | 5.4444 | 980 | 0.0713 |
0.0182 | 5.5 | 990 | 0.0713 |
0.0184 | 5.5556 | 1000 | 0.0713 |
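
From the table, epoch 1.0 falls at step 180, i.e. one epoch is roughly 180 optimizer steps; combined with the train batch size of 64 this gives a back-of-the-envelope estimate of the training-set size (assuming no gradient accumulation):

```python
steps_per_epoch = 180                 # from the table: epoch 1.0 at step 180
train_batch_size = 64                 # from the hyperparameters above
approx_train_examples = steps_per_epoch * train_batch_size  # ~11,520 examples

total_epochs = 1000 / steps_per_epoch  # 1000 training steps in total
# total_epochs comes out to ~5.56, matching the epoch of the final table row
```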
### Framework versions
- PEFT 0.14.0
- Transformers 4.47.1
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0