---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-small
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: Moroccan-Darija-STT-small-v1.6.12
  results: []
---

# Moroccan-Darija-STT-small-v1.6.12

This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co./openai/whisper-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.4252
- WER: 92.0934
- CER: 58.5359

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1.25e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 10
- num_epochs: 6

### Training results

| Training Loss | Epoch  | Step | Validation Loss | WER      | CER     |
|:-------------:|:------:|:----:|:---------------:|:--------:|:-------:|
| 0.6307        | 0.1775 | 30   | 0.4701          | 105.2209 | 63.1049 |
| 0.5074        | 0.3550 | 60   | 0.4290          | 96.8541  | 55.8502 |
| 0.4718        | 0.5325 | 90   | 0.4017          | 100.9454 | 58.4092 |
| 0.4624        | 0.7101 | 120  | 0.3961          | 94.0930  | 52.6933 |
| 0.4522        | 0.8876 | 150  | 0.3964          | 86.7888  | 47.0770 |
| 0.3857        | 1.0651 | 180  | 0.3818          | 109.6804 | 72.2092 |
| 0.4146        | 1.2426 | 210  | 0.3745          | 105.5054 | 69.9272 |
| 0.3874        | 1.4201 | 240  | 0.3826          | 89.1148  | 51.8386 |
| 0.4055        | 1.5976 | 270  | 0.4276          | 101.4307 | 62.0036 |
| 0.4487        | 1.7751 | 300  | 0.4205          | 92.6121  | 54.7743 |
| 0.4128        | 1.9527 | 330  | 0.4344          | 89.1650  | 53.9128 |
| 0.422         | 2.1302 | 360  | 0.4396          | 100.8450 | 64.9291 |
| 0.4195        | 2.3077 | 390  | 0.4374          | 97.0298  | 63.1134 |
| 0.4009        | 2.4852 | 420  | 0.4387          | 104.1499 | 69.3073 |
| 0.3506        | 2.6627 | 450  | 0.4350          | 95.8584  | 60.6827 |
| 0.4044        | 2.8402 | 480  | 0.4302          | 108.3333 | 74.0469 |
| 0.3977        | 3.0178 | 510  | 0.4269          | 94.4863  | 55.5715 |
| 0.3751        | 3.1953 | 540  | 0.4295          | 93.5910  | 59.0207 |
| 0.3689        | 3.3728 | 570  | 0.4362          | 96.8039  | 64.7889 |
| 0.3671        | 3.5503 | 600  | 0.4277          | 94.7707  | 60.2047 |
| 0.3646        | 3.7278 | 630  | 0.4256          | 128.2463 | 87.3368 |
| 0.361         | 3.9053 | 660  | 0.4187          | 92.8129  | 58.0460 |
| 0.3598        | 4.0828 | 690  | 0.4181          | 87.0482  | 51.4771 |
| 0.3603        | 4.2604 | 720  | 0.4336          | 105.2293 | 67.8361 |
| 0.3462        | 4.4379 | 750  | 0.4282          | 96.0592  | 61.0560 |
| 0.3581        | 4.6154 | 780  | 0.4266          | 90.8718  | 52.9517 |
| 0.3712        | 4.7929 | 810  | 0.4337          | 89.7507  | 54.4364 |
| 0.3834        | 4.9704 | 840  | 0.4425          | 97.4565  | 63.3633 |
| 0.3384        | 5.1479 | 870  | 0.4276          | 90.6710  | 55.8384 |
| 0.3646        | 5.3254 | 900  | 0.4219          | 90.5120  | 55.1611 |
| 0.3516        | 5.5030 | 930  | 0.4210          | 96.6449  | 60.1439 |
| 0.362         | 5.6805 | 960  | 0.4361          | 96.9545  | 62.9985 |
| 0.342         | 5.8580 | 990  | 0.4252          | 92.0934  | 58.5359 |

### Framework versions

- Transformers 4.48.0
- Pytorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.21.0
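
The WER and CER values reported above are percentage-scale edit-distance metrics: Levenshtein distance over words (WER) or characters (CER), divided by the reference length. As a minimal stdlib-only sketch (evaluation pipelines typically use a library such as `jiwer` or `evaluate` instead; this is illustrative only, and the example strings are invented):

```python
def edit_distance(ref, hyp):
    # Classic dynamic-programming Levenshtein distance over two sequences,
    # counting insertions, deletions, and substitutions (cost 1 each).
    m, n = len(ref), len(hyp)
    dp = list(range(n + 1))  # distance from empty prefix of ref
    for i in range(1, m + 1):
        prev, dp[0] = dp[0], i
        for j in range(1, n + 1):
            cur = dp[j]
            dp[j] = min(
                dp[j] + 1,                              # deletion
                dp[j - 1] + 1,                          # insertion
                prev + (ref[i - 1] != hyp[j - 1]),      # substitution/match
            )
            prev = cur
    return dp[n]

def wer(reference: str, hypothesis: str) -> float:
    # Word error rate: word-level edit distance / number of reference words.
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference: str, hypothesis: str) -> float:
    # Character error rate: character-level edit distance / reference length.
    return edit_distance(reference, hypothesis) / len(reference)

# One word substituted out of three -> WER of 1/3 (33.3 on this card's scale).
print(wer("salam labas bikhir", "salam labes bikhir"))
```

Note that WER can exceed 100 (as in several rows of the table) when the hypothesis contains more insertions and substitutions than the reference has words.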