
Whisper Tiny Hu CV18

This model is a fine-tuned version of openai/whisper-tiny on the Hungarian (hu) subset of the Common Voice 18.0 dataset. It achieves the following results on the evaluation set:

  • Loss: 1.2308
  • Wer Ortho: 51.3968
  • Wer: 46.1952
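
For quick transcription, a minimal usage sketch is shown below. It assumes the checkpoint is published as sarpba/whisper-tiny-cv18-hu-cleaned (the repository id from this card) and uses the standard `transformers` speech-recognition pipeline; the audio file name is a placeholder.

```python
# Minimal sketch (assumptions noted above): transcribe a Hungarian audio file
# with the fine-tuned checkpoint via the transformers ASR pipeline.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="sarpba/whisper-tiny-cv18-hu-cleaned",
)

# Optionally force Hungarian decoding; Whisper otherwise auto-detects the language.
result = asr("sample_hu.wav", generate_kwargs={"language": "hungarian"})
print(result["text"])
```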

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 7.5e-05
  • train_batch_size: 64
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 5
  • mixed_precision_training: Native AMP
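
As a reference point, the sketch below maps the listed settings onto `Seq2SeqTrainingArguments` from `transformers`. Only the values listed above come from this card; the output directory and the evaluation cadence (inferred from the 250-step intervals in the results table) are assumptions.

```python
# Hedged sketch: mirror the listed hyperparameters with Seq2SeqTrainingArguments.
# output_dir is assumed; the Adam betas/epsilon above match the Trainer defaults.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-tiny-cv18-hu",   # assumed, not stated in the card
    learning_rate=7.5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=5,
    fp16=True,                             # "Native AMP" mixed-precision training
    eval_strategy="steps",                 # evaluation every 250 steps, per the results table
    eval_steps=250,
    predict_with_generate=True,            # generate text so WER can be computed
)
```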

Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer Ortho | Wer     |
|:-------------:|:------:|:----:|:---------------:|:---------:|:-------:|
| 0.5967        | 0.1723 | 250  | 1.2539          | 69.9246   | 66.9344 |
| 0.4456        | 0.3446 | 500  | 1.2200          | 65.5249   | 61.7108 |
| 0.3713        | 0.5169 | 750  | 1.1422          | 61.8665   | 58.0019 |
| 0.3337        | 0.6892 | 1000 | 1.1139          | 60.4332   | 55.7999 |
| 0.2829        | 0.8615 | 1250 | 1.1074          | 59.6528   | 56.2502 |
| 0.1931        | 1.0338 | 1500 | 1.1087          | 58.2686   | 53.8969 |
| 0.1855        | 1.2061 | 1750 | 1.1643          | 57.5828   | 52.8577 |
| 0.1827        | 1.3784 | 2000 | 1.1136          | 58.2951   | 54.2260 |
| 0.177         | 1.5507 | 2250 | 1.1326          | 57.4353   | 52.1628 |
| 0.1686        | 1.7229 | 2500 | 1.0970          | 54.8396   | 49.9067 |
| 0.1654        | 1.8952 | 2750 | 1.0957          | 56.3953   | 51.6975 |
| 0.0886        | 2.0675 | 3000 | 1.1150          | 53.5349   | 48.7805 |
| 0.0966        | 2.2398 | 3250 | 1.1417          | 54.4060   | 49.0113 |
| 0.0921        | 2.4121 | 3500 | 1.1387          | 53.9975   | 48.6001 |
| 0.0968        | 2.5844 | 3750 | 1.1587          | 53.8147   | 49.2660 |
| 0.0968        | 2.7567 | 4000 | 1.1459          | 52.8176   | 48.4438 |
| 0.086         | 2.9290 | 4250 | 1.1298          | 52.4784   | 47.4702 |
| 0.0456        | 3.1013 | 4500 | 1.1714          | 52.6663   | 47.3920 |
| 0.0487        | 3.2736 | 4750 | 1.1730          | 52.9524   | 48.1499 |
| 0.0475        | 3.4459 | 5000 | 1.1945          | 52.7898   | 47.3668 |
| 0.0442        | 3.6182 | 5250 | 1.2042          | 52.3410   | 47.3037 |
| 0.0434        | 3.7905 | 5500 | 1.1851          | 53.0205   | 47.7262 |
| 0.0438        | 3.9628 | 5750 | 1.1912          | 52.5869   | 48.0541 |
| 0.0211        | 4.1351 | 6000 | 1.2191          | 52.3562   | 47.9923 |
| 0.0198        | 4.3074 | 6250 | 1.2203          | 51.6136   | 46.6290 |
| 0.0185        | 4.4797 | 6500 | 1.2287          | 52.0297   | 46.4537 |
| 0.0196        | 4.6520 | 6750 | 1.2363          | 51.7183   | 46.2696 |
| 0.0196        | 4.8243 | 7000 | 1.2320          | 51.5594   | 46.0817 |
| 0.0178        | 4.9966 | 7250 | 1.2308          | 51.3968   | 46.1952 |
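
For context on the two WER columns, the sketch below shows how such figures are typically computed with the `evaluate` library. Treating "Wer Ortho" as WER on raw text and "Wer" as WER after basic text normalization follows the common Whisper fine-tuning recipe and is an assumption, since this card does not state the exact normalizer used; the transcripts are placeholders.

```python
# Hedged sketch: word error rate on raw ("orthographic") vs. normalized text.
import evaluate
from transformers.models.whisper.english_normalizer import BasicTextNormalizer

wer_metric = evaluate.load("wer")
normalizer = BasicTextNormalizer()  # assumption: lowercasing + punctuation stripping

references = ["Jó reggelt kívánok!"]   # placeholder reference transcript
predictions = ["jó reggelt kivánok"]   # placeholder model output

wer_ortho = 100 * wer_metric.compute(references=references, predictions=predictions)
wer = 100 * wer_metric.compute(
    references=[normalizer(r) for r in references],
    predictions=[normalizer(p) for p in predictions],
)
print(f"WER ortho: {wer_ortho:.2f}  WER normalized: {wer:.2f}")
```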

Framework versions

  • Transformers 4.44.0
  • Pytorch 2.3.0+cu121
  • Datasets 2.21.0
  • Tokenizers 0.19.1