
whisper-large-cit-do1.5-wd1e-3

This model is a fine-tuned version of openai/whisper-large-v3 on the SF 200 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6797
  • WER: 33.6384
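
The card ships no usage example. As a minimal sketch, the checkpoint should load through the standard transformers ASR pipeline; the model id below is this repository, while the device argument and the audio path are placeholders:

```python
import torch
from transformers import pipeline

# Load this checkpoint via the ASR pipeline; fp16 matches the stored tensor type.
asr = pipeline(
    "automatic-speech-recognition",
    model="Makkoen/whisper-large-cit-do1.5-wd1e-3",
    torch_dtype=torch.float16,
    device="cuda:0",  # placeholder: drop this argument to run on CPU
)

# "sample.wav" is a placeholder for a local audio file.
print(asr("sample.wav")["text"])
```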

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto Seq2SeqTrainingArguments follows the list):

  • learning_rate: 1e-06
  • train_batch_size: 4
  • eval_batch_size: 8
  • seed: 42
  • distributed_type: multi-GPU
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 100
  • training_steps: 200
  • mixed_precision_training: Native AMP
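
For reference, here is a minimal sketch of how these values map onto transformers' Seq2SeqTrainingArguments. The output directory is a placeholder, and the multi-GPU launch (e.g. via torchrun) is assumed to happen outside this snippet:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-cit-do1.5-wd1e-3",  # placeholder path
    learning_rate=1e-6,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,  # batch 4 x 4 accumulation -> effective batch of 16
    lr_scheduler_type="linear",
    warmup_steps=100,
    max_steps=200,
    fp16=True,  # "Native AMP" mixed precision
)
```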

Training results

| Training Loss | Epoch   | Step | Validation Loss | WER (%) |
|:-------------:|:-------:|:----:|:---------------:|:-------:|
| 1.1289        | 0.8889  | 10   | 1.1191          | 48.9703 |
| 1.095         | 1.7778  | 20   | 1.0078          | 40.9611 |
| 0.936         | 2.6667  | 30   | 0.8691          | 39.3593 |
| 0.7555        | 3.5556  | 40   | 0.7930          | 33.6384 |
| 0.7013        | 4.4444  | 50   | 0.7202          | 34.7826 |
| 0.6006        | 5.3333  | 60   | 0.6553          | 32.4943 |
| 0.5082        | 6.2222  | 70   | 0.6172          | 31.5789 |
| 0.4133        | 7.1111  | 80   | 0.5908          | 33.4096 |
| 0.3771        | 8.0     | 90   | 0.5728          | 32.4943 |
| 0.3013        | 8.8889  | 100  | 0.5693          | 33.4096 |
| 0.266         | 9.7778  | 110  | 0.5728          | 33.4096 |
| 0.2148        | 10.6667 | 120  | 0.5830          | 32.2654 |
| 0.1829        | 11.5556 | 130  | 0.5947          | 32.7231 |
| 0.1531        | 12.4444 | 140  | 0.6069          | 31.3501 |
| 0.1246        | 13.3333 | 150  | 0.6206          | 34.0961 |
| 0.1186        | 14.2222 | 160  | 0.6353          | 33.1808 |
| 0.1013        | 15.1111 | 170  | 0.6533          | 35.0114 |
| 0.0869        | 16.0    | 180  | 0.6650          | 33.6384 |
| 0.0812        | 16.8889 | 190  | 0.6763          | 33.1808 |
| 0.0763        | 17.7778 | 200  | 0.6797          | 33.6384 |
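
The WER values above are on the percent scale. They were most likely computed with the jiwer-backed "wer" metric from the evaluate library; a minimal sketch with placeholder transcripts:

```python
import evaluate

wer_metric = evaluate.load("wer")  # jiwer-backed word error rate

# Placeholder strings; in practice these come from decoding the eval split.
predictions = ["the quick brown fox"]
references = ["the quick brown box"]

# compute() returns a fraction; multiply by 100 for the percent scale used above.
print(100 * wer_metric.compute(predictions=predictions, references=references))
```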

Framework versions

  • Transformers 4.41.1
  • PyTorch 1.13.1+cu117
  • Datasets 2.19.1
  • Tokenizers 0.19.1