---
library_name: transformers
language:
  - ar
license: apache-2.0
base_model: openai/whisper-tiny
tags:
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: Whisper tiny AR - BH
    results: []
---

# Whisper tiny AR - BH

This model is a fine-tuned version of openai/whisper-tiny on the quran-ayat-speech-to-text dataset. It achieves the following results on the evaluation set:

- Loss: 0.0256
- Wer: 0.1250
- Cer: 0.0445
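
For quick transcription, the checkpoint can be loaded with the `transformers` ASR pipeline. The following is a minimal sketch, not part of the original card: the Hub repository id and the audio file name are placeholders to be replaced with actual values.

```python
# Minimal inference sketch (not from the original card); the model id and the
# audio path below are placeholders for the actual Hub repository and file.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="<hub-repo-id-of-this-checkpoint>",  # placeholder Hub id
)

# Transcribe a local Arabic audio file (any format ffmpeg can decode).
result = asr("recitation_sample.wav")  # placeholder file name
print(result["text"])
```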

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a configuration sketch that mirrors them follows the list):

- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 20
- mixed_precision_training: Native AMP
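
The sketch below reconstructs these hyperparameters with the standard `Seq2SeqTrainingArguments` API; the output directory is a placeholder, and the data pipeline, model, and trainer setup are omitted.

```python
# Configuration sketch mirroring the listed hyperparameters; output_dir is a
# placeholder. The default AdamW optimizer already uses betas=(0.9, 0.999)
# and epsilon=1e-08, matching the card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-tiny-ar-bh",  # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,      # effective train batch size: 32
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=20,
    fp16=True,                          # native AMP mixed precision
)
```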

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Wer    | Cer    |
|:-------------:|:-------:|:----:|:---------------:|:------:|:------:|
| 0.0263        | 0.9973  | 187  | 0.0205          | 0.1624 | 0.0613 |
| 0.0093        | 2.0     | 375  | 0.0149          | 0.1519 | 0.0529 |
| 0.0051        | 2.9973  | 562  | 0.0157          | 0.1580 | 0.0512 |
| 0.004         | 4.0     | 750  | 0.0181          | 0.1636 | 0.0539 |
| 0.002         | 4.9973  | 937  | 0.0193          | 0.1557 | 0.0502 |
| 0.0011        | 6.0     | 1125 | 0.0206          | 0.1558 | 0.0506 |
| 0.0009        | 6.9973  | 1312 | 0.0213          | 0.1513 | 0.0498 |
| 0.0005        | 8.0     | 1500 | 0.0214          | 0.1544 | 0.0504 |
| 0.0004        | 8.9973  | 1687 | 0.0220          | 0.1464 | 0.0458 |
| 0.0004        | 10.0    | 1875 | 0.0216          | 0.1459 | 0.0461 |
| 0.0002        | 10.9973 | 2062 | 0.0224          | 0.1452 | 0.0454 |
| 0.0001        | 12.0    | 2250 | 0.0224          | 0.1437 | 0.0452 |
| 0.0001        | 12.9973 | 2437 | 0.0234          | 0.2224 | 0.0832 |
| 0.0           | 14.0    | 2625 | 0.0231          | 0.1356 | 0.0540 |
| 0.0           | 14.9973 | 2812 | 0.0236          | 0.2134 | 0.0797 |
| 0.0           | 16.0    | 3000 | 0.0241          | 0.2159 | 0.0796 |
| 0.0           | 16.9973 | 3187 | 0.0253          | 0.1338 | 0.0517 |
| 0.0           | 18.0    | 3375 | 0.0257          | 0.1271 | 0.0493 |
| 0.0           | 18.9973 | 3562 | 0.0264          | 0.1287 | 0.0492 |
| 0.0           | 19.9467 | 3740 | 0.0266          | 0.1280 | 0.0489 |
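
The Wer and Cer columns report word and character error rates as fractions (e.g. 0.1250 corresponds to roughly 12.5% WER). Below is a hedged sketch of how such scores can be computed with the `evaluate` library; the reference and prediction strings are illustrative only, not taken from the evaluation set.

```python
# Sketch of WER/CER computation with the `evaluate` library; the strings below
# are illustrative placeholders, not data from the quran-ayat-speech-to-text set.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["بسم الله الرحمن الرحيم"]   # ground-truth transcript (example)
predictions = ["بسم الله الرحمن الرحيم"]  # model output (example)

print("WER:", wer_metric.compute(references=references, predictions=predictions))
print("CER:", cer_metric.compute(references=references, predictions=predictions))
```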

### Framework versions

- Transformers 4.44.2
- PyTorch 2.4.1+cu121
- Datasets 3.2.0
- Tokenizers 0.19.1