---
language:
  - ko
license: apache-2.0
base_model: openai/whisper-small
tags:
  - hf-asr-leaderboard
  - generated_from_trainer
datasets:
  - Marcusxx/chungnam_firestation
model-index:
  - name: chungnam_firestation_small_model
    results: []
---

chungnam_firestation_small_model

This model is a fine-tuned version of openai/whisper-small on the Marcusxx/chungnam_firestation dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0667
  • Cer: 165.4388
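
As a minimal usage sketch, the checkpoint can be loaded through the Transformers ASR pipeline. The Hub repo id below is inferred from this card's metadata, and the audio path is a placeholder:

```python
from transformers import pipeline

# Hub repo id assumed from the card's author and model name; adjust if it differs.
MODEL_ID = "Marcusxx/chungnam_firestation_small_model"

def transcribe(audio_path: str) -> str:
    """Transcribe a Korean audio file with the fine-tuned Whisper checkpoint.

    Downloads the model weights from the Hub on the first call.
    """
    asr = pipeline("automatic-speech-recognition", model=MODEL_ID)
    return asr(audio_path)["text"]

# Example (requires a local audio file):
#   print(transcribe("recording.wav"))
```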

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 250
  • training_steps: 2000
  • mixed_precision_training: Native AMP
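
The hyperparameters above can be expressed as a `Seq2SeqTrainingArguments` sketch; `output_dir` is a placeholder, and the Adam betas/epsilon listed are the library defaults, so they are not set explicitly:

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the training configuration listed above (output_dir is illustrative).
training_args = Seq2SeqTrainingArguments(
    output_dir="./chungnam_firestation_small_model",
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=250,
    max_steps=2000,
    fp16=True,  # "Native AMP" mixed-precision training
)
```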

Training results

| Training Loss | Epoch   | Step | Validation Loss | Cer      |
|---------------|---------|------|-----------------|----------|
| 0.439         | 1.3245  | 200  | 0.4522          | 37.7842  |
| 0.075         | 2.6490  | 400  | 0.1504          | 189.0647 |
| 0.0296        | 3.9735  | 600  | 0.0767          | 102.6187 |
| 0.0049        | 5.2980  | 800  | 0.0577          | 196.9784 |
| 0.0015        | 6.6225  | 1000 | 0.0607          | 241.2950 |
| 0.0005        | 7.9470  | 1200 | 0.0629          | 161.6691 |
| 0.0004        | 9.2715  | 1400 | 0.0650          | 173.6978 |
| 0.0004        | 10.5960 | 1600 | 0.0660          | 173.4964 |
| 0.0003        | 11.9205 | 1800 | 0.0666          | 167.5108 |
| 0.0003        | 13.2450 | 2000 | 0.0667          | 165.4388 |

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.2.2+cu121
  • Datasets 3.2.0
  • Tokenizers 0.19.1