---
language:
  - 'no'
license: apache-2.0
base_model: NbAiLab/nb-whisper-tiny-RC1
tags:
  - audio
  - asr
  - automatic-speech-recognition
  - hf-asr-leaderboard
model-index:
  - name: nb-whisper-tiny-v0.8-vad3
    results: []
---

# nb-whisper-tiny-v0.8-vad3

This model is a fine-tuned version of NbAiLab/nb-whisper-tiny-RC1 on the NbAiLab/ncc_speech_styling_v2_vad3 dataset.
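
For transcription, the model can be loaded with the Transformers `pipeline` API. The sketch below is illustrative only: the Hub repository id `NbAiLab/nb-whisper-tiny` is inferred from the file path of this card, and the audio file name is a placeholder.

```python
# Minimal transcription sketch. The repository id "NbAiLab/nb-whisper-tiny" and the
# local file "audio.mp3" are assumptions for illustration, not taken from this card.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="NbAiLab/nb-whisper-tiny",  # assumed Hub repository id
    chunk_length_s=30,                # Whisper operates on 30-second windows
)

result = asr(
    "audio.mp3",  # any local audio file (decoding requires ffmpeg)
    generate_kwargs={"task": "transcribe", "language": "no"},
)
print(result["text"])
```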

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.00015
- lr_scheduler_type: linear
- per_device_train_batch_size: 32
- total_train_batch_size_per_node: 128
- total_train_batch_size: 1024
- total_optimization_steps: 50,000
- starting_optimization_step: None
- finishing_optimization_step: 50,000
- num_train_dataset_workers: 32
- num_hosts: 8
- total_num_training_examples: 51,200,000
- steps_per_epoch: 7455
- num_beams: None
- weight_decay: 0.01
- adam_beta1: 0.9
- adam_beta2: 0.98
- adam_epsilon: 1e-06
- dropout: True
- bpe_dropout_probability: 0.2
- activation_dropout_probability: 0.1
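
The batch-size figures are internally consistent: 32 examples per device × 4 devices per node gives 128 per node, and 8 hosts give 1,024 examples per optimization step; 1,024 × 50,000 steps matches the 51,200,000 total training examples. As an illustration only, the optimizer-related values above correspond to an AdamW setup with a linear learning-rate schedule; the `optax` sketch below assumes a Flax/JAX training loop and is not the project's actual training script.

```python
# Illustrative only: an AdamW optimizer with a linear learning-rate schedule that
# mirrors the values listed above. The real NbAiLab training script is not shown here.
import optax

total_steps = 50_000  # total_optimization_steps

# Linear decay from the peak learning rate to zero over the whole run
# (warmup, if any, is not documented in this card).
schedule = optax.linear_schedule(
    init_value=0.00015,  # learning_rate
    end_value=0.0,
    transition_steps=total_steps,
)

optimizer = optax.adamw(
    learning_rate=schedule,
    b1=0.9,              # adam_beta1
    b2=0.98,             # adam_beta2
    eps=1e-6,            # adam_epsilon
    weight_decay=0.01,   # weight_decay
)
```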

### Training results

| step | validation_nst_loss | train_loss | validation_nst_wer | validation_nst_cer | validation_nst_exact_wer | validation_nst_exact_cer | validation_clean_stortinget_no_loss | validation_clean_stortinget_no_wer | validation_clean_stortinget_no_cer | validation_clean_stortinget_no_exact_wer | validation_clean_stortinget_no_exact_cer |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 0.5115 | 1.2809 | 7.9808 | 2.5868 | 8.9172 | 2.7299 | 0.6754 | 16.0516 | 8.8512 | 19.8245 | 9.5377 |
| 5000 | 0.5513 | 0.7948 | 9.8699 | 3.1480 | 10.7083 | 3.2666 | 0.8315 | 18.2473 | 9.7512 | 22.3081 | 10.4913 |
| 10000 | 0.5434 | 0.7729 | 9.5433 | 3.0613 | 10.3490 | 3.1851 | 0.7654 | 17.9536 | 9.6422 | 21.9855 | 10.3974 |
| 15000 | 0.5380 | 0.7629 | 9.0914 | 2.9523 | 9.9243 | 3.0779 | 0.7651 | 17.6694 | 9.2759 | 21.5443 | 9.9923 |
| 20000 | 0.5364 | 0.7173 | 8.7974 | 2.8684 | 9.6412 | 2.9882 | 0.7469 | 17.0725 | 9.1391 | 20.9963 | 9.8515 |
| 25000 | 0.5311 | 0.7168 | 8.6559 | 2.7453 | 9.5160 | 2.8700 | 0.7347 | 16.8167 | 9.0130 | 20.7615 | 9.7248 |
| 30000 | 0.5282 | 0.6977 | 8.4545 | 2.7090 | 9.2711 | 2.8362 | 0.7588 | 16.5632 | 8.9523 | 20.4650 | 9.6476 |
| 35000 | 0.5232 | 0.6851 | 8.4000 | 2.7062 | 9.2547 | 2.8316 | 0.7314 | 16.2577 | 8.7608 | 20.2135 | 9.4749 |
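
The WER and CER columns report word- and character-error rates on the NST and Stortinget validation sets. As a minimal illustration of how such scores are computed, the `evaluate` library can be used as below; the reference and prediction strings are invented, not samples from those datasets.

```python
# Metric sketch with the `evaluate` library. The example strings are invented
# placeholders, not samples from the NST or Stortinget validation sets.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["det var en fin dag i oslo"]
predictions = ["det var en fin dag i olso"]  # one misspelled word

print("WER:", wer_metric.compute(references=references, predictions=predictions))
print("CER:", cer_metric.compute(references=references, predictions=predictions))
```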

### Framework versions

- Transformers 4.34.1
- Datasets 2.16.1
- Tokenizers 0.14.1