---
language:
  - 'no'
license: apache-2.0
base_model: NbAiLab/nb-whisper-medium-RC1
tags:
  - audio
  - asr
  - automatic-speech-recognition
  - hf-asr-leaderboard
model-index:
  - name: nb-whisper-medium-v0.8-vad3
    results: []
---

# nb-whisper-medium-v0.8-vad3

This model is a fine-tuned version of NbAiLab/nb-whisper-medium-RC1 on the NbAiLab/ncc_speech_styling_v2_vad3 dataset.
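
This card does not include a usage example. As a minimal sketch, the snippet below runs the model through the Transformers ASR pipeline; the repository id `NbAiLab/nb-whisper-medium` and the file `audio.mp3` are placeholders for illustration, not details taken from this card.

```python
# Minimal sketch: Norwegian transcription with the Transformers ASR pipeline.
# The model id and audio path are placeholders, not taken from this card.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="NbAiLab/nb-whisper-medium",  # assumed repository id
)

# chunk_length_s lets the pipeline split audio longer than Whisper's ~30 s window.
result = asr(
    "audio.mp3",
    chunk_length_s=30,
    generate_kwargs={"task": "transcribe", "language": "no"},
)
print(result["text"])
```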

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (an illustrative mapping to Transformers training arguments is sketched after the list):

  • learning_rate: 2.5e-05
  • lr_scheduler_type: linear
  • per_device_train_batch_size: 32
  • total_train_batch_size_per_node: 128
  • total_train_batch_size: 1024
  • total_optimization_steps: 50,000
  • starting_optimization_step: None
  • finishing_optimization_step: 50,000
  • num_train_dataset_workers: 32
  • num_hosts: 8
  • total_num_training_examples: 51,200,000
  • steps_per_epoch: 7455
  • num_beams: None
  • weight_decay: 0.01
  • adam_beta1: 0.9
  • adam_beta2: 0.98
  • adam_epsilon: 1e-06
  • dropout: True
  • bpe_dropout_probability: 0.2
  • activation_dropout_probability: 0.1
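
The snippet below is only an illustrative sketch of how the subset of these settings with direct counterparts in `Seq2SeqTrainingArguments` could be written down with Transformers 4.34; it is not the configuration or script actually used to train this model, and fields such as `bpe_dropout_probability`, `num_hosts`, or `total_train_batch_size_per_node` have no direct equivalent there.

```python
# Illustrative only: a Transformers expression of the optimizer/schedule settings
# listed above. This is NOT the actual training configuration for this model.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./nb-whisper-medium-v0.8-vad3",  # placeholder output path
    learning_rate=2.5e-5,
    lr_scheduler_type="linear",
    per_device_train_batch_size=32,
    max_steps=50_000,                  # total_optimization_steps
    weight_decay=0.01,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-6,
    dataloader_num_workers=32,         # num_train_dataset_workers
)
```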

### Training results

| step | validation_nst_loss | train_loss | validation_nst_wer | validation_nst_cer | validation_nst_exact_wer | validation_nst_exact_cer | validation_clean_stortinget_no_loss | validation_clean_stortinget_no_wer | validation_clean_stortinget_no_cer | validation_clean_stortinget_no_exact_wer | validation_clean_stortinget_no_exact_cer |
|------|---------------------|------------|--------------------|--------------------|--------------------------|--------------------------|-------------------------------------|------------------------------------|------------------------------------|------------------------------------------|------------------------------------------|
| 0     | 0.4223 | 0.8343 | 2.3463 | 0.7206 | 2.9397 | 0.8105 | 0.6313 | 8.8868 | 5.7697 | 11.8752 | 6.2280 |
| 5000  | 0.4364 | 0.5289 | 2.6077 | 0.8063 | 3.2555 | 0.9057 | 0.6300 | 9.1071 | 5.8300 | 12.0840 | 6.3028 |
| 10000 | 0.4353 | 0.4901 | 2.4824 | 0.7765 | 3.0867 | 0.8709 | 0.6463 | 9.2563 | 5.9382 | 12.2144 | 6.4042 |
| 15000 | 0.4338 | 0.4760 | 2.4062 | 0.7290 | 3.0541 | 0.8324 | 0.6788 | 9.0052 | 5.7931 | 12.0816 | 6.2720 |
| 20000 | 0.4343 | 0.4553 | 2.5695 | 0.7868 | 3.1956 | 0.8819 | 0.7058 | 9.1710 | 5.9434 | 12.2168 | 6.4280 |
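
The evaluation code behind these columns is not part of the card. As a rough guide to what the `wer` and `cer` columns measure, the sketch below computes word and character error rates with the Hugging Face `evaluate` library on a toy sentence pair; the example strings are invented, and the table values are presumably percentage-scale scores on the NST and Stortinget validation sets.

```python
# Minimal sketch of computing WER/CER with the `evaluate` library.
# The reference/prediction strings are toy examples, not validation data.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

references = ["det var en fin dag i oslo"]
predictions = ["det var en fin dag i olso"]

wer = wer_metric.compute(predictions=predictions, references=references)
cer = cer_metric.compute(predictions=predictions, references=references)

# `evaluate` returns fractions; scale by 100 to compare with percentage-style scores.
print(f"WER: {100 * wer:.2f}%  CER: {100 * cer:.2f}%")
```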

### Framework versions

  • Transformers 4.34.1
  • Datasets 2.16.1
  • Tokenizers 0.14.1