---
library_name: transformers
license: cc-by-nc-4.0
base_model: mms-meta/mms-zeroshot-300m
tags:
  - automatic-speech-recognition
  - BembaSpeech
  - mms
  - generated_from_trainer
metrics:
  - wer
model-index:
  - name: mms-zeroshot-300m-bembaspeech-model
    results: []
---

# mms-zeroshot-300m-bembaspeech-model

This model is a fine-tuned version of [mms-meta/mms-zeroshot-300m](https://huggingface.co/mms-meta/mms-zeroshot-300m) on the BEMBASPEECH - BEM dataset.
It achieves the following results on the evaluation set:

- Loss: 0.2038
- Wer: 0.4007
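
For quick testing, the checkpoint can be loaded through the Transformers `automatic-speech-recognition` pipeline. Below is a minimal sketch, assuming the model is published under the Hub id `csikasote/mms-zeroshot-300m-bembaspeech-model` (inferred from the model name, not stated in this card) and fed a 16 kHz mono recording; `audio.wav` is a placeholder path:

```python
# Minimal inference sketch. The Hub repo id is assumed from the model name;
# "audio.wav" stands in for a 16 kHz mono Bemba recording.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="csikasote/mms-zeroshot-300m-bembaspeech-model",  # assumed repo id
)

result = asr("audio.wav")  # placeholder audio path
print(result["text"])
```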

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 30.0
- mixed_precision_training: Native AMP
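
For reference, these settings map onto `transformers.TrainingArguments` roughly as sketched below. The original training script is not included in this card, so the `output_dir` and overall wiring are assumptions; the Adam betas and epsilon listed above are the library defaults and so are not set explicitly:

```python
# Rough reconstruction of the hyperparameters above as TrainingArguments.
# This is a sketch, not the actual training script used for this model.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="mms-zeroshot-300m-bembaspeech-model",  # assumed output dir
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=30.0,
    fp16=True,  # "Native AMP" mixed-precision training
)
```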

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer    |
|:-------------:|:------:|:----:|:---------------:|:------:|
| No log        | 0.1405 | 200  | 2.2803          | 1.0    |
| No log        | 0.2811 | 400  | 0.2945          | 0.5101 |
| 2.741         | 0.4216 | 600  | 0.2641          | 0.4834 |
| 2.741         | 0.5622 | 800  | 0.2611          | 0.4743 |
| 0.5962        | 0.7027 | 1000 | 0.2608          | 0.4831 |
| 0.5962        | 0.8433 | 1200 | 0.2477          | 0.4563 |
| 0.5962        | 0.9838 | 1400 | 0.2407          | 0.4567 |
| 0.536         | 1.1244 | 1600 | 0.2343          | 0.4412 |
| 0.536         | 1.2649 | 1800 | 0.2307          | 0.4423 |
| 0.5221        | 1.4055 | 2000 | 0.2252          | 0.4348 |
| 0.5221        | 1.5460 | 2200 | 0.2228          | 0.4326 |
| 0.5221        | 1.6866 | 2400 | 0.2162          | 0.4253 |
| 0.5027        | 1.8271 | 2600 | 0.2200          | 0.4188 |
| 0.5027        | 1.9677 | 2800 | 0.2131          | 0.4142 |
| 0.4818        | 2.1082 | 3000 | 0.2281          | 0.4281 |
| 0.4818        | 2.2488 | 3200 | 0.2178          | 0.4147 |
| 0.4818        | 2.3893 | 3400 | 0.2123          | 0.4155 |
| 0.4619        | 2.5299 | 3600 | 0.2142          | 0.4079 |
| 0.4619        | 2.6704 | 3800 | 0.2156          | 0.4011 |
| 0.464         | 2.8110 | 4000 | 0.2072          | 0.4009 |
| 0.464         | 2.9515 | 4200 | 0.2128          | 0.4013 |
| 0.464         | 3.0921 | 4400 | 0.2056          | 0.3982 |
| 0.4464        | 3.2326 | 4600 | 0.2038          | 0.4007 |
| 0.4464        | 3.3732 | 4800 | 0.2089          | 0.3987 |
| 0.4418        | 3.5137 | 5000 | 0.2043          | 0.4009 |
| 0.4418        | 3.6543 | 5200 | 0.2050          | 0.3964 |
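
Wer in the table is the word error rate on the validation set (lower is better). Below is a minimal sketch of how such a score is typically computed with the `evaluate` library; the strings are illustrative placeholders, not samples from BembaSpeech:

```python
# WER sketch using the `evaluate` library (standard for such model cards).
# The example strings are placeholders, not actual dataset transcripts.
import evaluate

wer_metric = evaluate.load("wer")
score = wer_metric.compute(
    predictions=["this is a prediction"],
    references=["this is the reference"],
)
print(score)  # 0.5: two of the four reference words are substituted
```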

### Framework versions

- Transformers 4.46.0.dev0
- Pytorch 2.4.1+cu121
- Datasets 3.0.1
- Tokenizers 0.20.0