---
license: apache-2.0
tags:
  - generated_from_trainer
base_model: facebook/wav2vec2-base-960h
metrics:
  - wer
model-index:
  - name: wtimit-base-960h-normal-reduced-learning-rate-all
    results: []
---

wtimit-base-960h-normal-reduced-learning-rate-all

This model is a fine-tuned version of facebook/wav2vec2-base-960h on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.3181
  • WER: 0.2132
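
For reference, a minimal inference sketch with Transformers is shown below; the Hub repository id, the audio file name, and the use of librosa are assumptions for illustration, and the audio should be 16 kHz mono as expected by the base wav2vec2-base-960h checkpoint.

```python
# Minimal inference sketch (the Hub repo id, file name, and use of librosa are
# illustrative assumptions). The model expects 16 kHz mono audio, like the base
# facebook/wav2vec2-base-960h checkpoint.
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

model_id = "kartikay101/wtimit-base-960h-normal-reduced-learning-rate-all"  # assumed repo id
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)

# Load and resample audio to 16 kHz
speech, _ = librosa.load("sample.wav", sr=16_000)
inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)

# CTC decoding: greedy argmax over the character vocabulary
with torch.no_grad():
    logits = model(inputs.input_values).logits
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```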

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a matching configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 1000
  • num_epochs: 20
  • mixed_precision_training: Native AMP
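
For orientation, these settings roughly correspond to a Transformers `TrainingArguments` configuration like the following sketch; the output directory and the evaluation/logging cadence are assumptions rather than a copy of the original training script.

```python
# Sketch of a TrainingArguments configuration matching the hyperparameters above
# (output_dir and the evaluation/logging cadence are assumptions, not taken from
# the original training script; Trainer's default AdamW uses betas=(0.9, 0.999)
# and epsilon=1e-08, matching the optimizer listed above).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wtimit-base-960h-normal-reduced-learning-rate-all",  # assumed
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    num_train_epochs=20,
    fp16=True,              # "Native AMP" mixed precision
    eval_strategy="steps",  # assumed cadence; evaluation every 1000 steps
    eval_steps=1000,
    logging_steps=1000,
)
```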

Training results

| Training Loss | Epoch   | Step | Validation Loss | WER    |
|:-------------:|:-------:|:----:|:---------------:|:------:|
| 0.4297        | 2.1552  | 1000 | 0.3046          | 0.2440 |
| 0.3137        | 4.3103  | 2000 | 0.2941          | 0.2240 |
| 0.2578        | 6.4655  | 3000 | 0.2982          | 0.2176 |
| 0.2153        | 8.6207  | 4000 | 0.3063          | 0.2166 |
| 0.1998        | 10.7759 | 5000 | 0.3036          | 0.2155 |
| 0.1913        | 12.9310 | 6000 | 0.3049          | 0.2122 |
| 0.1836        | 15.0862 | 7000 | 0.3160          | 0.2161 |
| 0.1755        | 17.2414 | 8000 | 0.3192          | 0.2152 |
| 0.1681        | 19.3966 | 9000 | 0.3181          | 0.2132 |
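
The WER column is the word error rate on the evaluation set. As a rough guide, it can be computed with the `evaluate` library as in the sketch below (assumed tooling; the evaluation code actually used is not part of this card).

```python
# Minimal WER computation sketch using the `evaluate` library (assumed tooling;
# requires `pip install evaluate jiwer`). The original evaluation code is not
# included in this card.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["the cat sat on the mat"]  # hypothetical model transcriptions
references = ["the cat sat on a mat"]     # hypothetical reference transcripts

wer = wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}")  # 0.2132 above means roughly 21 errors per 100 reference words
```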

Framework versions

  • Transformers 4.41.2
  • Pytorch 2.3.0+cu121
  • Datasets 2.19.2
  • Tokenizers 0.19.1