# w2v2-base-pretrained_lr5e-5_at0.8_da0.7
This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 2.2042
- Wer: 0.1884
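The card does not include a usage example; the sketch below shows one plausible way to load the checkpoint for transcription, assuming it is a CTC-style ASR model (as the wav2vec2-base base model and the WER metric suggest) and that the repository ships a matching processor. Treat it as illustrative, not the author's own inference code.

```python
import torch
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

# Assumed checkpoint layout: a Wav2Vec2ForCTC model plus processor in this repo.
model_id = "MelanieKoe/w2v2-base-pretrained_lr5e-5_at0.8_da0.7"
processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

def transcribe(speech):
    # `speech` is a 1-D float array of 16 kHz mono samples
    # (e.g. loaded with torchaudio or librosa).
    inputs = processor(speech, sampling_rate=16_000, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    pred_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(pred_ids)[0]
```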
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- training_steps: 4000
- mixed_precision_training: Native AMP
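The training script itself is not part of the card; the snippet below is a hypothetical reconstruction of how the hyperparameters above map onto `transformers.TrainingArguments` (the `output_dir` value and the Trainer/data-collator wiring around it are assumptions, not the author's script).

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="w2v2-base-pretrained_lr5e-5_at0.8_da0.7",  # placeholder path
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    max_steps=4000,
    fp16=True,  # "Native AMP" mixed-precision training
)
```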
### Training results
| Training Loss | Epoch  | Step | Validation Loss | Wer    |
|:-------------:|:------:|:----:|:---------------:|:------:|
| 24.3422       | 7.81   | 250  | 5.6070          | 1.0    |
| 3.5981        | 15.62  | 500  | 3.2535          | 1.0    |
| 3.1121        | 23.44  | 750  | 3.1577          | 1.0    |
| 3.0596        | 31.25  | 1000 | 3.1214          | 1.0    |
| 3.0143        | 39.06  | 1250 | 2.9603          | 1.0    |
| 1.4861        | 46.88  | 1500 | 1.2406          | 0.4007 |
| 0.2223        | 54.69  | 1750 | 1.3926          | 0.2324 |
| 0.1147        | 62.5   | 2000 | 1.5275          | 0.2136 |
| 0.0775        | 70.31  | 2250 | 1.8277          | 0.1986 |
| 0.0601        | 78.12  | 2500 | 1.9747          | 0.1944 |
| 0.0479        | 85.94  | 2750 | 2.0632          | 0.1909 |
| 0.042         | 93.75  | 3000 | 2.1333          | 0.1991 |
| 0.0353        | 101.56 | 3250 | 2.1743          | 0.1982 |
| 0.0315        | 109.38 | 3500 | 2.1585          | 0.1939 |
| 0.0274        | 117.19 | 3750 | 2.1521          | 0.1914 |
| 0.0279        | 125.0  | 4000 | 2.2042          | 0.1884 |
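The Wer column is the word error rate: substitutions, insertions, and deletions divided by the number of reference words, so lower is better and 0.1884 means roughly one word in five differs from the reference. A quick way to reproduce the metric is the `evaluate` library (the tooling used for the original evaluation is an assumption):

```python
import evaluate

wer_metric = evaluate.load("wer")
# Toy example: one substitution in a three-word reference -> WER = 1/3.
wer = wer_metric.compute(
    predictions=["the cat sat"],
    references=["the cat slept"],
)
print(round(wer, 4))  # 0.3333
```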
### Framework versions
- Transformers 4.35.0
- PyTorch 2.0.0
- Datasets 2.14.6
- Tokenizers 0.14.1