# wav2vec2-xls-r-300m-scandinavian-E4-100h-30-epochs-20250201_v1.2

This model is a fine-tuned version of [facebook/wav2vec2-xls-r-300m](https://huggingface.co/facebook/wav2vec2-xls-r-300m) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8034
- Wer: 87.9662
- Cer: 25.6683
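Both metrics are normalized edit distances reported as percentages: WER over words, CER over characters. A minimal pure-Python sketch of the computation (the helper names here are illustrative, not the evaluation code actually used for this model):

```python
def edit_distance(ref, hyp):
    # Standard Levenshtein distance via dynamic programming,
    # keeping only the previous row to stay O(min-memory).
    m, n = len(ref), len(hyp)
    prev = list(range(n + 1))
    for i in range(1, m + 1):
        cur = [i] + [0] * n
        for j in range(1, n + 1):
            cur[j] = min(
                prev[j] + 1,                                  # deletion
                cur[j - 1] + 1,                               # insertion
                prev[j - 1] + (ref[i - 1] != hyp[j - 1]),     # substitution
            )
        prev = cur
    return prev[n]

def wer(ref, hyp):
    # Word error rate, in percent, over whitespace-split tokens.
    words = ref.split()
    return 100.0 * edit_distance(words, hyp.split()) / len(words)

def cer(ref, hyp):
    # Character error rate, in percent, over individual characters.
    return 100.0 * edit_distance(list(ref), list(hyp)) / len(ref)
```

For example, one substituted word out of three gives a WER of about 33.3%.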
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_steps: 5000
- num_epochs: 30
- mixed_precision_training: Native AMP
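With 16 samples per device and 2 gradient-accumulation steps, the effective batch size is 16 × 2 = 32. The cosine schedule with 5,000 warmup steps can be sketched as below; `total_steps` is an assumption inferred from the results table (the last logged step, 38,000, falls at epoch 29.6, so 30 epochs is roughly 38,500 steps):

```python
import math

def lr_at_step(step, base_lr=5e-5, warmup_steps=5000, total_steps=38520):
    # Linear warmup to base_lr, then cosine decay to zero
    # (the shape of the Hugging Face "cosine" scheduler).
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))
```

The learning rate peaks at 5e-05 at step 5,000 and decays back toward zero by the end of training.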
### Training results
Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
:---:|:---:|:---:|:---:|:---:|:---:|
3.5002 | 0.7788 | 1000 | 3.3325 | 99.9561 | 99.1218 |
2.9685 | 1.5576 | 2000 | 2.9292 | 99.9551 | 99.1218 |
0.8981 | 2.3364 | 3000 | 0.6540 | 51.2661 | 14.4010 |
0.7203 | 3.1153 | 4000 | 0.3972 | 35.4549 | 9.8695 |
0.6642 | 3.8941 | 5000 | 0.3259 | 29.5895 | 8.1753 |
0.6634 | 4.6729 | 6000 | 0.2929 | 26.4860 | 7.3066 |
0.5613 | 5.4517 | 7000 | 0.4702 | 31.8866 | 8.7770 |
1.3205 | 6.2305 | 8000 | 0.9818 | 76.5633 | 29.0254 |
1.304 | 7.0093 | 9000 | 0.9924 | 99.5143 | 42.4246 |
1.1542 | 7.7882 | 10000 | 0.7604 | 56.7837 | 14.0385 |
0.9404 | 8.5670 | 11000 | 0.7283 | 49.1434 | 13.0316 |
0.9964 | 9.3458 | 12000 | 0.7189 | 46.0305 | 11.7982 |
1.2419 | 10.1246 | 13000 | 0.7556 | 65.1854 | 17.5888 |
1.2285 | 10.9034 | 14000 | 0.8025 | 89.1539 | 26.5302 |
1.0388 | 11.6822 | 15000 | 0.8034 | 87.9672 | 25.6645 |
1.0672 | 12.4611 | 16000 | 0.8034 | 87.9662 | 25.6671 |
1.2088 | 13.2399 | 17000 | 0.8034 | 87.9651 | 25.6707 |
1.0838 | 14.0187 | 18000 | 0.8034 | 87.9672 | 25.6647 |
1.0981 | 14.7975 | 19000 | 0.8034 | 87.9672 | 25.6644 |
1.0798 | 15.5763 | 20000 | 0.8034 | 87.9662 | 25.6653 |
1.2961 | 16.3551 | 21000 | 0.8034 | 87.9703 | 25.6658 |
1.155 | 17.1340 | 22000 | 0.8034 | 87.9682 | 25.6669 |
1.151 | 17.9128 | 23000 | 0.8034 | 87.9682 | 25.6645 |
1.0612 | 18.6916 | 24000 | 0.8034 | 87.9630 | 25.6674 |
1.2208 | 19.4704 | 25000 | 0.8034 | 87.9630 | 25.6665 |
1.1297 | 20.2492 | 26000 | 0.8034 | 87.9724 | 25.6642 |
1.0827 | 21.0280 | 27000 | 0.8034 | 87.9620 | 25.6647 |
1.0807 | 21.8069 | 28000 | 0.8034 | 87.9651 | 25.6649 |
1.2411 | 22.5857 | 29000 | 0.8034 | 87.9703 | 25.6673 |
1.2018 | 23.3645 | 30000 | 0.8034 | 87.9672 | 25.6656 |
1.0622 | 24.1433 | 31000 | 0.8034 | 87.9557 | 25.6655 |
1.1159 | 24.9221 | 32000 | 0.8034 | 87.9724 | 25.6676 |
1.2522 | 25.7009 | 33000 | 0.8034 | 87.9703 | 25.6664 |
1.1681 | 26.4798 | 34000 | 0.8034 | 87.9693 | 25.6674 |
1.1144 | 27.2586 | 35000 | 0.8034 | 87.9641 | 25.6669 |
1.1385 | 28.0374 | 36000 | 0.8034 | 87.9641 | 25.6665 |
1.147 | 28.8162 | 37000 | 0.8034 | 87.9651 | 25.6653 |
1.2391 | 29.5950 | 38000 | 0.8034 | 87.9662 | 25.6683 |
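The log also lets us roughly estimate the size of the (unnamed) training set, assuming the effective batch size of 32 from the hyperparameters above:

```python
# Back-of-envelope estimate of the training-set size from the first
# logged row of the results table: 1000 optimizer steps = 0.7788 epochs.
steps = 1000
epochs_at_steps = 0.7788
total_train_batch_size = 32      # 16 per device x 2 accumulation steps

samples_seen = steps * total_train_batch_size
approx_dataset_size = samples_seen / epochs_at_steps
print(round(approx_dataset_size))  # ~41,000 training examples
```

Note also that the validation loss is frozen at 0.8034 from step 15,000 onward, so the reported final metrics reflect a plateau reached less than halfway through the 30 epochs.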
### Framework versions
- Transformers 4.48.2
- Pytorch 2.5.1+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0