---
license: apache-2.0
base_model: facebook/hubert-large-ls960-ft
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: hubert-large-ls960-ft-V2-5
  results: []
---

# hubert-large-ls960-ft-V2-5

This model is a fine-tuned version of [facebook/hubert-large-ls960-ft](https://huggingface.co./facebook/hubert-large-ls960-ft) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5760
- Wer: 0.1085
- Per: 0.0892

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 20

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer    | Per    |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|
| 19.707        | 1.0   | 82   | 3.5254          | 1.0    | 1.0    |
| 3.4906        | 2.0   | 164  | 3.2483          | 1.0    | 1.0    |
| 3.233         | 3.0   | 246  | 3.1368          | 1.0    | 1.0    |
| 3.0468        | 4.0   | 328  | 2.9600          | 1.0    | 1.0    |
| 2.6751        | 5.0   | 410  | 2.3348          | 1.0    | 1.0    |
| 2.0881        | 6.0   | 492  | 1.7351          | 0.8568 | 0.8726 |
| 1.4875        | 7.0   | 574  | 1.2264          | 0.6059 | 0.6134 |
| 1.0922        | 8.0   | 656  | 0.9666          | 0.4068 | 0.3972 |
| 0.8148        | 9.0   | 738  | 0.7746          | 0.3249 | 0.3138 |
| 0.6332        | 10.0  | 820  | 0.6755          | 0.2477 | 0.2313 |
| 0.4797        | 11.0  | 902  | 0.6262          | 0.1612 | 0.1410 |
| 0.3807        | 12.0  | 984  | 0.5765          | 0.1384 | 0.1172 |
| 0.3195        | 13.0  | 1066 | 0.5666          | 0.1191 | 0.0992 |
| 0.2526        | 14.0  | 1148 | 0.5759          | 0.1165 | 0.0970 |
| 0.2417        | 15.0  | 1230 | 0.5460          | 0.1138 | 0.0946 |
| 0.2072        | 16.0  | 1312 | 0.5551          | 0.1095 | 0.0912 |
| 0.1881        | 17.0  | 1394 | 0.5745          | 0.1102 | 0.0917 |
| 0.1888        | 18.0  | 1476 | 0.5731          | 0.1094 | 0.0907 |
| 0.202         | 19.0  | 1558 | 0.5774          | 0.1081 | 0.0893 |
| 0.1813        | 20.0  | 1640 | 0.5760          | 0.1085 | 0.0892 |

### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0
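
### Training configuration sketch

The hyperparameters listed above map onto the `transformers` `TrainingArguments` roughly as in the minimal sketch below. The values are copied from the hyperparameter list; `output_dir` and `evaluation_strategy` are assumptions (per-epoch evaluation is inferred from the results table) and were not taken from the original training script.

```python
from transformers import TrainingArguments

# Sketch only: values mirror the "Training hyperparameters" section above.
training_args = TrainingArguments(
    output_dir="hubert-large-ls960-ft-V2-5",  # assumed output directory (model name)
    learning_rate=1e-4,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=20,
    evaluation_strategy="epoch",  # assumed: the results table shows one evaluation per epoch
)
```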