---
license: apache-2.0
base_model: facebook/wav2vec2-base
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: wav2vec2-base-finetuned-ravdess
  results: []
---

# wav2vec2-base-finetuned-ravdess

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co./facebook/wav2vec2-base) on an unknown dataset (presumably the RAVDESS speech emotion corpus, as the model name suggests).
It achieves the following results on the evaluation set:
- Loss: 0.8783
- Accuracy: 0.7535

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 9    | 2.0739          | 0.1562   |
| 2.0781        | 2.0   | 18   | 2.0611          | 0.1181   |
| 2.0668        | 3.0   | 27   | 2.0308          | 0.2535   |
| 2.0429        | 4.0   | 36   | 1.9606          | 0.2604   |
| 1.974         | 5.0   | 45   | 1.8449          | 0.2847   |
| 1.8594        | 6.0   | 54   | 1.7678          | 0.2917   |
| 1.7675        | 7.0   | 63   | 1.7700          | 0.2708   |
| 1.6932        | 8.0   | 72   | 1.6049          | 0.3889   |
| 1.5656        | 9.0   | 81   | 1.5510          | 0.4444   |
| 1.4658        | 10.0  | 90   | 1.4535          | 0.4583   |
| 1.4658        | 11.0  | 99   | 1.4101          | 0.4514   |
| 1.3843        | 12.0  | 108  | 1.3687          | 0.5      |
| 1.3085        | 13.0  | 117  | 1.3333          | 0.5035   |
| 1.2264        | 14.0  | 126  | 1.3208          | 0.5208   |
| 1.1349        | 15.0  | 135  | 1.3048          | 0.5312   |
| 1.0861        | 16.0  | 144  | 1.2428          | 0.5799   |
| 0.9836        | 17.0  | 153  | 1.1886          | 0.5799   |
| 0.9273        | 18.0  | 162  | 1.1574          | 0.6146   |
| 0.8686        | 19.0  | 171  | 1.1356          | 0.6111   |
| 0.814         | 20.0  | 180  | 1.1261          | 0.6285   |
| 0.814         | 21.0  | 189  | 1.0796          | 0.6007   |
| 0.7279        | 22.0  | 198  | 1.0277          | 0.6493   |
| 0.6845        | 23.0  | 207  | 1.0408          | 0.6840   |
| 0.6283        | 24.0  | 216  | 0.9708          | 0.7153   |
| 0.5835        | 25.0  | 225  | 0.9926          | 0.6875   |
| 0.5445        | 26.0  | 234  | 1.0126          | 0.6840   |
| 0.497         | 27.0  | 243  | 0.9502          | 0.6979   |
| 0.4508        | 28.0  | 252  | 0.9432          | 0.7118   |
| 0.4331        | 29.0  | 261  | 0.9246          | 0.7014   |
| 0.4023        | 30.0  | 270  | 0.9649          | 0.6875   |
| 0.4023        | 31.0  | 279  | 0.9114          | 0.7049   |
| 0.3924        | 32.0  | 288  | 0.9460          | 0.7118   |
| 0.3797        | 33.0  | 297  | 0.9605          | 0.7118   |
| 0.3494        | 34.0  | 306  | 0.8505          | 0.7396   |
| 0.3195        | 35.0  | 315  | 0.8830          | 0.7188   |
| 0.3148        | 36.0  | 324  | 0.9352          | 0.7014   |
| 0.2856        | 37.0  | 333  | 0.8551          | 0.7292   |
| 0.2831        | 38.0  | 342  | 0.8505          | 0.7326   |
| 0.2718        | 39.0  | 351  | 0.8800          | 0.7396   |
| 0.2624        | 40.0  | 360  | 0.8991          | 0.7153   |
| 0.2624        | 41.0  | 369  | 0.8724          | 0.7465   |
| 0.2612        | 42.0  | 378  | 0.9138          | 0.7049   |
| 0.2511        | 43.0  | 387  | 0.8914          | 0.7257   |
| 0.2324        | 44.0  | 396  | 0.8783          | 0.7535   |
| 0.2228        | 45.0  | 405  | 0.9215          | 0.7188   |
| 0.2244        | 46.0  | 414  | 0.8904          | 0.7431   |
| 0.2192        | 47.0  | 423  | 0.9142          | 0.7326   |
| 0.217         | 48.0  | 432  | 0.8891          | 0.7361   |
| 0.2146        | 49.0  | 441  | 0.9009          | 0.7326   |
| 0.215         | 50.0  | 450  | 0.8994          | 0.7361   |

### Framework versions

- Transformers 4.31.0
- PyTorch 2.0.1+cu118
- Datasets 2.14.4
- Tokenizers 0.13.3
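
## How to use

The card does not ship a usage snippet, so the following is a minimal inference sketch rather than an official example. It assumes the checkpoint carries an audio-classification head (as the accuracy metric implies), that `wav2vec2-base-finetuned-ravdess` is a placeholder for the actual Hub repo id or local path, and that `speech.wav` stands in for a 16 kHz-compatible recording.

```python
# Minimal inference sketch (not part of the original card).
# "wav2vec2-base-finetuned-ravdess" and "speech.wav" are placeholders.
import torch
import librosa
from transformers import AutoFeatureExtractor, AutoModelForAudioClassification

model_id = "wav2vec2-base-finetuned-ravdess"  # hypothetical path / repo id
feature_extractor = AutoFeatureExtractor.from_pretrained(model_id)
model = AutoModelForAudioClassification.from_pretrained(model_id)
model.eval()

# Load the clip as mono and resample to the rate wav2vec2-base expects (16 kHz).
speech, _ = librosa.load("speech.wav", sr=feature_extractor.sampling_rate, mono=True)

inputs = feature_extractor(
    speech,
    sampling_rate=feature_extractor.sampling_rate,
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = int(logits.argmax(dim=-1))
print(model.config.id2label[predicted_id])
```

The label names returned by `id2label` depend on how the classification head was configured during fine-tuning and are not documented in this card.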
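
## Reproducing the training configuration

As a convenience, here is a sketch of the `TrainingArguments` implied by the hyperparameters listed above. It is an assumption-laden reconstruction, not the original training script: dataset loading, preprocessing, and model setup are omitted, the output directory name is illustrative, and the per-epoch evaluation/logging strategy is inferred from the per-epoch rows in the results table.

```python
# Sketch of the implied Trainer configuration; names below are illustrative.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-base-finetuned-ravdess",  # illustrative
    learning_rate=3e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,   # 32 x 4 = 128 total train batch size
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    evaluation_strategy="epoch",     # assumed, matching the per-epoch table above
    logging_strategy="epoch",        # assumed
    metric_for_best_model="accuracy",
)
```

The Adam betas and epsilon listed above are the optimizer defaults, so they do not need to be set explicitly.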