
wav2vec2-base-960h-EMOPIA-10sec-full-100epoc

This model is a fine-tuned version of facebook/wav2vec2-base-960h. The auto-generated card lists the training data as unknown, though the model name suggests 10-second clips from the EMOPIA dataset of emotion-annotated piano music. It achieves the following results on the evaluation set:

  • Loss: 1.5358
  • Accuracy: 0.8737

Model description

More information needed

Intended uses & limitations

More information needed
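The card does not yet document usage, but the model name suggests the checkpoint expects fixed 10-second clips at wav2vec2's 16 kHz mono sample rate. A minimal, hypothetical preprocessing sketch under that assumption (the helper name and clip-length policy are illustrative, not confirmed by the card):

```python
import numpy as np

SAMPLE_RATE = 16000  # wav2vec2-base-960h expects 16 kHz mono audio
CLIP_SECONDS = 10    # assumption: the "10sec" in the model name means 10-second clips

def to_fixed_length(waveform: np.ndarray,
                    sample_rate: int = SAMPLE_RATE,
                    seconds: int = CLIP_SECONDS) -> np.ndarray:
    """Zero-pad or truncate a mono waveform to exactly `seconds` seconds."""
    target = sample_rate * seconds
    if len(waveform) >= target:
        return waveform[:target]
    return np.pad(waveform, (0, target - len(waveform)))

# Example: a 7-second clip is zero-padded up to 160,000 samples.
clip = np.random.randn(7 * SAMPLE_RATE).astype(np.float32)
fixed = to_fixed_length(clip)
print(fixed.shape)  # (160000,)
```

The resulting array could then be run through transformers' `Wav2Vec2FeatureExtractor` and a `Wav2Vec2ForSequenceClassification` checkpoint, though the card does not confirm which classification head or label set was used.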

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
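With train_batch_size 1 and no gradient accumulation listed, each optimizer step sees one example, so the step counts in the results table imply a training set of 2,248 examples. A small sketch of that arithmetic and of the linear schedule (assuming zero warmup steps, which the card does not state):

```python
# With train_batch_size=1 and no gradient accumulation, steps per epoch
# equals the training-set size.
steps_per_epoch = 2248          # from the "Step" column after epoch 1
num_epochs = 100
total_steps = steps_per_epoch * num_epochs
print(total_steps)  # 224800, matching the final step in the results table

# A linear scheduler decays the learning rate from 1e-05 toward 0 over
# total_steps (warmup assumed to be 0 here; the card does not specify it).
def lr_at(step: int, base_lr: float = 1e-05, total: int = total_steps) -> float:
    return base_lr * max(0.0, 1.0 - step / total)

print(lr_at(0))       # 1e-05
print(lr_at(112400))  # halfway through training: 5e-06
```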

Training results

| Training Loss | Epoch | Step   | Validation Loss | Accuracy |
|---------------|-------|--------|-----------------|----------|
| 1.2276        | 1.0   | 2248   | 1.3142          | 0.5730   |
| 1.4493        | 2.0   | 4496   | 1.5339          | 0.5907   |
| 1.4328        | 3.0   | 6744   | 1.5229          | 0.6584   |
| 1.3839        | 4.0   | 8992   | 1.3336          | 0.6922   |
| 1.4591        | 5.0   | 11240  | 1.4407          | 0.6762   |
| 1.3609        | 6.0   | 13488  | 1.1341          | 0.7117   |
| 1.222         | 7.0   | 15736  | 1.5204          | 0.7028   |
| 1.1713        | 8.0   | 17984  | 1.3222          | 0.7473   |
| 1.1414        | 9.0   | 20232  | 1.3233          | 0.7562   |
| 1.0977        | 10.0  | 22480  | 1.3587          | 0.7384   |
| 0.9768        | 11.0  | 24728  | 1.3482          | 0.7633   |
| 0.9219        | 12.0  | 26976  | 1.3743          | 0.7580   |
| 0.8636        | 13.0  | 29224  | 1.1195          | 0.8167   |
| 0.8543        | 14.0  | 31472  | 1.1716          | 0.8043   |
| 0.8053        | 15.0  | 33720  | 1.2033          | 0.8114   |
| 0.7718        | 16.0  | 35968  | 1.2491          | 0.7989   |
| 0.6904        | 17.0  | 38216  | 1.0851          | 0.8363   |
| 0.6545        | 18.0  | 40464  | 1.2963          | 0.8007   |
| 0.6858        | 19.0  | 42712  | 1.3231          | 0.8078   |
| 0.6444        | 20.0  | 44960  | 1.1918          | 0.8238   |
| 0.6166        | 21.0  | 47208  | 1.1358          | 0.8345   |
| 0.5437        | 22.0  | 49456  | 1.2446          | 0.8256   |
| 0.4719        | 23.0  | 51704  | 1.4120          | 0.8149   |
| 0.4802        | 24.0  | 53952  | 1.2611          | 0.8203   |
| 0.484         | 25.0  | 56200  | 1.2840          | 0.8363   |
| 0.3649        | 26.0  | 58448  | 1.2421          | 0.8434   |
| 0.4146        | 27.0  | 60696  | 1.3465          | 0.8292   |
| 0.3998        | 28.0  | 62944  | 1.2309          | 0.8505   |
| 0.4113        | 29.0  | 65192  | 1.1663          | 0.8523   |
| 0.3385        | 30.0  | 67440  | 1.2567          | 0.8470   |
| 0.3188        | 31.0  | 69688  | 1.2581          | 0.8434   |
| 0.3203        | 32.0  | 71936  | 1.2454          | 0.8541   |
| 0.2766        | 33.0  | 74184  | 1.2542          | 0.8523   |
| 0.2505        | 34.0  | 76432  | 1.5897          | 0.8149   |
| 0.2777        | 35.0  | 78680  | 1.3483          | 0.8363   |
| 0.2816        | 36.0  | 80928  | 1.2510          | 0.8523   |
| 0.2728        | 37.0  | 83176  | 1.4422          | 0.8327   |
| 0.255         | 38.0  | 85424  | 1.2928          | 0.8488   |
| 0.2172        | 39.0  | 87672  | 1.4022          | 0.8452   |
| 0.2204        | 40.0  | 89920  | 1.4114          | 0.8381   |
| 0.2232        | 41.0  | 92168  | 1.4324          | 0.8416   |
| 0.2301        | 42.0  | 94416  | 1.3528          | 0.8488   |
| 0.1751        | 43.0  | 96664  | 1.4649          | 0.8434   |
| 0.1982        | 44.0  | 98912  | 1.2216          | 0.8754   |
| 0.1803        | 45.0  | 101160 | 1.4569          | 0.8452   |
| 0.1582        | 46.0  | 103408 | 1.3650          | 0.8665   |
| 0.1837        | 47.0  | 105656 | 1.2877          | 0.8541   |
| 0.1458        | 48.0  | 107904 | 1.7389          | 0.8310   |
| 0.1664        | 49.0  | 110152 | 1.4001          | 0.8541   |
| 0.1473        | 50.0  | 112400 | 1.2979          | 0.8701   |
| 0.1341        | 51.0  | 114648 | 1.5705          | 0.8470   |
| 0.1603        | 52.0  | 116896 | 1.6043          | 0.8381   |
| 0.1133        | 53.0  | 119144 | 1.6194          | 0.8452   |
| 0.107         | 54.0  | 121392 | 1.4173          | 0.8630   |
| 0.116         | 55.0  | 123640 | 1.5268          | 0.8541   |
| 0.0988        | 56.0  | 125888 | 1.6092          | 0.8523   |
| 0.139         | 57.0  | 128136 | 1.4312          | 0.8648   |
| 0.0798        | 58.0  | 130384 | 1.7888          | 0.8327   |
| 0.0776        | 59.0  | 132632 | 1.5457          | 0.8665   |
| 0.1288        | 60.0  | 134880 | 1.4554          | 0.8630   |
| 0.0828        | 61.0  | 137128 | 1.7078          | 0.8559   |
| 0.0823        | 62.0  | 139376 | 1.4734          | 0.8754   |
| 0.0803        | 63.0  | 141624 | 1.6007          | 0.8594   |
| 0.0947        | 64.0  | 143872 | 1.4467          | 0.8701   |
| 0.0916        | 65.0  | 146120 | 1.4410          | 0.8737   |
| 0.0814        | 66.0  | 148368 | 1.7116          | 0.8470   |
| 0.0938        | 67.0  | 150616 | 1.5838          | 0.8630   |
| 0.066         | 68.0  | 152864 | 1.6458          | 0.8559   |
| 0.096         | 69.0  | 155112 | 1.6926          | 0.8559   |
| 0.0638        | 70.0  | 157360 | 1.5233          | 0.8630   |
| 0.063         | 71.0  | 159608 | 1.5641          | 0.8594   |
| 0.0758        | 72.0  | 161856 | 1.6767          | 0.8505   |
| 0.0579        | 73.0  | 164104 | 1.5338          | 0.8630   |
| 0.0379        | 74.0  | 166352 | 1.6348          | 0.8630   |
| 0.0351        | 75.0  | 168600 | 1.7037          | 0.8559   |
| 0.0472        | 76.0  | 170848 | 1.5682          | 0.8754   |
| 0.0253        | 77.0  | 173096 | 1.7067          | 0.8559   |
| 0.073         | 78.0  | 175344 | 1.4460          | 0.8754   |
| 0.049         | 79.0  | 177592 | 1.5897          | 0.8594   |
| 0.0503        | 80.0  | 179840 | 1.6017          | 0.8648   |
| 0.0497        | 81.0  | 182088 | 1.5319          | 0.8683   |
| 0.0553        | 82.0  | 184336 | 1.5479          | 0.8612   |
| 0.0416        | 83.0  | 186584 | 1.5556          | 0.8577   |
| 0.0641        | 84.0  | 188832 | 1.5675          | 0.8594   |
| 0.0425        | 85.0  | 191080 | 1.6854          | 0.8559   |
| 0.0311        | 86.0  | 193328 | 1.4628          | 0.8737   |
| 0.0456        | 87.0  | 195576 | 1.5069          | 0.8701   |
| 0.0224        | 88.0  | 197824 | 1.6130          | 0.8665   |
| 0.0345        | 89.0  | 200072 | 1.5750          | 0.8701   |
| 0.041         | 90.0  | 202320 | 1.5230          | 0.8719   |
| 0.0165        | 91.0  | 204568 | 1.6564          | 0.8594   |
| 0.0478        | 92.0  | 206816 | 1.5940          | 0.8630   |
| 0.032         | 93.0  | 209064 | 1.4741          | 0.8808   |
| 0.0433        | 94.0  | 211312 | 1.5333          | 0.8719   |
| 0.0243        | 95.0  | 213560 | 1.5165          | 0.8719   |
| 0.0165        | 96.0  | 215808 | 1.5775          | 0.8683   |
| 0.0177        | 97.0  | 218056 | 1.5302          | 0.8772   |
| 0.0253        | 98.0  | 220304 | 1.5424          | 0.8754   |
| 0.0224        | 99.0  | 222552 | 1.5462          | 0.8719   |
| 0.0213        | 100.0 | 224800 | 1.5358          | 0.8737   |

Framework versions

  • Transformers 4.45.2
  • Pytorch 2.4.1+cu118
  • Datasets 3.0.1
  • Tokenizers 0.20.0