ai-light-dance_drums_ft_pretrain_wav2vec2-base-new_onset-rbma13-2_7k

This model is a fine-tuned version of gary109/ai-light-dance_drums_pretrain_wav2vec2-base-new-7k on the GARY109/AI_LIGHT_DANCE - ONSET-RBMA13-2 dataset. It achieves the following results on the evaluation set (a usage sketch follows the results):

  • Loss: 2.3330
  • Wer: 1.0
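
Note that Wer: 1.0 is a 100% word error rate on the evaluation set, so decoded outputs should be treated as experimental. Below is a minimal inference sketch, not taken from the original run: it assumes the checkpoint repository ships a usable processor config, and the file name "drums.wav" and greedy decoding are illustrative choices.

```python
# Minimal inference sketch (assumptions: processor config is bundled with the
# checkpoint; "drums.wav" is a placeholder audio file).
import librosa
import torch
from transformers import AutoModelForCTC, AutoProcessor

model_id = "gary109/ai-light-dance_drums_ft_pretrain_wav2vec2-base-new_onset-rbma13-2_7k"
processor = AutoProcessor.from_pretrained(model_id)
model = AutoModelForCTC.from_pretrained(model_id)
model.eval()

# wav2vec2-base checkpoints expect 16 kHz mono input
audio, _ = librosa.load("drums.wav", sr=16_000, mono=True)

inputs = processor(audio, sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: per-frame argmax, then collapse repeats and blanks
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids))
```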

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the TrainingArguments sketch after this list):

  • learning_rate: 0.0003
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 30
  • num_epochs: 100.0
  • mixed_precision_training: Native AMP
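
The list above maps onto transformers.TrainingArguments roughly as sketched below. This is a hedged reconstruction, not the original run script: output_dir is an assumption, and the model, dataset, and data-collator wiring for the Trainer are omitted.

```python
# Hedged reconstruction of the hyperparameters above; output_dir is assumed.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ai-light-dance_drums_ft_pretrain_wav2vec2-base-new_onset-rbma13-2_7k",
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 4 * 4 = 16
    lr_scheduler_type="linear",
    warmup_steps=30,
    num_train_epochs=100.0,
    fp16=True,  # "Native AMP" mixed-precision training
)
# Adam betas (0.9, 0.999) and epsilon 1e-08 match the Trainer defaults.
```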

Training results

Training Loss | Epoch | Step | Validation Loss | Wer
No log | 1.0 | 1 | 68.1358 | 1.0
No log | 2.0 | 2 | 68.1358 | 1.0
No log | 3.0 | 3 | 68.1358 | 1.0
No log | 4.0 | 4 | 68.0245 | 1.0
No log | 5.0 | 5 | 67.7874 | 1.0
No log | 6.0 | 6 | 67.4535 | 1.0
No log | 7.0 | 7 | 67.0142 | 1.0
No log | 8.0 | 8 | 67.0142 | 1.0
No log | 9.0 | 9 | 66.4335 | 1.0
38.4011 | 10.0 | 10 | 65.7100 | 1.0
38.4011 | 11.0 | 11 | 64.8206 | 1.0
38.4011 | 12.0 | 12 | 63.8239 | 1.0
38.4011 | 13.0 | 13 | 62.6489 | 1.0
38.4011 | 14.0 | 14 | 61.3071 | 1.0
38.4011 | 15.0 | 15 | 59.7427 | 1.0
38.4011 | 16.0 | 16 | 58.0256 | 0.98
38.4011 | 17.0 | 17 | 56.0327 | 1.0
38.4011 | 18.0 | 18 | 53.7724 | 1.0
38.4011 | 19.0 | 19 | 51.2556 | 1.0
33.2554 | 20.0 | 20 | 48.4956 | 1.0
33.2554 | 21.0 | 21 | 45.4038 | 1.0
33.2554 | 22.0 | 22 | 41.9980 | 1.0
33.2554 | 23.0 | 23 | 41.9980 | 1.0
33.2554 | 24.0 | 24 | 38.2281 | 1.0
33.2554 | 25.0 | 25 | 34.1577 | 1.0
33.2554 | 26.0 | 26 | 29.7985 | 1.0
33.2554 | 27.0 | 27 | 25.1146 | 1.0
33.2554 | 28.0 | 28 | 20.2287 | 1.0
33.2554 | 29.0 | 29 | 15.3406 | 1.0
15.1206 | 30.0 | 30 | 10.7693 | 1.0
15.1206 | 31.0 | 31 | 6.8998 | 1.0
15.1206 | 32.0 | 32 | 4.5907 | 1.0
15.1206 | 33.0 | 33 | 3.3596 | 1.0
15.1206 | 34.0 | 34 | 2.7711 | 1.0
15.1206 | 35.0 | 35 | 2.5962 | 1.0
15.1206 | 36.0 | 36 | 2.9002 | 1.0
15.1206 | 37.0 | 37 | 3.0061 | 1.0
15.1206 | 38.0 | 38 | 2.8175 | 1.0
15.1206 | 39.0 | 39 | 2.4512 | 1.0
2.4298 | 40.0 | 40 | 2.3330 | 1.0
2.4298 | 41.0 | 41 | 2.3766 | 1.0
2.4298 | 42.0 | 42 | 2.5626 | 1.0
2.4298 | 43.0 | 43 | 2.9632 | 1.0
2.4298 | 44.0 | 44 | 3.2796 | 1.0
2.4298 | 45.0 | 45 | 3.4015 | 1.0
2.4298 | 46.0 | 46 | 3.2808 | 1.0
2.4298 | 47.0 | 47 | 3.2373 | 1.0
2.4298 | 48.0 | 48 | 3.2462 | 1.0
2.4298 | 49.0 | 49 | 3.6168 | 1.0
1.6143 | 50.0 | 50 | 3.6625 | 1.0
1.6143 | 51.0 | 51 | 3.7593 | 1.0
1.6143 | 52.0 | 52 | 3.9327 | 1.0
1.6143 | 53.0 | 53 | 3.7185 | 1.0
1.6143 | 54.0 | 54 | 3.9100 | 1.0
1.6143 | 55.0 | 55 | 4.3123 | 1.0
1.6143 | 56.0 | 56 | 4.2904 | 1.0
1.6143 | 57.0 | 57 | 3.9519 | 1.0
1.6143 | 58.0 | 58 | 3.4518 | 1.0
1.6143 | 59.0 | 59 | 3.0197 | 1.0
1.4054 | 60.0 | 60 | 2.8863 | 1.0
1.4054 | 61.0 | 61 | 2.9754 | 1.0
1.4054 | 62.0 | 62 | 3.2998 | 1.0
1.4054 | 63.0 | 63 | 3.8715 | 1.0
1.4054 | 64.0 | 64 | 4.1898 | 1.0
1.4054 | 65.0 | 65 | 4.1813 | 1.0
1.4054 | 66.0 | 66 | 3.9025 | 1.0
1.4054 | 67.0 | 67 | 3.4319 | 1.0
1.4054 | 68.0 | 68 | 3.2755 | 1.0
1.4054 | 69.0 | 69 | 3.3349 | 1.0
1.3121 | 70.0 | 70 | 3.5485 | 1.0
1.3121 | 71.0 | 71 | 3.9019 | 1.0
1.3121 | 72.0 | 72 | 4.0819 | 1.0
1.3121 | 73.0 | 73 | 3.9955 | 1.0
1.3121 | 74.0 | 74 | 3.7088 | 1.0
1.3121 | 75.0 | 75 | 3.2957 | 1.0
1.3121 | 76.0 | 76 | 3.1141 | 1.0
1.3121 | 77.0 | 77 | 3.0852 | 1.0
1.3121 | 78.0 | 78 | 3.1871 | 1.0
1.3121 | 79.0 | 79 | 3.4127 | 1.0
1.2576 | 80.0 | 80 | 3.6913 | 1.0
1.2576 | 81.0 | 81 | 3.8286 | 1.0
1.2576 | 82.0 | 82 | 3.8157 | 1.0
1.2576 | 83.0 | 83 | 3.6814 | 1.0
1.2576 | 84.0 | 84 | 3.4496 | 1.0
1.2576 | 85.0 | 85 | 3.2844 | 1.0
1.2576 | 86.0 | 86 | 3.2254 | 1.0
1.2576 | 87.0 | 87 | 3.2683 | 1.0
1.2576 | 88.0 | 88 | 3.3791 | 1.0
1.2576 | 89.0 | 89 | 3.5501 | 1.0
1.2373 | 90.0 | 90 | 3.6622 | 1.0
1.2373 | 91.0 | 91 | 3.7207 | 1.0
1.2373 | 92.0 | 92 | 3.6961 | 1.0
1.2373 | 93.0 | 93 | 3.6099 | 1.0
1.2373 | 94.0 | 94 | 3.5336 | 1.0
1.2373 | 95.0 | 95 | 3.4342 | 1.0
1.2373 | 96.0 | 96 | 3.3170 | 1.0
1.2373 | 97.0 | 97 | 3.2624 | 1.0
1.2373 | 98.0 | 98 | 3.2437 | 1.0
1.2373 | 99.0 | 99 | 3.2591 | 1.0
1.1952 | 100.0 | 100 | 3.2927 | 1.0
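
For context, the Wer column is the standard word error rate, typically computed with the `evaluate` library in Hugging Face CTC fine-tuning scripts (an assumption about this particular run). A minimal sketch with invented strings:

```python
# Minimal WER computation sketch; the prediction/reference strings are
# invented examples, not outputs of this model.
import evaluate

wer = evaluate.load("wer")
predictions = ["kick snare snare hat"]
references = ["kick kick snare hat"]

# WER = (substitutions + insertions + deletions) / number of reference words,
# so a value of 1.0 means the error count equals the reference length.
print(wer.compute(predictions=predictions, references=references))
```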

Framework versions

  • Transformers 4.25.0.dev0
  • Pytorch 1.8.1+cu111
  • Datasets 2.7.1.dev0
  • Tokenizers 0.13.2