---
license: apache-2.0
base_model: facebook/wav2vec2-base
tags:
  - generated_from_trainer
datasets:
  - minds14
metrics:
  - accuracy
model-index:
  - name: intent_classify
    results:
      - task:
          name: Audio Classification
          type: audio-classification
        dataset:
          name: minds14
          type: minds14
          config: en-US
          split: train
          args: en-US
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.07058823529411765
---

# intent_classify

This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the minds14 dataset. It achieves the following results on the evaluation set:

- Loss: 7.7654
- Accuracy: 0.0706
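For context, the reported accuracy is roughly chance level: MInDS-14 has 14 intent classes, so a uniform random guesser would score about 1/14 ≈ 0.071. A quick sanity check (the 14-class figure comes from the dataset, not this card):

```python
# Compare the reported evaluation accuracy against chance level
# for a 14-class problem (MInDS-14 has 14 intent classes).
num_classes = 14
chance_accuracy = 1 / num_classes          # ~0.0714
reported_accuracy = 0.07058823529411765    # from the evaluation results above

print(f"chance:   {chance_accuracy:.4f}")
print(f"reported: {reported_accuracy:.4f}")
# The model does not beat random guessing on this split.
print(f"above chance: {reported_accuracy > chance_accuracy}")
```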

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 3e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 200
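The derived quantities implied by these hyperparameters can be checked with a short calculation (the steps-per-epoch figure is read off the training log, and the warmup-step count is an inference from the 0.1 warmup ratio, not stated in the card):

```python
# Derived training quantities implied by the hyperparameters above.
train_batch_size = 8
gradient_accumulation_steps = 4
num_epochs = 200

# Effective batch size per optimizer step.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 32, matching the reported total_train_batch_size

# The training log shows 15 optimizer steps per epoch, i.e. 3000 steps total.
steps_per_epoch = 15
total_steps = steps_per_epoch * num_epochs
print(total_steps)  # 3000

# With lr_scheduler_warmup_ratio = 0.1, the linear scheduler warms up
# over the first 10% of steps (an inferred figure, not stated in the card).
warmup_steps = int(0.1 * total_steps)
print(warmup_steps)  # 300
```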

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.6382 | 1.0 | 15 | 2.6424 | 0.1059 |
| 2.6403 | 2.0 | 30 | 2.6453 | 0.0706 |
| 2.6356 | 3.0 | 45 | 2.6431 | 0.0471 |
| 2.6339 | 4.0 | 60 | 2.6444 | 0.0471 |
| 2.6314 | 5.0 | 75 | 2.6529 | 0.0471 |
| 2.6296 | 6.0 | 90 | 2.6488 | 0.0588 |
| 2.6242 | 7.0 | 105 | 2.6569 | 0.0824 |
| 2.6215 | 8.0 | 120 | 2.6629 | 0.0353 |
| 2.6134 | 9.0 | 135 | 2.6670 | 0.0588 |
| 2.611 | 10.0 | 150 | 2.6836 | 0.0235 |
| 2.5978 | 11.0 | 165 | 2.6735 | 0.0706 |
| 2.5974 | 12.0 | 180 | 2.6879 | 0.0235 |
| 2.5974 | 13.0 | 195 | 2.6895 | 0.0471 |
| 2.5914 | 14.0 | 210 | 2.7066 | 0.0235 |
| 2.5677 | 15.0 | 225 | 2.7058 | 0.0353 |
| 2.567 | 16.0 | 240 | 2.7118 | 0.0471 |
| 2.5579 | 17.0 | 255 | 2.7248 | 0.0471 |
| 2.559 | 18.0 | 270 | 2.6907 | 0.0353 |
| 2.5335 | 19.0 | 285 | 2.7091 | 0.0706 |
| 2.5327 | 20.0 | 300 | 2.7387 | 0.0824 |
| 2.499 | 21.0 | 315 | 2.7275 | 0.0235 |
| 2.4624 | 22.0 | 330 | 2.7613 | 0.0706 |
| 2.4557 | 23.0 | 345 | 2.7961 | 0.0235 |
| 2.4194 | 24.0 | 360 | 2.8100 | 0.0235 |
| 2.4061 | 25.0 | 375 | 2.7907 | 0.0706 |
| 2.3704 | 26.0 | 390 | 2.8218 | 0.0588 |
| 2.3319 | 27.0 | 405 | 2.8153 | 0.0353 |
| 2.3214 | 28.0 | 420 | 2.8860 | 0.0824 |
| 2.3314 | 29.0 | 435 | 2.8715 | 0.0706 |
| 2.2418 | 30.0 | 450 | 2.8179 | 0.0588 |
| 2.2144 | 31.0 | 465 | 2.8949 | 0.0471 |
| 2.2343 | 32.0 | 480 | 2.8831 | 0.0706 |
| 2.1429 | 33.0 | 495 | 2.9078 | 0.0235 |
| 2.1252 | 34.0 | 510 | 2.8757 | 0.0824 |
| 2.0381 | 35.0 | 525 | 2.8561 | 0.0706 |
| 1.9718 | 36.0 | 540 | 2.9334 | 0.0235 |
| 1.9676 | 37.0 | 555 | 2.9418 | 0.0353 |
| 1.9167 | 38.0 | 570 | 3.0164 | 0.0353 |
| 1.8649 | 39.0 | 585 | 2.9759 | 0.0471 |
| 1.8494 | 40.0 | 600 | 2.9568 | 0.1059 |
| 1.768 | 41.0 | 615 | 2.9746 | 0.0588 |
| 1.6931 | 42.0 | 630 | 2.9224 | 0.0588 |
| 1.7221 | 43.0 | 645 | 2.9451 | 0.0941 |
| 1.6963 | 44.0 | 660 | 3.0814 | 0.0824 |
| 1.6515 | 45.0 | 675 | 2.9779 | 0.0941 |
| 1.5451 | 46.0 | 690 | 3.1134 | 0.1059 |
| 1.4958 | 47.0 | 705 | 2.9865 | 0.0824 |
| 1.432 | 48.0 | 720 | 3.1357 | 0.0588 |
| 1.4167 | 49.0 | 735 | 3.2270 | 0.0824 |
| 1.3876 | 50.0 | 750 | 3.2101 | 0.0353 |
| 1.3323 | 51.0 | 765 | 3.1803 | 0.0824 |
| 1.2767 | 52.0 | 780 | 3.2781 | 0.0353 |
| 1.3037 | 53.0 | 795 | 3.2814 | 0.0588 |
| 1.214 | 54.0 | 810 | 3.3239 | 0.0941 |
| 1.1582 | 55.0 | 825 | 3.2247 | 0.1176 |
| 1.1028 | 56.0 | 840 | 3.3801 | 0.0706 |
| 1.117 | 57.0 | 855 | 3.3110 | 0.0941 |
| 1.0498 | 58.0 | 870 | 3.3820 | 0.0588 |
| 0.9688 | 59.0 | 885 | 3.2971 | 0.0706 |
| 0.9991 | 60.0 | 900 | 3.4578 | 0.0706 |
| 0.915 | 61.0 | 915 | 3.5240 | 0.0706 |
| 0.9858 | 62.0 | 930 | 3.4743 | 0.0706 |
| 0.8826 | 63.0 | 945 | 3.4516 | 0.0588 |
| 0.8748 | 64.0 | 960 | 3.4834 | 0.0824 |
| 0.8671 | 65.0 | 975 | 3.4300 | 0.0706 |
| 0.8005 | 66.0 | 990 | 3.5403 | 0.0588 |
| 0.7662 | 67.0 | 1005 | 3.6394 | 0.0588 |
| 0.7789 | 68.0 | 1020 | 3.6355 | 0.0235 |
| 0.6816 | 69.0 | 1035 | 3.7145 | 0.0235 |
| 0.678 | 70.0 | 1050 | 3.7057 | 0.0353 |
| 0.6307 | 71.0 | 1065 | 3.6650 | 0.0588 |
| 0.6853 | 72.0 | 1080 | 3.7011 | 0.0353 |
| 0.5857 | 73.0 | 1095 | 3.6480 | 0.0706 |
| 0.5405 | 74.0 | 1110 | 3.7454 | 0.0588 |
| 0.6295 | 75.0 | 1125 | 3.6397 | 0.0824 |
| 0.5667 | 76.0 | 1140 | 3.6528 | 0.0588 |
| 0.5558 | 77.0 | 1155 | 3.8219 | 0.0471 |
| 0.4908 | 78.0 | 1170 | 3.9318 | 0.0353 |
| 0.4427 | 79.0 | 1185 | 3.8695 | 0.0471 |
| 0.4437 | 80.0 | 1200 | 4.0755 | 0.0353 |
| 0.3798 | 81.0 | 1215 | 4.0077 | 0.0353 |
| 0.477 | 82.0 | 1230 | 3.9117 | 0.0706 |
| 0.4199 | 83.0 | 1245 | 4.1337 | 0.0471 |
| 0.4037 | 84.0 | 1260 | 4.0306 | 0.0235 |
| 0.3283 | 85.0 | 1275 | 4.1248 | 0.0471 |
| 0.4361 | 86.0 | 1290 | 4.0707 | 0.0235 |
| 0.3949 | 87.0 | 1305 | 4.2368 | 0.0235 |
| 0.3577 | 88.0 | 1320 | 4.2299 | 0.0706 |
| 0.2885 | 89.0 | 1335 | 4.3665 | 0.0471 |
| 0.2737 | 90.0 | 1350 | 4.1773 | 0.0706 |
| 0.3 | 91.0 | 1365 | 4.5002 | 0.0471 |
| 0.2936 | 92.0 | 1380 | 4.5914 | 0.0235 |
| 0.3035 | 93.0 | 1395 | 4.3489 | 0.0471 |
| 0.2401 | 94.0 | 1410 | 4.3683 | 0.0706 |
| 0.1996 | 95.0 | 1425 | 4.4946 | 0.0588 |
| 0.232 | 96.0 | 1440 | 4.6429 | 0.0588 |
| 0.291 | 97.0 | 1455 | 4.5975 | 0.0353 |
| 0.2111 | 98.0 | 1470 | 4.5378 | 0.0353 |
| 0.1986 | 99.0 | 1485 | 4.5688 | 0.0471 |
| 0.2242 | 100.0 | 1500 | 4.5640 | 0.0118 |
| 0.1679 | 101.0 | 1515 | 4.7323 | 0.0588 |
| 0.1897 | 102.0 | 1530 | 4.6266 | 0.0235 |
| 0.2212 | 103.0 | 1545 | 4.8046 | 0.0706 |
| 0.2138 | 104.0 | 1560 | 4.6699 | 0.0588 |
| 0.1921 | 105.0 | 1575 | 4.7727 | 0.0235 |
| 0.1625 | 106.0 | 1590 | 4.8053 | 0.0471 |
| 0.1206 | 107.0 | 1605 | 5.0319 | 0.0588 |
| 0.1841 | 108.0 | 1620 | 4.9295 | 0.0353 |
| 0.1288 | 109.0 | 1635 | 4.9922 | 0.0588 |
| 0.1647 | 110.0 | 1650 | 5.1317 | 0.0235 |
| 0.1758 | 111.0 | 1665 | 5.1606 | 0.0353 |
| 0.1281 | 112.0 | 1680 | 5.2126 | 0.0118 |
| 0.1925 | 113.0 | 1695 | 5.1029 | 0.0471 |
| 0.1177 | 114.0 | 1710 | 5.3407 | 0.0353 |
| 0.128 | 115.0 | 1725 | 4.9612 | 0.0706 |
| 0.1078 | 116.0 | 1740 | 5.4318 | 0.0235 |
| 0.0747 | 117.0 | 1755 | 5.3757 | 0.0588 |
| 0.1359 | 118.0 | 1770 | 5.3949 | 0.0471 |
| 0.0971 | 119.0 | 1785 | 5.4532 | 0.0471 |
| 0.0671 | 120.0 | 1800 | 5.6330 | 0.0353 |
| 0.0819 | 121.0 | 1815 | 5.5617 | 0.0471 |
| 0.0892 | 122.0 | 1830 | 5.6881 | 0.0353 |
| 0.0861 | 123.0 | 1845 | 5.7083 | 0.0353 |
| 0.0649 | 124.0 | 1860 | 5.8477 | 0.0824 |
| 0.0674 | 125.0 | 1875 | 5.6822 | 0.0588 |
| 0.0788 | 126.0 | 1890 | 5.7720 | 0.0824 |
| 0.0439 | 127.0 | 1905 | 5.8210 | 0.0706 |
| 0.0586 | 128.0 | 1920 | 5.9101 | 0.0588 |
| 0.0674 | 129.0 | 1935 | 5.7681 | 0.0588 |
| 0.0563 | 130.0 | 1950 | 5.7770 | 0.0824 |
| 0.0284 | 131.0 | 1965 | 6.1912 | 0.0588 |
| 0.0717 | 132.0 | 1980 | 6.0938 | 0.0588 |
| 0.0424 | 133.0 | 1995 | 6.0714 | 0.0824 |
| 0.0768 | 134.0 | 2010 | 6.1924 | 0.0706 |
| 0.0592 | 135.0 | 2025 | 6.5515 | 0.0118 |
| 0.0217 | 136.0 | 2040 | 6.1961 | 0.0706 |
| 0.0544 | 137.0 | 2055 | 6.4168 | 0.0353 |
| 0.0417 | 138.0 | 2070 | 6.4916 | 0.0588 |
| 0.0339 | 139.0 | 2085 | 6.6678 | 0.0235 |
| 0.0208 | 140.0 | 2100 | 6.4968 | 0.0588 |
| 0.0436 | 141.0 | 2115 | 6.5245 | 0.0588 |
| 0.033 | 142.0 | 2130 | 6.6816 | 0.0706 |
| 0.037 | 143.0 | 2145 | 6.3041 | 0.0824 |
| 0.0132 | 144.0 | 2160 | 6.6597 | 0.0588 |
| 0.0484 | 145.0 | 2175 | 6.6440 | 0.0824 |
| 0.0264 | 146.0 | 2190 | 6.7801 | 0.0353 |
| 0.0115 | 147.0 | 2205 | 6.7156 | 0.0471 |
| 0.027 | 148.0 | 2220 | 6.7250 | 0.0706 |
| 0.0394 | 149.0 | 2235 | 6.8474 | 0.0706 |
| 0.0113 | 150.0 | 2250 | 6.8180 | 0.0824 |
| 0.0157 | 151.0 | 2265 | 6.8688 | 0.0824 |
| 0.0385 | 152.0 | 2280 | 6.8874 | 0.0824 |
| 0.0224 | 153.0 | 2295 | 7.0014 | 0.0706 |
| 0.0522 | 154.0 | 2310 | 7.1680 | 0.0706 |
| 0.0099 | 155.0 | 2325 | 7.1595 | 0.0471 |
| 0.01 | 156.0 | 2340 | 7.1259 | 0.0471 |
| 0.0144 | 157.0 | 2355 | 7.1538 | 0.0471 |
| 0.0175 | 158.0 | 2370 | 7.0335 | 0.0706 |
| 0.008 | 159.0 | 2385 | 7.0295 | 0.0588 |
| 0.0311 | 160.0 | 2400 | 7.1288 | 0.0706 |
| 0.0416 | 161.0 | 2415 | 7.1012 | 0.0471 |
| 0.0333 | 162.0 | 2430 | 7.3391 | 0.0588 |
| 0.0241 | 163.0 | 2445 | 7.2666 | 0.0588 |
| 0.0068 | 164.0 | 2460 | 7.1324 | 0.0706 |
| 0.0194 | 165.0 | 2475 | 7.1494 | 0.0824 |
| 0.0089 | 166.0 | 2490 | 7.2136 | 0.0824 |
| 0.0071 | 167.0 | 2505 | 7.2442 | 0.0706 |
| 0.0174 | 168.0 | 2520 | 7.3070 | 0.0588 |
| 0.0056 | 169.0 | 2535 | 7.3370 | 0.0588 |
| 0.0054 | 170.0 | 2550 | 7.3814 | 0.0588 |
| 0.0087 | 171.0 | 2565 | 7.3903 | 0.0588 |
| 0.0052 | 172.0 | 2580 | 7.4102 | 0.0588 |
| 0.0255 | 173.0 | 2595 | 7.3886 | 0.0588 |
| 0.0056 | 174.0 | 2610 | 7.4785 | 0.0588 |
| 0.005 | 175.0 | 2625 | 7.5349 | 0.0588 |
| 0.0078 | 176.0 | 2640 | 7.5136 | 0.0588 |
| 0.0214 | 177.0 | 2655 | 7.5146 | 0.0706 |
| 0.0827 | 178.0 | 2670 | 7.5079 | 0.0706 |
| 0.0046 | 179.0 | 2685 | 7.5157 | 0.0941 |
| 0.0098 | 180.0 | 2700 | 7.5161 | 0.0941 |
| 0.0049 | 181.0 | 2715 | 7.5169 | 0.0824 |
| 0.0063 | 182.0 | 2730 | 7.5643 | 0.0824 |
| 0.0147 | 183.0 | 2745 | 7.6032 | 0.0824 |
| 0.0279 | 184.0 | 2760 | 7.6901 | 0.0706 |
| 0.0044 | 185.0 | 2775 | 7.7511 | 0.0706 |
| 0.0106 | 186.0 | 2790 | 7.6778 | 0.0588 |
| 0.0042 | 187.0 | 2805 | 7.6374 | 0.0588 |
| 0.0043 | 188.0 | 2820 | 7.6470 | 0.0706 |
| 0.0279 | 189.0 | 2835 | 7.6876 | 0.0706 |
| 0.0089 | 190.0 | 2850 | 7.6849 | 0.0706 |
| 0.0128 | 191.0 | 2865 | 7.6929 | 0.0706 |
| 0.0046 | 192.0 | 2880 | 7.6891 | 0.0706 |
| 0.0041 | 193.0 | 2895 | 7.7036 | 0.0706 |
| 0.0215 | 194.0 | 2910 | 7.7134 | 0.0706 |
| 0.0041 | 195.0 | 2925 | 7.7340 | 0.0706 |
| 0.0041 | 196.0 | 2940 | 7.7637 | 0.0706 |
| 0.0278 | 197.0 | 2955 | 7.7695 | 0.0706 |
| 0.004 | 198.0 | 2970 | 7.7673 | 0.0706 |
| 0.0039 | 199.0 | 2985 | 7.7663 | 0.0706 |
| 0.01 | 200.0 | 3000 | 7.7654 | 0.0706 |
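The log shows severe overfitting: training loss collapses from 2.64 to about 0.004 while validation loss climbs from 2.64 to 7.77, and accuracy never clearly leaves chance territory. If the goal were to pick a usable checkpoint, selecting by validation loss would point back to the very first epoch, as a minimal sketch over a sample of the logged values makes clear (the epoch/loss pairs are copied from the table above):

```python
# Pick the checkpoint with the lowest validation loss from a sample
# of the (epoch, validation_loss) pairs logged above.
logged = [
    (1, 2.6424), (10, 2.6836), (50, 3.2101),
    (100, 4.5640), (150, 6.8180), (200, 7.7654),
]
best_epoch, best_loss = min(logged, key=lambda pair: pair[1])
print(best_epoch, best_loss)  # epoch 1 has the lowest validation loss
```

In a `transformers` training run this selection can be automated by setting `load_best_model_at_end=True` with `metric_for_best_model="eval_loss"` in `TrainingArguments`, optionally combined with `EarlyStoppingCallback` to stop well before 200 epochs.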

### Framework versions

- Transformers 4.32.1
- Pytorch 2.1.2
- Datasets 2.12.0
- Tokenizers 0.13.2