---
license: apache-2.0
base_model: facebook/wav2vec2-large
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: ft-wav2vec2-with-minds
  results: []
---

# ft-wav2vec2-with-minds

This model is a fine-tuned version of [facebook/wav2vec2-large](https://huggingface.co./facebook/wav2vec2-large); the training dataset is not recorded in the trainer metadata, though the model name suggests MInDS-14.
It achieves the following results on the evaluation set:
- Loss: 0.0732
- Accuracy: 0.9822

## Model description

More information needed

## Intended uses & limitations

More information needed
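
Given the `accuracy` metric and the classification head implied by the training setup, the checkpoint is presumably meant for audio classification. Below is a minimal inference sketch; the model id is a placeholder for this checkpoint's actual Hub path.

```python
from transformers import pipeline

# Load the fine-tuned checkpoint as an audio-classification pipeline.
# The model id below is a placeholder; substitute the real Hub path.
classifier = pipeline("audio-classification", model="ft-wav2vec2-with-minds")

# wav2vec2 models expect 16 kHz mono audio; the pipeline resamples
# local files automatically (via ffmpeg) when needed.
predictions = classifier("sample.wav")
print(predictions)  # [{"label": ..., "score": ...}, ...]
```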

## Training and evaluation data

More information needed
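
If, as the model name hints, training used the MInDS-14 intent-classification dataset (an assumption, not confirmed anywhere in this card), it can be loaded from the Hub like so:

```python
from datasets import load_dataset

# Assumption: "minds" in the model name refers to MInDS-14
# (PolyAI/minds14); adjust the config if the actual data differs.
minds = load_dataset("PolyAI/minds14", name="en-US", split="train")
print(minds)  # columns include audio, transcription, intent_class
```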

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 120
- eval_batch_size: 120
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 480
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
- mixed_precision_training: Native AMP
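
The total train batch size follows from accumulation: 120 per device × 4 gradient-accumulation steps = 480. For reproduction, the settings above map onto `TrainingArguments` roughly as follows (a sketch: `output_dir` is a placeholder, and the Adam betas and epsilon are simply the library defaults):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ft-wav2vec2-with-minds",  # placeholder
    learning_rate=3e-5,
    per_device_train_batch_size=120,
    per_device_eval_batch_size=120,
    gradient_accumulation_steps=4,   # effective batch size 120 * 4 = 480
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    fp16=True,                       # "Native AMP" mixed precision
)
```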

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 2.0814        | 1.0   | 9    | 2.0883          | 0.1143   |
| 2.064         | 2.0   | 18   | 2.0619          | 0.1678   |
| 2.0232        | 3.0   | 27   | 1.9712          | 0.2709   |
| 1.861         | 4.0   | 36   | 1.7455          | 0.3880   |
| 1.6003        | 5.0   | 45   | 1.5115          | 0.4724   |
| 1.4972        | 6.0   | 54   | 1.2623          | 0.5998   |
| 1.2332        | 7.0   | 63   | 1.0138          | 0.6935   |
| 1.081         | 8.0   | 72   | 0.8169          | 0.7601   |
| 0.9925        | 9.0   | 81   | 0.7757          | 0.7873   |
| 0.8516        | 10.0  | 90   | 0.6470          | 0.8163   |
| 0.7544        | 11.0  | 99   | 0.7208          | 0.7873   |
| 0.7006        | 12.0  | 108  | 0.5074          | 0.8557   |
| 0.591         | 13.0  | 117  | 0.4326          | 0.8782   |
| 0.5155        | 14.0  | 126  | 0.3707          | 0.9053   |
| 0.4715        | 15.0  | 135  | 0.3116          | 0.9091   |
| 0.4461        | 16.0  | 144  | 0.3167          | 0.9138   |
| 0.445         | 17.0  | 153  | 0.2963          | 0.9250   |
| 0.3899        | 18.0  | 162  | 0.2499          | 0.9353   |
| 0.3656        | 19.0  | 171  | 0.2756          | 0.9194   |
| 0.3255        | 20.0  | 180  | 0.2280          | 0.9297   |
| 0.2756        | 21.0  | 189  | 0.2178          | 0.9438   |
| 0.3119        | 22.0  | 198  | 0.1858          | 0.9513   |
| 0.2595        | 23.0  | 207  | 0.1794          | 0.9475   |
| 0.2713        | 24.0  | 216  | 0.1737          | 0.9466   |
| 0.2336        | 25.0  | 225  | 0.1758          | 0.9531   |
| 0.2359        | 26.0  | 234  | 0.1690          | 0.9485   |
| 0.2229        | 27.0  | 243  | 0.1336          | 0.9606   |
| 0.2145        | 28.0  | 252  | 0.1338          | 0.9700   |
| 0.1986        | 29.0  | 261  | 0.1525          | 0.9625   |
| 0.1811        | 30.0  | 270  | 0.1415          | 0.9653   |
| 0.165         | 31.0  | 279  | 0.1208          | 0.9672   |
| 0.1755        | 32.0  | 288  | 0.1266          | 0.9634   |
| 0.175         | 33.0  | 297  | 0.1269          | 0.9672   |
| 0.149         | 34.0  | 306  | 0.1072          | 0.9728   |
| 0.1606        | 35.0  | 315  | 0.1183          | 0.9738   |
| 0.161         | 36.0  | 324  | 0.1009          | 0.9719   |
| 0.1533        | 37.0  | 333  | 0.1000          | 0.9728   |
| 0.1239        | 38.0  | 342  | 0.1109          | 0.9691   |
| 0.1353        | 39.0  | 351  | 0.0905          | 0.9775   |
| 0.1287        | 40.0  | 360  | 0.0920          | 0.9738   |
| 0.223         | 41.0  | 369  | 0.0855          | 0.9775   |
| 0.1302        | 42.0  | 378  | 0.0748          | 0.9794   |
| 0.1249        | 43.0  | 387  | 0.0732          | 0.9822   |
| 0.1552        | 44.0  | 396  | 0.0688          | 0.9822   |
| 0.098         | 45.0  | 405  | 0.0777          | 0.9766   |
| 0.1459        | 46.0  | 414  | 0.0634          | 0.9813   |
| 0.1267        | 47.0  | 423  | 0.0653          | 0.9822   |
| 0.149         | 48.0  | 432  | 0.0709          | 0.9794   |
| 0.1135        | 49.0  | 441  | 0.0660          | 0.9813   |
| 0.118         | 50.0  | 450  | 0.0652          | 0.9813   |


### Framework versions

- Transformers 4.35.2
- PyTorch 1.12.1+cu116
- Datasets 2.15.0
- Tokenizers 0.15.2