# hubert-rinnna-jp-jdrtsp-fw07sp-clean

This model is a fine-tuned version of rinna/japanese-hubert-base on an unspecified dataset.
It achieves the following results on the evaluation set:

- Loss: 0.2393
- Wer: 0.2187
- Cer: 0.1210
## Model description

More information needed
## Intended uses & limitations

More information needed
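
Since usage is not yet documented on this card, the snippet below is a minimal transcription sketch, not the author's published recipe. It assumes the checkpoint bundles a CTC head and a processor (the usual layout for HuBERT ASR fine-tunes) and expects 16 kHz mono audio; `sample.wav` is a placeholder path.

```python
import torch
import librosa
from transformers import AutoProcessor, HubertForCTC

model_id = "mskhattori/hubert-rinnna-jp-jdrtsp-fw07sp-clean"
processor = AutoProcessor.from_pretrained(model_id)
model = HubertForCTC.from_pretrained(model_id)

# rinna/japanese-hubert-base is a 16 kHz model, so resample the input accordingly.
speech, sr = librosa.load("sample.wav", sr=16_000)

inputs = processor(speech, sampling_rate=sr, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding: take the most likely token at each frame, then collapse.
predicted_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```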
## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged sketch of how they map onto `TrainingArguments` follows the list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 1000
- num_epochs: 20
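
For reference, here is a sketch of how the values above would be expressed as `transformers.TrainingArguments`. The `output_dir`, evaluation cadence, and anything else not listed above are illustrative assumptions, not the exact training script.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="hubert-rinnna-jp-jdrtsp-fw07sp-clean",  # assumption
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,   # effective total train batch size: 32 * 2 = 64
    num_train_epochs=20,
    lr_scheduler_type="linear",
    warmup_steps=1000,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",     # assumption: the results table logs one eval per epoch
)
```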
### Training results

Training Loss | Epoch | Step | Validation Loss | Wer | Cer |
---|---|---|---|---|---|
8.6058 | 1.0 | 164 | 6.2883 | 0.9743 | 0.9861 |
4.6616 | 2.0 | 328 | 4.0451 | 0.9743 | 0.9861 |
3.8526 | 3.0 | 492 | 3.5417 | 0.9743 | 0.9861 |
3.2384 | 4.0 | 656 | 3.0505 | 0.9743 | 0.9861 |
2.7948 | 5.0 | 820 | 2.6706 | 0.9743 | 0.9861 |
2.549 | 6.0 | 984 | 2.4268 | 0.9743 | 0.9861 |
2.1808 | 7.0 | 1148 | 1.8554 | 0.9743 | 0.9861 |
1.6069 | 8.0 | 1312 | 1.2551 | 0.6822 | 0.6231 |
1.1916 | 9.0 | 1476 | 0.7985 | 0.3679 | 0.2242 |
0.9977 | 10.0 | 1640 | 0.6234 | 0.3118 | 0.1827 |
0.836 | 11.0 | 1804 | 0.5103 | 0.2801 | 0.1643 |
0.7515 | 12.0 | 1968 | 0.4305 | 0.2663 | 0.1549 |
0.7045 | 13.0 | 2132 | 0.3688 | 0.2489 | 0.1413 |
0.6533 | 14.0 | 2296 | 0.3258 | 0.2399 | 0.1340 |
0.5906 | 15.0 | 2460 | 0.2941 | 0.2318 | 0.1288 |
0.5746 | 16.0 | 2624 | 0.2748 | 0.2300 | 0.1278 |
0.5169 | 17.0 | 2788 | 0.2573 | 0.2240 | 0.1242 |
0.5511 | 18.0 | 2952 | 0.2479 | 0.2211 | 0.1228 |
0.5318 | 19.0 | 3116 | 0.2410 | 0.2186 | 0.1210 |
0.5174 | 20.0 | 3280 | 0.2393 | 0.2187 | 0.1210 |
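
The Wer and Cer columns are word and character error rates. The exact metric code used during training is not documented here, but metrics of this kind can be computed with the `evaluate` library, as in the sketch below; the transcripts are placeholders, and Japanese text is typically segmented into space-separated tokens (e.g. with MeCab) before a word-level WER is meaningful.

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Placeholder transcripts, already whitespace-segmented for WER.
references = ["今日 は いい 天気 です"]
predictions = ["今日 は 良い 天気 です"]

print("WER:", wer_metric.compute(predictions=predictions, references=references))
print("CER:", cer_metric.compute(predictions=predictions, references=references))
```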
### Framework versions

- Transformers 4.34.0.dev0
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3