
# base-japanese-product-classification

This model is a fine-tuned version of [tohoku-nlp/bert-base-japanese-v3](https://huggingface.co/tohoku-nlp/bert-base-japanese-v3) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.9961
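
Since the card gives no usage snippet, here is a minimal inference sketch. The example input text is hypothetical, and because the training dataset is unknown, `id2label` may only contain generic `LABEL_<i>` names. The base model's tokenizer additionally requires `fugashi` and `unidic-lite` for Japanese word segmentation.

```python
# Minimal inference sketch. The example text and label mapping are assumptions;
# the actual class names depend on the unknown training dataset.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "ikeno-ada/base-japanese-product-classification"
tokenizer = AutoTokenizer.from_pretrained(model_id)  # needs fugashi + unidic-lite
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "ワイヤレスイヤホン Bluetooth 5.3 ノイズキャンセリング対応"  # hypothetical product title
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
pred_id = logits.argmax(dim=-1).item()
print(model.config.id2label[pred_id])  # may print a generic LABEL_<i>
```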

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (mirrored as a `TrainingArguments` sketch after the list):

- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
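
A sketch of how these hyperparameters map onto `TrainingArguments`, assuming the standard `Trainer` API was used (consistent with the auto-generated card); `output_dir` is hypothetical.

```python
# Sketch under the assumption that the standard Trainer API was used.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="base-japanese-product-classification",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # the results table logs validation loss once per epoch
)
```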

### Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| No log        | 1.0   | 106   | 3.5780          |
| No log        | 2.0   | 212   | 2.6684          |
| No log        | 3.0   | 318   | 2.1910          |
| No log        | 4.0   | 424   | 1.8626          |
| 2.5157        | 5.0   | 530   | 1.6483          |
| 2.5157        | 6.0   | 636   | 1.4929          |
| 2.5157        | 7.0   | 742   | 1.3580          |
| 2.5157        | 8.0   | 848   | 1.2658          |
| 2.5157        | 9.0   | 954   | 1.1877          |
| 0.9057        | 10.0  | 1060  | 1.1381          |
| 0.9057        | 11.0  | 1166  | 1.0945          |
| 0.9057        | 12.0  | 1272  | 1.0589          |
| 0.9057        | 13.0  | 1378  | 1.0071          |
| 0.9057        | 14.0  | 1484  | 1.0002          |
| 0.4676        | 15.0  | 1590  | 0.9647          |
| 0.4676        | 16.0  | 1696  | 0.9762          |
| 0.4676        | 17.0  | 1802  | 0.9129          |
| 0.4676        | 18.0  | 1908  | 0.9316          |
| 0.2485        | 19.0  | 2014  | 0.9119          |
| 0.2485        | 20.0  | 2120  | 0.8805          |
| 0.2485        | 21.0  | 2226  | 0.8808          |
| 0.2485        | 22.0  | 2332  | 0.8769          |
| 0.2485        | 23.0  | 2438  | 0.8701          |
| 0.1331        | 24.0  | 2544  | 0.8651          |
| 0.1331        | 25.0  | 2650  | 0.8621          |
| 0.1331        | 26.0  | 2756  | 0.8759          |
| 0.1331        | 27.0  | 2862  | 0.8702          |
| 0.1331        | 28.0  | 2968  | 0.8634          |
| 0.0748        | 29.0  | 3074  | 0.8778          |
| 0.0748        | 30.0  | 3180  | 0.8776          |
| 0.0748        | 31.0  | 3286  | 0.8486          |
| 0.0748        | 32.0  | 3392  | 0.8695          |
| 0.0748        | 33.0  | 3498  | 0.8479          |
| 0.0416        | 34.0  | 3604  | 0.8661          |
| 0.0416        | 35.0  | 3710  | 0.8731          |
| 0.0416        | 36.0  | 3816  | 0.8681          |
| 0.0416        | 37.0  | 3922  | 0.8942          |
| 0.0255        | 38.0  | 4028  | 0.8841          |
| 0.0255        | 39.0  | 4134  | 0.8842          |
| 0.0255        | 40.0  | 4240  | 0.8875          |
| 0.0255        | 41.0  | 4346  | 0.8760          |
| 0.0255        | 42.0  | 4452  | 0.8820          |
| 0.0166        | 43.0  | 4558  | 0.8975          |
| 0.0166        | 44.0  | 4664  | 0.8890          |
| 0.0166        | 45.0  | 4770  | 0.8795          |
| 0.0166        | 46.0  | 4876  | 0.8882          |
| 0.0166        | 47.0  | 4982  | 0.8950          |
| 0.0123        | 48.0  | 5088  | 0.8923          |
| 0.0123        | 49.0  | 5194  | 0.9018          |
| 0.0123        | 50.0  | 5300  | 0.8975          |
| 0.0123        | 51.0  | 5406  | 0.9078          |
| 0.0097        | 52.0  | 5512  | 0.9124          |
| 0.0097        | 53.0  | 5618  | 0.9250          |
| 0.0097        | 54.0  | 5724  | 0.9663          |
| 0.0097        | 55.0  | 5830  | 0.9651          |
| 0.0097        | 56.0  | 5936  | 0.9570          |
| 0.0078        | 57.0  | 6042  | 0.9530          |
| 0.0078        | 58.0  | 6148  | 0.9548          |
| 0.0078        | 59.0  | 6254  | 0.9490          |
| 0.0078        | 60.0  | 6360  | 0.9563          |
| 0.0078        | 61.0  | 6466  | 0.9614          |
| 0.0064        | 62.0  | 6572  | 0.9602          |
| 0.0064        | 63.0  | 6678  | 0.9614          |
| 0.0064        | 64.0  | 6784  | 0.9625          |
| 0.0064        | 65.0  | 6890  | 0.9587          |
| 0.0064        | 66.0  | 6996  | 0.9601          |
| 0.0055        | 67.0  | 7102  | 0.9664          |
| 0.0055        | 68.0  | 7208  | 0.9688          |
| 0.0055        | 69.0  | 7314  | 0.9725          |
| 0.0055        | 70.0  | 7420  | 0.9726          |
| 0.0047        | 71.0  | 7526  | 0.9693          |
| 0.0047        | 72.0  | 7632  | 0.9737          |
| 0.0047        | 73.0  | 7738  | 0.9720          |
| 0.0047        | 74.0  | 7844  | 0.9717          |
| 0.0047        | 75.0  | 7950  | 0.9683          |
| 0.0041        | 76.0  | 8056  | 0.9732          |
| 0.0041        | 77.0  | 8162  | 0.9740          |
| 0.0041        | 78.0  | 8268  | 0.9748          |
| 0.0041        | 79.0  | 8374  | 0.9789          |
| 0.0041        | 80.0  | 8480  | 0.9788          |
| 0.0036        | 81.0  | 8586  | 0.9788          |
| 0.0036        | 82.0  | 8692  | 0.9829          |
| 0.0036        | 83.0  | 8798  | 0.9842          |
| 0.0036        | 84.0  | 8904  | 0.9810          |
| 0.0032        | 85.0  | 9010  | 0.9862          |
| 0.0032        | 86.0  | 9116  | 0.9858          |
| 0.0032        | 87.0  | 9222  | 0.9881          |
| 0.0032        | 88.0  | 9328  | 0.9889          |
| 0.0032        | 89.0  | 9434  | 0.9902          |
| 0.003         | 90.0  | 9540  | 0.9909          |
| 0.003         | 91.0  | 9646  | 0.9927          |
| 0.003         | 92.0  | 9752  | 0.9926          |
| 0.003         | 93.0  | 9858  | 0.9942          |
| 0.003         | 94.0  | 9964  | 0.9949          |
| 0.0028        | 95.0  | 10070 | 0.9926          |
| 0.0028        | 96.0  | 10176 | 0.9938          |
| 0.0028        | 97.0  | 10282 | 0.9949          |
| 0.0028        | 98.0  | 10388 | 0.9960          |
| 0.0028        | 99.0  | 10494 | 0.9960          |
| 0.0027        | 100.0 | 10600 | 0.9961          |
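
Note that the table shows clear overfitting: validation loss bottoms out at 0.8479 (epoch 33, step 3498) and then climbs back to 0.9961 while training loss falls below 0.003. If retraining, a hedged sketch (not the original configuration, which ran the full 100 epochs) of keeping the best checkpoint and stopping early with the `Trainer` API:

```python
# Sketch only: these settings were NOT used for this model. They retain the
# lowest-validation-loss checkpoint and stop once eval_loss has not improved
# for `early_stopping_patience` consecutive evaluations.
from transformers import EarlyStoppingCallback, TrainingArguments

args = TrainingArguments(
    output_dir="base-japanese-product-classification",  # hypothetical
    evaluation_strategy="epoch",
    save_strategy="epoch",
    load_best_model_at_end=True,
    metric_for_best_model="eval_loss",
    greater_is_better=False,
    num_train_epochs=100,
)
early_stopping = EarlyStoppingCallback(early_stopping_patience=5)
# Pass `args=args` and `callbacks=[early_stopping]` when constructing the Trainer.
```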

### Framework versions

- Transformers 4.39.1
- Pytorch 2.2.0+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2