TrOCR-SIN-DeiT-Handwritten-Beam10-maxseq128

This model is a fine-tuned version of kavg/TrOCR-SIN-DeiT on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 2.7352
  • Cer: 0.5340
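
The checkpoint name encodes the decoding configuration: beam search with 10 beams and a 128-token maximum sequence length. Below is a minimal inference sketch; it assumes the repository ships standard TrOCR processor files and that "line.png" is a single-line handwriting crop (both are assumptions, not confirmed by the card).

```python
from transformers import TrOCRProcessor, VisionEncoderDecoderModel
from PIL import Image

model_id = "kavg/TrOCR-SIN-DeiT-Handwritten-Beam10-maxseq128"
processor = TrOCRProcessor.from_pretrained(model_id)  # assumes processor files are in the repo
model = VisionEncoderDecoderModel.from_pretrained(model_id)

image = Image.open("line.png").convert("RGB")  # hypothetical input image
pixel_values = processor(images=image, return_tensors="pt").pixel_values

# Beam search with 10 beams and a 128-token cap, matching the checkpoint name
generated_ids = model.generate(pixel_values, num_beams=10, max_length=128)
text = processor.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(text)
```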

Model description

This checkpoint is a vision encoder-decoder OCR model (roughly 241M parameters, stored as F32 safetensors) fine-tuned from kavg/TrOCR-SIN-DeiT, with a DeiT image encoder per the model name. Beyond that, more information is needed.

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • training_steps: 2600
  • mixed_precision_training: Native AMP
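
For reference, a hedged sketch of how these values map onto transformers' Seq2SeqTrainingArguments; the output_dir and the 100-step evaluation cadence are inferred from the results table below, not stated by the card:

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical reconstruction of the run configuration from the values above
training_args = Seq2SeqTrainingArguments(
    output_dir="trocr-sin-deit-handwritten-beam10-maxseq128",  # hypothetical path
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    max_steps=2600,
    fp16=True,                    # "Native AMP" mixed precision
    evaluation_strategy="steps",
    eval_steps=100,               # the results table logs every 100 steps
    predict_with_generate=True,   # needed to compute CER during evaluation
)
```

The Adam betas (0.9, 0.999) and epsilon (1e-08) listed above are the Trainer defaults, so they need no explicit arguments here.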

Training results

| Training Loss | Epoch | Step | Cer    | Validation Loss |
|--------------:|------:|-----:|-------:|----------------:|
| 0.9936        | 1.75  | 100  | 0.6193 | 1.6907          |
| 0.0819        | 3.51  | 200  | 0.6011 | 1.8343          |
| 0.1437        | 5.26  | 300  | 0.6579 | 2.1956          |
| 0.0857        | 7.02  | 400  | 0.6435 | 2.6580          |
| 0.0531        | 8.77  | 500  | 0.5595 | 1.9046          |
| 0.1282        | 10.53 | 600  | 0.6121 | 2.1264          |
| 0.0247        | 12.28 | 700  | 0.6218 | 2.5938          |
| 0.0071        | 14.04 | 800  | 0.6402 | 2.2984          |
| 0.0235        | 15.79 | 900  | 0.5961 | 2.3736          |
| 0.152         | 17.54 | 1000 | 0.5674 | 2.0205          |
| 0.0521        | 19.3  | 1100 | 0.5802 | 2.5917          |
| 0.0047        | 21.05 | 1200 | 0.6116 | 2.6910          |
| 0.065         | 22.81 | 1300 | 0.5757 | 2.2894          |
| 0.0313        | 24.56 | 1400 | 0.5647 | 2.6897          |
| 0.0586        | 26.32 | 1500 | 0.5398 | 2.0499          |
| 0.0015        | 28.07 | 1600 | 0.5505 | 2.3662          |
| 0.0125        | 29.82 | 1700 | 0.6250 | 2.1673          |
| 0.0207        | 31.58 | 1800 | 0.5674 | 2.0626          |
| 0.0015        | 33.33 | 1900 | 0.6260 | 2.9868          |
| 0.0004        | 35.09 | 2000 | 0.5792 | 2.5184          |
| 0.001         | 36.84 | 2100 | 0.5557 | 2.8804          |
| 0.0134        | 38.6  | 2200 | 0.6166 | 2.7627          |
| 0.0017        | 40.35 | 2300 | 0.5477 | 2.2333          |
| 0.0046        | 42.11 | 2400 | 0.5871 | 3.2010          |
| 0.0003        | 43.86 | 2500 | 0.5485 | 2.7037          |
| 0.0007        | 45.61 | 2600 | 0.5340 | 2.7352          |
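
Cer above is the character error rate: the character-level edit distance between prediction and reference, divided by the reference length. A minimal sketch of computing it with the Hugging Face evaluate library (whether this exact implementation produced the numbers above is an assumption):

```python
import evaluate

# The "cer" metric requires the jiwer package to be installed.
# CER = (substitutions + deletions + insertions) / number of reference characters
cer_metric = evaluate.load("cer")

predictions = ["handwriten text"]   # hypothetical model output with one missing character
references = ["handwritten text"]   # hypothetical ground truth

print(cer_metric.compute(predictions=predictions, references=references))  # 1/16 = 0.0625
```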

Framework versions

  • Transformers 4.35.2
  • Pytorch 2.1.0+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.1