my_awesome_opus_books_model

This model is a fine-tuned version of t5-small. The card's dataset field is unset (hence "the None dataset" in the auto-generated text); the model name suggests it was trained on opus_books. It achieves the following results on the evaluation set (a sketch of how these metrics are typically computed follows the list):

  • Loss: 3.2696
  • Bleu: 0.0071
  • Gen Len: 19.0
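
Loss is the cross-entropy on the evaluation set, Bleu is a corpus BLEU score, and Gen Len is the mean length of the generated sequences. A minimal sketch of a `compute_metrics` function that would produce the Bleu and Gen Len fields, modeled on the Hugging Face translation tutorial (an assumption, since the card does not include the evaluation code):

```python
import numpy as np
import evaluate

# Assumed setup, mirroring the Hugging Face translation tutorial;
# the card itself does not show the evaluation code.
metric = evaluate.load("sacrebleu")

def compute_metrics(eval_preds, tokenizer):
    preds, labels = eval_preds
    decoded_preds = tokenizer.batch_decode(preds, skip_special_tokens=True)
    # Labels are padded with -100 for the loss; restore pad ids before decoding.
    labels = np.where(labels != -100, labels, tokenizer.pad_token_id)
    decoded_labels = tokenizer.batch_decode(labels, skip_special_tokens=True)

    result = metric.compute(
        predictions=decoded_preds,
        references=[[label] for label in decoded_labels],
    )
    # "Gen Len" is the mean count of non-pad tokens in the predictions.
    gen_lens = [np.count_nonzero(pred != tokenizer.pad_token_id) for pred in preds]
    return {"bleu": result["score"], "gen_len": float(np.mean(gen_lens))}
```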

Model description

More information needed

Intended uses & limitations

More information needed
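
No intended uses are documented. As a placeholder, here is a minimal inference sketch, assuming the model follows the standard T5 translation setup from the opus_books tutorial (the task prefix and the English-to-French language pair are assumptions, not stated in the card):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "ombarki345/my_awesome_opus_books_model"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# T5 expects a task prefix; English-to-French is an assumption here.
text = "translate English to French: Legumes share resources with nitrogen-fixing bacteria."
inputs = tokenizer(text, return_tensors="pt")

# The flat Gen Len of 19.0 reported above suggests generation hit the
# default length cap (max_length=20); raise the limit for longer outputs.
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Given the very low final BLEU (0.0071), this checkpoint should be treated as a tutorial artifact rather than a usable translator.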

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the configuration sketch after this list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
  • mixed_precision_training: Native AMP
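
A sketch of how these values map onto `Seq2SeqTrainingArguments` (the `output_dir` is illustrative, and the Adam betas/epsilon listed above are the library defaults, so they need no explicit arguments):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="my_awesome_opus_books_model",  # illustrative name
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    fp16=True,  # "Native AMP" mixed-precision training
    predict_with_generate=True,   # needed for Bleu/Gen Len (assumed)
    evaluation_strategy="epoch",  # per-epoch eval matches the table below (assumed)
)
```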

Training results

"No log" in the first column means the training loss had not yet been logged at that evaluation point (here the loss is only recorded every 500 steps).

| Training Loss | Epoch | Step | Validation Loss | Bleu   | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:-------:|
| No log        | 1.0   | 15   | 6.4375          | 0.0013 | 19.0    |
| No log        | 2.0   | 30   | 5.7374          | 0.0016 | 19.0    |
| No log        | 3.0   | 45   | 5.4597          | 0.0004 | 19.0    |
| No log        | 4.0   | 60   | 5.2343          | 0.0005 | 19.0    |
| No log        | 5.0   | 75   | 5.0942          | 0.0008 | 19.0    |
| No log        | 6.0   | 90   | 4.9779          | 0.001  | 19.0    |
| No log        | 7.0   | 105  | 4.8902          | 0.001  | 19.0    |
| No log        | 8.0   | 120  | 4.7958          | 0.0008 | 19.0    |
| No log        | 9.0   | 135  | 4.7133          | 0.0008 | 19.0    |
| No log        | 10.0  | 150  | 4.6379          | 0.0008 | 19.0    |
| No log        | 11.0  | 165  | 4.5734          | 0.0011 | 19.0    |
| No log        | 12.0  | 180  | 4.5051          | 0.0011 | 19.0    |
| No log        | 13.0  | 195  | 4.4446          | 0.0031 | 19.0    |
| No log        | 14.0  | 210  | 4.3866          | 0.0085 | 19.0    |
| No log        | 15.0  | 225  | 4.3280          | 0.0148 | 19.0    |
| No log        | 16.0  | 240  | 4.2625          | 0.0122 | 19.0    |
| No log        | 17.0  | 255  | 4.2007          | 0.0015 | 19.0    |
| No log        | 18.0  | 270  | 4.1402          | 0.0015 | 19.0    |
| No log        | 19.0  | 285  | 4.0824          | 0.0014 | 19.0    |
| No log        | 20.0  | 300  | 4.0331          | 0.0014 | 19.0    |
| No log        | 21.0  | 315  | 3.9883          | 0.0008 | 19.0    |
| No log        | 22.0  | 330  | 3.9361          | 0.0007 | 19.0    |
| No log        | 23.0  | 345  | 3.8779          | 0.0015 | 19.0    |
| No log        | 24.0  | 360  | 3.8201          | 0.0019 | 19.0    |
| No log        | 25.0  | 375  | 3.7696          | 0.0031 | 19.0    |
| No log        | 26.0  | 390  | 3.7357          | 0.0032 | 19.0    |
| No log        | 27.0  | 405  | 3.7019          | 0.0018 | 19.0    |
| No log        | 28.0  | 420  | 3.6743          | 0.0018 | 19.0    |
| No log        | 29.0  | 435  | 3.6439          | 0.0017 | 19.0    |
| No log        | 30.0  | 450  | 3.6153          | 0.0016 | 19.0    |
| No log        | 31.0  | 465  | 3.5916          | 0.0009 | 19.0    |
| No log        | 32.0  | 480  | 3.5756          | 0.0062 | 19.0    |
| No log        | 33.0  | 495  | 3.5618          | 0.001  | 19.0    |
| 4.6815        | 34.0  | 510  | 3.5500          | 0.0011 | 19.0    |
| 4.6815        | 35.0  | 525  | 3.5398          | 0.0006 | 19.0    |
| 4.6815        | 36.0  | 540  | 3.5331          | 0.0006 | 19.0    |
| 4.6815        | 37.0  | 555  | 3.5181          | 0.0006 | 19.0    |
| 4.6815        | 38.0  | 570  | 3.5059          | 0.0005 | 19.0    |
| 4.6815        | 39.0  | 585  | 3.4958          | 0.0006 | 18.95   |
| 4.6815        | 40.0  | 600  | 3.4882          | 0.0006 | 18.95   |
| 4.6815        | 41.0  | 615  | 3.4760          | 0.0007 | 19.0    |
| 4.6815        | 42.0  | 630  | 3.4673          | 0.0009 | 19.0    |
| 4.6815        | 43.0  | 645  | 3.4656          | 0.0011 | 19.0    |
| 4.6815        | 44.0  | 660  | 3.4526          | 0.0008 | 19.0    |
| 4.6815        | 45.0  | 675  | 3.4522          | 0.0009 | 19.0    |
| 4.6815        | 46.0  | 690  | 3.4395          | 0.0014 | 19.0    |
| 4.6815        | 47.0  | 705  | 3.4251          | 0.0015 | 19.0    |
| 4.6815        | 48.0  | 720  | 3.4162          | 0.0016 | 19.0    |
| 4.6815        | 49.0  | 735  | 3.4124          | 0.002  | 19.0    |
| 4.6815        | 50.0  | 750  | 3.4061          | 0.0025 | 19.0    |
| 4.6815        | 51.0  | 765  | 3.4014          | 0.0024 | 19.0    |
| 4.6815        | 52.0  | 780  | 3.3920          | 0.0025 | 19.0    |
| 4.6815        | 53.0  | 795  | 3.3898          | 0.0027 | 19.0    |
| 4.6815        | 54.0  | 810  | 3.3839          | 0.0021 | 19.0    |
| 4.6815        | 55.0  | 825  | 3.3777          | 0.0023 | 19.0    |
| 4.6815        | 56.0  | 840  | 3.3713          | 0.0027 | 19.0    |
| 4.6815        | 57.0  | 855  | 3.3654          | 0.0019 | 19.0    |
| 4.6815        | 58.0  | 870  | 3.3607          | 0.0024 | 19.0    |
| 4.6815        | 59.0  | 885  | 3.3496          | 0.0034 | 19.0    |
| 4.6815        | 60.0  | 900  | 3.3474          | 0.0031 | 19.0    |
| 4.6815        | 61.0  | 915  | 3.3446          | 0.0026 | 19.0    |
| 4.6815        | 62.0  | 930  | 3.3401          | 0.0031 | 19.0    |
| 4.6815        | 63.0  | 945  | 3.3326          | 0.0041 | 19.0    |
| 4.6815        | 64.0  | 960  | 3.3288          | 0.0028 | 19.0    |
| 4.6815        | 65.0  | 975  | 3.3309          | 0.0031 | 19.0    |
| 4.6815        | 66.0  | 990  | 3.3281          | 0.0034 | 19.0    |
| 3.5477        | 67.0  | 1005 | 3.3223          | 0.0032 | 19.0    |
| 3.5477        | 68.0  | 1020 | 3.3169          | 0.0037 | 19.0    |
| 3.5477        | 69.0  | 1035 | 3.3143          | 0.0058 | 19.0    |
| 3.5477        | 70.0  | 1050 | 3.3134          | 0.004  | 19.0    |
| 3.5477        | 71.0  | 1065 | 3.3082          | 0.0066 | 19.0    |
| 3.5477        | 72.0  | 1080 | 3.3060          | 0.0044 | 19.0    |
| 3.5477        | 73.0  | 1095 | 3.3042          | 0.0041 | 19.0    |
| 3.5477        | 74.0  | 1110 | 3.3013          | 0.0048 | 19.0    |
| 3.5477        | 75.0  | 1125 | 3.2972          | 0.0051 | 19.0    |
| 3.5477        | 76.0  | 1140 | 3.2967          | 0.0054 | 19.0    |
| 3.5477        | 77.0  | 1155 | 3.2942          | 0.0055 | 19.0    |
| 3.5477        | 78.0  | 1170 | 3.2951          | 0.0036 | 19.0    |
| 3.5477        | 79.0  | 1185 | 3.2948          | 0.0039 | 19.0    |
| 3.5477        | 80.0  | 1200 | 3.2922          | 0.0038 | 19.0    |
| 3.5477        | 81.0  | 1215 | 3.2871          | 0.0035 | 19.0    |
| 3.5477        | 82.0  | 1230 | 3.2819          | 0.0051 | 19.0    |
| 3.5477        | 83.0  | 1245 | 3.2804          | 0.0039 | 19.0    |
| 3.5477        | 84.0  | 1260 | 3.2800          | 0.0044 | 19.0    |
| 3.5477        | 85.0  | 1275 | 3.2809          | 0.0065 | 19.0    |
| 3.5477        | 86.0  | 1290 | 3.2803          | 0.0073 | 19.0    |
| 3.5477        | 87.0  | 1305 | 3.2779          | 0.0055 | 19.0    |
| 3.5477        | 88.0  | 1320 | 3.2763          | 0.0043 | 19.0    |
| 3.5477        | 89.0  | 1335 | 3.2746          | 0.0047 | 19.0    |
| 3.5477        | 90.0  | 1350 | 3.2733          | 0.0061 | 19.0    |
| 3.5477        | 91.0  | 1365 | 3.2723          | 0.005  | 19.0    |
| 3.5477        | 92.0  | 1380 | 3.2718          | 0.0074 | 19.0    |
| 3.5477        | 93.0  | 1395 | 3.2724          | 0.0051 | 19.0    |
| 3.5477        | 94.0  | 1410 | 3.2722          | 0.0073 | 19.0    |
| 3.5477        | 95.0  | 1425 | 3.2710          | 0.0047 | 19.0    |
| 3.5477        | 96.0  | 1440 | 3.2703          | 0.0064 | 19.0    |
| 3.5477        | 97.0  | 1455 | 3.2696          | 0.0056 | 19.0    |
| 3.5477        | 98.0  | 1470 | 3.2696          | 0.0039 | 19.0    |
| 3.5477        | 99.0  | 1485 | 3.2697          | 0.0074 | 19.0    |
| 3.3501        | 100.0 | 1500 | 3.2696          | 0.0071 | 19.0    |

Framework versions

  • Transformers 4.38.1
  • Pytorch 2.1.2
  • Datasets 2.1.0
  • Tokenizers 0.15.2
