---
license: mit
base_model: facebook/bart-large-cnn
tags:
  - generated_from_trainer
metrics:
  - rouge
model-index:
  - name: bart-large-cnn-finetuned-Kaggle-Science-LLM
    results: []
---

# bart-large-cnn-finetuned-Kaggle-Science-LLM

This model is a fine-tuned version of [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn) on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 6.4896
- Rouge1: 29.4886
- Rouge2: 10.2696
- RougeL: 22.611
- RougeLsum: 23.6936
- Gen Len: 70.1
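
The checkpoint can be used with the `transformers` summarization pipeline. Below is a minimal inference sketch; the Hub repo id `satyanshu404/bart-large-cnn-finetuned-Kaggle-Science-LLM` is inferred from this card's model name, not confirmed:

```python
from transformers import pipeline

# Hypothetical repo id, inferred from the model name in this card.
summarizer = pipeline(
    "summarization",
    model="satyanshu404/bart-large-cnn-finetuned-Kaggle-Science-LLM",
)

article = "Paste a long input document here..."  # text to summarize
result = summarizer(article, max_length=128, min_length=30, do_sample=False)
print(result[0]["summary_text"])
```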

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
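
For reference, these hyperparameters map onto `Seq2SeqTrainingArguments` roughly as sketched below. This is not the exact training script: `output_dir`, the evaluation strategy, and `predict_with_generate` are assumptions (the per-epoch results table suggests per-epoch evaluation with generation):

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="bart-large-cnn-finetuned-Kaggle-Science-LLM",  # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    num_train_epochs=50,
    lr_scheduler_type="linear",
    adam_beta1=0.9,      # betas=(0.9, 0.999) from the card
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="epoch",  # assumed from the per-epoch results table
    predict_with_generate=True,   # assumed: ROUGE is computed at eval time
)
```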

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log        | 1.0   | 90   | 2.9814          | 32.5407 | 12.8638 | 25.9593 | 28.0874   | 66.05   |
| No log        | 2.0   | 180  | 3.1081          | 33.6875 | 13.0896 | 25.2244 | 26.9945   | 68.25   |
| No log        | 3.0   | 270  | 3.4845          | 33.889  | 12.8396 | 26.2138 | 28.2817   | 70.55   |
| No log        | 4.0   | 360  | 3.8911          | 31.8492 | 12.0458 | 23.4026 | 25.8547   | 66.25   |
| No log        | 5.0   | 450  | 4.3530          | 31.2083 | 11.0996 | 23.9196 | 26.1564   | 72.25   |
| 1.4121        | 6.0   | 540  | 4.4582          | 29.7758 | 11.1798 | 22.9812 | 24.9141   | 72.2    |
| 1.4121        | 7.0   | 630  | 4.5299          | 30.3925 | 11.41   | 23.9357 | 25.4386   | 74.15   |
| 1.4121        | 8.0   | 720  | 5.0756          | 30.1282 | 10.1879 | 22.5263 | 24.3294   | 71.05   |
| 1.4121        | 9.0   | 810  | 5.2213          | 29.1958 | 11.9758 | 22.9344 | 25.3243   | 70.95   |
| 1.4121        | 10.0  | 900  | 5.0236          | 32.2902 | 12.9557 | 24.9154 | 26.9866   | 71.85   |
| 1.4121        | 11.0  | 990  | 5.2231          | 29.9105 | 11.4629 | 22.5421 | 24.7261   | 73.15   |
| 0.1808        | 12.0  | 1080 | 5.4899          | 30.6426 | 10.8586 | 23.0649 | 25.4052   | 69.35   |
| 0.1808        | 13.0  | 1170 | 5.5205          | 31.4239 | 12.4297 | 24.2742 | 25.8058   | 64.9    |
| 0.1808        | 14.0  | 1260 | 5.4710          | 31.3377 | 11.5225 | 23.4415 | 25.9487   | 68.3    |
| 0.1808        | 15.0  | 1350 | 5.3894          | 30.5681 | 11.3301 | 22.5992 | 25.0445   | 67.1    |
| 0.1808        | 16.0  | 1440 | 5.7293          | 30.7485 | 10.2947 | 23.2461 | 25.1156   | 67.8    |
| 0.0634        | 17.0  | 1530 | 5.8342          | 27.8846 | 9.4002  | 20.5223 | 22.8928   | 73.7    |
| 0.0634        | 18.0  | 1620 | 5.7280          | 31.3703 | 12.7091 | 24.947  | 27.6756   | 68.7    |
| 0.0634        | 19.0  | 1710 | 6.0204          | 29.311  | 10.8717 | 22.2206 | 23.6151   | 66.05   |
| 0.0634        | 20.0  | 1800 | 5.8662          | 30.3449 | 10.9645 | 22.7105 | 25.3131   | 75.6    |
| 0.0634        | 21.0  | 1890 | 6.0514          | 29.4108 | 10.9479 | 22.1319 | 23.8446   | 70.6    |
| 0.0634        | 22.0  | 1980 | 5.9087          | 30.1637 | 10.7748 | 21.7979 | 23.8345   | 71.6    |
| 0.0281        | 23.0  | 2070 | 6.1406          | 30.3179 | 11.0906 | 23.2057 | 24.9556   | 69.65   |
| 0.0281        | 24.0  | 2160 | 6.0541          | 29.7931 | 11.492  | 22.7251 | 24.4958   | 68.9    |
| 0.0281        | 25.0  | 2250 | 6.4349          | 29.6705 | 11.3079 | 22.1845 | 24.0782   | 68.2    |
| 0.0281        | 26.0  | 2340 | 6.2949          | 30.3573 | 9.7319  | 22.8766 | 25.5102   | 68.65   |
| 0.0281        | 27.0  | 2430 | 6.3606          | 30.2358 | 10.7457 | 22.9097 | 24.7486   | 69.8    |
| 0.0167        | 28.0  | 2520 | 6.2235          | 29.131  | 11.0196 | 23.0364 | 24.7254   | 69.0    |
| 0.0167        | 29.0  | 2610 | 6.2203          | 30.0767 | 10.4042 | 23.0845 | 24.5571   | 71.15   |
| 0.0167        | 30.0  | 2700 | 6.3899          | 29.524  | 11.0226 | 22.7426 | 24.7137   | 71.45   |
| 0.0167        | 31.0  | 2790 | 6.4216          | 29.9921 | 11.1592 | 22.7774 | 25.4653   | 70.35   |
| 0.0167        | 32.0  | 2880 | 6.4758          | 29.4138 | 10.1446 | 22.5501 | 24.4203   | 68.0    |
| 0.0167        | 33.0  | 2970 | 6.4529          | 30.7129 | 9.9512  | 23.3078 | 25.1444   | 70.1    |
| 0.0086        | 34.0  | 3060 | 6.3910          | 32.0673 | 11.8157 | 24.4371 | 26.4378   | 67.4    |
| 0.0086        | 35.0  | 3150 | 6.4725          | 31.0417 | 11.8642 | 23.9718 | 25.9358   | 65.5    |
| 0.0086        | 36.0  | 3240 | 6.5413          | 31.2471 | 11.9972 | 24.537  | 25.6679   | 66.6    |
| 0.0086        | 37.0  | 3330 | 6.6040          | 30.6614 | 11.4845 | 23.6335 | 26.3165   | 72.15   |
| 0.0086        | 38.0  | 3420 | 6.4808          | 30.1209 | 10.4855 | 22.7931 | 24.9675   | 74.75   |
| 0.0053        | 39.0  | 3510 | 6.4196          | 29.9709 | 11.1147 | 23.3882 | 25.1429   | 73.3    |
| 0.0053        | 40.0  | 3600 | 6.4798          | 32.6666 | 11.6476 | 24.0167 | 25.8167   | 67.7    |
| 0.0053        | 41.0  | 3690 | 6.4364          | 31.7081 | 11.4081 | 23.8924 | 25.3477   | 67.35   |
| 0.0053        | 42.0  | 3780 | 6.4463          | 31.371  | 11.3334 | 23.8642 | 25.5894   | 67.85   |
| 0.0053        | 43.0  | 3870 | 6.4507          | 29.6148 | 11.0601 | 22.5613 | 24.2758   | 70.95   |
| 0.0053        | 44.0  | 3960 | 6.5410          | 30.9704 | 10.054  | 22.8276 | 25.1106   | 66.25   |
| 0.0036        | 45.0  | 4050 | 6.4484          | 30.6993 | 10.2855 | 22.8241 | 25.1591   | 69.3    |
| 0.0036        | 46.0  | 4140 | 6.4579          | 29.6269 | 10.353  | 21.9677 | 23.4709   | 71.15   |
| 0.0036        | 47.0  | 4230 | 6.4931          | 29.8756 | 10.4957 | 23.039  | 24.2656   | 69.0    |
| 0.0036        | 48.0  | 4320 | 6.4831          | 29.6629 | 10.0869 | 22.8167 | 24.0125   | 70.35   |
| 0.0036        | 49.0  | 4410 | 6.4871          | 29.908  | 10.3116 | 22.9103 | 24.0365   | 71.9    |
| 0.0023        | 50.0  | 4500 | 6.4896          | 29.4886 | 10.2696 | 22.611  | 23.6936   | 70.1    |
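
The ROUGE columns above are reported as percentages (×100). A minimal sketch of how such scores can be reproduced with the `evaluate` library (its version is not stated in this card, and the `rouge_score` package must be installed); the example strings are hypothetical placeholders:

```python
import evaluate

rouge = evaluate.load("rouge")  # requires the rouge_score package

# Hypothetical inputs; a real run would use the model's generated summaries
# and the evaluation set's reference summaries.
predictions = ["the cat sat on the mat"]
references = ["a cat was sitting on the mat"]

scores = rouge.compute(predictions=predictions, references=references)
# Keys: rouge1, rouge2, rougeL, rougeLsum, on a 0-1 scale; multiply by 100
# to match the table above.
print({k: round(v * 100, 4) for k, v in scores.items()})
```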

### Framework versions

- Transformers 4.33.3
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3