
LifeScienceBART

This model is a fine-tuned version of facebook/bart-large-cnn on an unknown dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):

  • Loss: 4.4310
  • Rouge1: 52.3694
  • Rouge2: 17.5874
  • RougeL: 36.4217
  • RougeLsum: 48.765
  • Bertscore Precision: 82.295
  • Bertscore Recall: 83.951
  • Bertscore F1: 83.1121
  • Bleu: 0.1308
  • Gen Len: 227.8869
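
As a quick-start sketch (not part of the original card), the model can be loaded through the standard transformers summarization pipeline. The generation lengths below are illustrative choices, not values confirmed by the card:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hugging Face Hub.
summarizer = pipeline("summarization", model="MarPla/LifeScienceBART")

article = "..."  # a life-science article or abstract to summarize

# max_length/min_length are illustrative; evaluation generations
# averaged ~228 tokens (see Gen Len above).
summary = summarizer(article, max_length=256, min_length=64, do_sample=False)
print(summary[0]["summary_text"])
```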

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a Seq2SeqTrainingArguments sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • gradient_accumulation_steps: 16
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 500
  • num_epochs: 1
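
The list above maps onto transformers' Seq2SeqTrainingArguments roughly as follows. This is a reconstruction sketch: output_dir, the evaluation cadence, and predict_with_generate are assumptions, not settings stated in the card:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="LifeScienceBART",    # assumed name, not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=16,  # yields the total train batch size of 16
    seed=42,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=1,
    adam_beta1=0.9,                  # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",           # assumed: the results table logs every 100 steps
    eval_steps=100,
    predict_with_generate=True,      # assumed: needed for ROUGE/BLEU during eval
)
```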

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | RougeL | RougeLsum | Bertscore Precision | Bertscore Recall | Bertscore F1 | Bleu | Gen Len |
|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 6.1988 | 0.0881 | 100 | 6.0815 | 43.5317 | 12.8172 | 29.5886 | 40.6668 | 78.4798 | 81.4664 | 79.9395 | 0.0939 | 227.8869 |
| 5.7388 | 0.1762 | 200 | 5.6510 | 41.3899 | 12.8237 | 29.1108 | 38.0304 | 77.5037 | 81.7443 | 79.5601 | 0.0978 | 227.8869 |
| 5.3718 | 0.2643 | 300 | 5.2822 | 46.279 | 14.1045 | 31.7158 | 43.1347 | 79.8268 | 82.2875 | 81.0344 | 0.1041 | 227.8869 |
| 5.1682 | 0.3524 | 400 | 5.1072 | 48.1957 | 15.1732 | 32.7384 | 44.0672 | 80.3745 | 82.94 | 81.6328 | 0.1137 | 227.8869 |
| 5.1315 | 0.4405 | 500 | 4.9408 | 48.9502 | 15.6058 | 33.6297 | 45.5085 | 81.0706 | 83.1289 | 82.0835 | 0.1158 | 227.8869 |
| 4.9456 | 0.5286 | 600 | 4.7786 | 48.4843 | 15.8565 | 34.014 | 45.2987 | 80.9541 | 83.0806 | 81.9998 | 0.1151 | 227.8869 |
| 4.8396 | 0.6167 | 700 | 4.6607 | 51.3313 | 16.5503 | 35.0136 | 47.9755 | 82.0251 | 83.4743 | 82.7408 | 0.1210 | 227.8869 |
| 4.7481 | 0.7048 | 800 | 4.5922 | 51.9257 | 16.9939 | 35.583 | 48.1998 | 82.2219 | 83.8107 | 83.0061 | 0.1262 | 227.8869 |
| 4.6688 | 0.7929 | 900 | 4.5112 | 51.3896 | 17.1313 | 35.8696 | 47.7303 | 81.926 | 83.7943 | 82.8465 | 0.1277 | 227.8869 |
| 4.4321 | 0.8810 | 1000 | 4.4624 | 52.6168 | 17.6855 | 36.2987 | 49.0759 | 82.3644 | 83.8994 | 83.1222 | 0.1305 | 227.8869 |
| 4.5732 | 0.9691 | 1100 | 4.4310 | 52.3694 | 17.5874 | 36.4217 | 48.765 | 82.295 | 83.951 | 83.1121 | 0.1308 | 227.8869 |
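
For reference, metrics like those in the table can be computed with the evaluate library (a sketch with placeholder predictions and references; note that evaluate reports ROUGE and BLEU as fractions in [0, 1], while the ROUGE figures above are scaled to percentages):

```python
import evaluate

rouge = evaluate.load("rouge")
bertscore = evaluate.load("bertscore")
bleu = evaluate.load("bleu")

predictions = ["generated summary ..."]  # placeholder model outputs
references = ["reference summary ..."]   # placeholder gold summaries

print(rouge.compute(predictions=predictions, references=references))
print(bertscore.compute(predictions=predictions, references=references, lang="en"))
print(bleu.compute(predictions=predictions, references=references))
```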

Framework versions

  • Transformers 4.41.2
  • PyTorch 2.3.1+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1
