# SocialScienceBARTPrincipal
This model is a fine-tuned version of [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn) on an unknown dataset. It achieves the following results on the evaluation set (a hedged usage sketch follows the metric list):
- Loss: 4.8587
- Rouge1: 48.4993
- Rouge2: 14.8435
- Rougel: 33.0264
- Rougelsum: 44.9256
- Bertscore Precision: 80.3517
- Bertscore Recall: 82.7128
- Bertscore F1: 81.5112
- Bleu: 0.1092
- Gen Len: 195.1640
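
A minimal inference sketch with the Transformers `pipeline` API is shown below. The Hub identifier `MarPla/SocialScienceBARTPrincipal` is taken from this repository's name, and the generation lengths are rough guesses informed by the reported Gen Len; treat this as a starting point rather than the evaluation setup used for the numbers above.

```python
# Hedged usage sketch: model id and generation lengths are assumptions, not the
# evaluation configuration behind the reported metrics.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="MarPla/SocialScienceBARTPrincipal",
)

article = "..."  # placeholder: a social-science abstract or article section to summarize

# max_length loosely mirrors the ~195-token generations reported above; tune for your use case.
summary = summarizer(article, max_length=200, min_length=50, do_sample=False)
print(summary[0]["summary_text"])
```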
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a Trainer sketch mirroring them follows the list):
- learning_rate: 5e-05
- train_batch_size: 1
- eval_batch_size: 1
- seed: 42
- gradient_accumulation_steps: 16
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 1
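
As a rough guide, the hyperparameters above map onto `Seq2SeqTrainingArguments` as in the sketch below. The `output_dir` and the commented-out dataset variables are placeholders; the actual training script, data, and preprocessing are not published in this card.

```python
# Hedged training sketch: only the hyperparameter values listed above come from this card.
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_name = "facebook/bart-large-cnn"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

training_args = Seq2SeqTrainingArguments(
    output_dir="SocialScienceBARTPrincipal",  # placeholder output directory
    learning_rate=5e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    gradient_accumulation_steps=16,           # effective train batch size of 16
    num_train_epochs=1,
    lr_scheduler_type="linear",
    warmup_steps=500,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    predict_with_generate=True,
)

# trainer = Seq2SeqTrainer(
#     model=model,
#     args=training_args,
#     train_dataset=tokenized_train,  # placeholder: your tokenized training split
#     eval_dataset=tokenized_eval,    # placeholder: your tokenized evaluation split
#     tokenizer=tokenizer,
# )
# trainer.train()
```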
### Training results
Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Bertscore Precision | Bertscore Recall | Bertscore F1 | Bleu | Gen Len |
---|---|---|---|---|---|---|---|---|---|---|---|---|
6.5089 | 0.1314 | 100 | 6.2390 | 39.4898 | 11.0769 | 27.6002 | 36.497 | 75.7798 | 80.6901 | 78.1466 | 0.0800 | 195.1640 |
5.9338 | 0.2628 | 200 | 5.7540 | 41.6352 | 11.9524 | 29.0458 | 38.5778 | 77.0272 | 81.1993 | 79.0507 | 0.0882 | 195.1640 |
5.6077 | 0.3943 | 300 | 5.4443 | 41.5238 | 12.2762 | 29.4389 | 38.8683 | 77.5496 | 81.3713 | 79.4075 | 0.0894 | 195.1640 |
5.3997 | 0.5257 | 400 | 5.2541 | 44.1846 | 13.1247 | 30.5659 | 41.1211 | 78.8697 | 81.8978 | 80.3498 | 0.0962 | 195.1640 |
5.1614 | 0.6571 | 500 | 5.1269 | 44.5045 | 13.3887 | 31.1505 | 41.1205 | 78.727 | 82.0655 | 80.3557 | 0.0994 | 195.1640 |
5.0558 | 0.7885 | 600 | 4.9610 | 46.7823 | 14.4367 | 32.4159 | 43.2551 | 79.6807 | 82.5047 | 81.0632 | 0.1059 | 195.1640 |
4.9749 | 0.9199 | 700 | 4.8587 | 48.4993 | 14.8435 | 33.0264 | 44.9256 | 80.3517 | 82.7128 | 81.5112 | 0.1092 | 195.1640 |
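
The metrics reported above (ROUGE, BERTScore, BLEU) could be recomputed along the lines of the following sketch using the `evaluate` library; the predictions and references are placeholders, and the exact evaluation script behind these numbers is not included in this card. Note that the ROUGE and BERTScore figures above appear to be on a 0-100 scale, whereas `evaluate` returns values in [0, 1].

```python
# Illustrative metric computation, not the original evaluation procedure.
import evaluate

predictions = ["model-generated summary ..."]  # placeholder generated summaries
references = ["reference summary ..."]         # placeholder gold summaries

rouge = evaluate.load("rouge")
bertscore = evaluate.load("bertscore")
bleu = evaluate.load("bleu")

print(rouge.compute(predictions=predictions, references=references))
print(bertscore.compute(predictions=predictions, references=references, lang="en"))
print(bleu.compute(predictions=predictions, references=references))
```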
### Framework versions
- Transformers 4.41.2
- Pytorch 2.3.1+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1