
t5-small-finetuned-logjuicer

This model (fedora-copr/t5-small-finetuned-logjuicer) is a fine-tuned version of google-t5/t5-small on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.8674
  • Rouge1: 18.6355
  • Rouge2: 12.317
  • RougeL: 18.4432
  • RougeLsum: 18.4298
  • Gen Len: 19.0

Model description

More information needed

Intended uses & limitations

More information needed
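No usage guidance is published for this checkpoint, so the following is only a minimal sketch of loading it as a seq2seq model with transformers. The input formatting (including whether a T5-style task prefix is expected), the example log line, and the generation settings are illustrative assumptions, not documented behaviour.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_id = "fedora-copr/t5-small-finetuned-logjuicer"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Hypothetical log excerpt; the card does not document the expected input
# format or whether a "summarize:" prefix was used during fine-tuning.
text = "summarize: ERROR: failed to start service foo: timeout after 30s"
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

# Gen Len is reported as 19.0 on the evaluation set, so short outputs are expected.
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```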

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 15
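As a rough illustration, these settings correspond to a Seq2SeqTrainingArguments configuration along the following lines; output_dir, predict_with_generate, and the per-epoch evaluation schedule are assumptions not stated in the card.

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-small-finetuned-logjuicer",  # assumed output directory
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    # The Adam betas/epsilon listed above appear to match the Trainer defaults,
    # so no explicit optimizer override is shown here.
    predict_with_generate=True,   # assumed, needed for ROUGE on generated text
    evaluation_strategy="epoch",  # assumed from the per-epoch results below
)
```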

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log        | 1.0   | 46   | 3.9203          | 7.8841  | 1.7456  | 7.5043  | 7.9956    | 19.0    |
| No log        | 2.0   | 92   | 3.3239          | 7.8589  | 1.7456  | 7.4779  | 7.9585    | 19.0    |
| No log        | 3.0   | 138  | 2.9555          | 7.7528  | 3.0412  | 7.7043  | 7.7184    | 19.0    |
| No log        | 4.0   | 184  | 2.7069          | 8.6355  | 3.8093  | 8.5839  | 8.6118    | 19.0    |
| No log        | 5.0   | 230  | 2.5137          | 8.6983  | 3.9184  | 8.5872  | 8.6149    | 19.0    |
| No log        | 6.0   | 276  | 2.3425          | 11.6364 | 4.355   | 11.1532 | 11.2938   | 19.0    |
| No log        | 7.0   | 322  | 2.2153          | 19.4997 | 5.9307  | 17.2555 | 18.1446   | 19.0    |
| No log        | 8.0   | 368  | 2.1213          | 19.3399 | 6.09    | 17.1197 | 18.0052   | 19.0    |
| No log        | 9.0   | 414  | 2.0448          | 18.6986 | 8.2548  | 17.2942 | 17.7272   | 19.0    |
| No log        | 10.0  | 460  | 1.9893          | 18.3085 | 11.359  | 17.9649 | 17.9785   | 19.0    |
| 2.9395        | 11.0  | 506  | 1.9429          | 18.1059 | 11.6297 | 17.8684 | 17.9266   | 19.0    |
| 2.9395        | 12.0  | 552  | 1.9091          | 18.3172 | 12.0275 | 18.1818 | 18.1678   | 19.0    |
| 2.9395        | 13.0  | 598  | 1.8859          | 18.5764 | 12.2624 | 18.3462 | 18.3067   | 19.0    |
| 2.9395        | 14.0  | 644  | 1.8717          | 18.626  | 12.3128 | 18.4411 | 18.4202   | 19.0    |
| 2.9395        | 15.0  | 690  | 1.8674          | 18.6355 | 12.317  | 18.4432 | 18.4298   | 19.0    |
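The ROUGE columns follow the naming used by the Hugging Face evaluate library; a minimal sketch of computing the same metrics on made-up strings (not data from this model) might look like this. The scores in the table above appear to be on a 0–100 scale, i.e. the library's 0–1 values multiplied by 100.

```python
import evaluate

# Toy example only; predictions/references here are invented strings.
rouge = evaluate.load("rouge")
scores = rouge.compute(
    predictions=["service foo failed to start after a 30s timeout"],
    references=["foo failed to start: timeout after 30s"],
)
print(scores)  # keys: rouge1, rouge2, rougeL, rougeLsum (values in 0–1)
```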

Framework versions

  • Transformers 4.37.2
  • Pytorch 2.1.0+cu121
  • Datasets 2.17.1
  • Tokenizers 0.15.2
