
mt5-small-sum-de-en-v1-finetuned-amazon-en-de

This model is a fine-tuned version of deutsche-telekom/mt5-small-sum-de-en-v1 on an unknown dataset. It achieves the following results on the evaluation set (a ROUGE scoring sketch follows the list):

  • Loss: 2.5661
  • ROUGE-1: 20.9307
  • ROUGE-2: 12.3388
  • ROUGE-L: 20.4694
  • ROUGE-Lsum: 20.6594
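
The ROUGE values above are F-measures scaled to percentages, as produced by the Trainer's evaluation loop. Below is a minimal scoring sketch using the evaluate library; the predictions and references are placeholders, since the evaluation data is not documented in this card.

    # A hedged sketch of ROUGE scoring with the `evaluate` library; the texts below are
    # placeholders because the evaluation dataset behind the numbers above is not documented.
    import evaluate

    rouge = evaluate.load("rouge")

    predictions = ["great sound and long battery life"]
    references = ["great sound quality and battery life, but the cushions wear out fast"]

    # compute() returns rouge1, rouge2, rougeL and rougeLsum as F-measures in [0, 1];
    # multiply by 100 to compare with the percentages reported above.
    scores = rouge.compute(predictions=predictions, references=references)
    print({name: round(value * 100, 4) for name, value in scores.items()})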

Model description

More information needed

Intended uses & limitations

More information needed
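
The base model is a German/English summarization model, so this checkpoint can be used for abstractive summarization in the same way. A minimal usage sketch, assuming the checkpoint is available on the Hugging Face Hub under the repository id shown on this page; the review text and generation limits are illustrative.

    # A minimal usage sketch; the repository id is taken from this page, and the
    # example review and generation settings are illustrative assumptions.
    from transformers import pipeline

    model_id = "yujiro666/mt5-small-sum-de-en-v1-finetuned-amazon-en-de"
    summarizer = pipeline("summarization", model=model_id)

    review = (
        "I bought these headphones last month. The sound quality is excellent and the "
        "battery easily lasts a full day, but the ear cushions started to wear out quickly."
    )

    # max_length and min_length are token limits for the generated summary.
    print(summarizer(review, max_length=64, min_length=8)[0]["summary_text"])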

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a training-arguments sketch follows the list):

  • learning_rate: 5.6e-05
  • train_batch_size: 10
  • eval_batch_size: 10
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
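
A sketch of Seq2SeqTrainingArguments mirroring the values above is given below; the output directory, evaluation cadence, and predict_with_generate setting are assumptions, while the Adam betas and epsilon listed above are the transformers defaults and are therefore not set explicitly.

    # A hedged sketch of training arguments that mirror the hyperparameters listed above.
    # output_dir, evaluation_strategy and predict_with_generate are assumptions; only the
    # values that appear in the list above are taken from this card.
    from transformers import Seq2SeqTrainingArguments

    training_args = Seq2SeqTrainingArguments(
        output_dir="mt5-small-sum-de-en-v1-finetuned-amazon-en-de",  # assumed directory name
        learning_rate=5.6e-5,
        per_device_train_batch_size=10,
        per_device_eval_batch_size=10,
        seed=42,
        lr_scheduler_type="linear",
        num_train_epochs=10,
        evaluation_strategy="epoch",   # assumption: the results table shows one row per epoch
        predict_with_generate=True,    # assumption: ROUGE is computed on generated summaries
    )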

Training results

Training Loss | Epoch | Step  | Validation Loss | ROUGE-1 | ROUGE-2 | ROUGE-L | ROUGE-Lsum
2.3857        | 1.0   | 1041  | 2.5918          | 19.0955 | 10.5762 | 18.6534 | 18.7070
2.3597        | 2.0   | 2082  | 2.5887          | 20.2078 | 11.1543 | 19.5663 | 19.7039
2.3623        | 3.0   | 3123  | 2.5802          | 20.1696 | 12.0499 | 19.7038 | 19.8900
2.3815        | 4.0   | 4164  | 2.5498          | 19.9131 | 11.5376 | 19.4158 | 19.5746
2.3735        | 5.0   | 5205  | 2.5559          | 20.5713 | 11.8808 | 19.9335 | 20.1211
2.3269        | 6.0   | 6246  | 2.5574          | 19.8362 | 11.0330 | 19.3193 | 19.5623
2.2956        | 7.0   | 7287  | 2.5479          | 19.8859 | 11.5389 | 19.4015 | 19.7004
2.2646        | 8.0   | 8328  | 2.5669          | 20.4666 | 12.2804 | 20.0291 | 20.1897
2.2618        | 9.0   | 9369  | 2.5703          | 20.9783 | 12.3152 | 20.4450 | 20.6735
2.2360        | 10.0  | 10410 | 2.5661          | 20.9307 | 12.3388 | 20.4694 | 20.6594

Framework versions

  • Transformers 4.34.1
  • Pytorch 2.1.0+cu118
  • Datasets 2.14.6
  • Tokenizers 0.14.1