---
language: en
tags:
  - sagemaker
  - bart
  - summarization
license: apache-2.0
datasets:
  - samsum
widget:
  - text: >-
      Jeff: Can I train a 🤗 Transformers model on Amazon SageMaker? 

      Philipp: Sure you can use the new Hugging Face Deep Learning Container. 

      Jeff: ok.

      Jeff: and how can I get started? 

      Jeff: where can I find documentation? 

      Philipp: ok, ok you can find everything here.
      https://huggingface.co./blog/the-partnership-amazon-sagemaker-and-hugging-face 
model-index:
  - name: philschmid/distilbart-cnn-12-6-samsum
    results:
      - task:
          type: summarization
          name: Summarization
        dataset:
          name: samsum
          type: samsum
          config: samsum
          split: test
        metrics:
          - name: ROUGE-1
            type: rouge
            value: 41.0895
            verified: true
          - name: ROUGE-2
            type: rouge
            value: 20.7459
            verified: true
          - name: ROUGE-L
            type: rouge
            value: 31.5952
            verified: true
          - name: ROUGE-LSUM
            type: rouge
            value: 38.3389
            verified: true
          - name: loss
            type: loss
            value: 1.4566329717636108
            verified: true
          - name: gen_len
            type: gen_len
            value: 59.6032
            verified: true
      - task:
          type: summarization
          name: Summarization
        dataset:
          name: xsum
          type: xsum
          config: default
          split: test
        metrics:
          - name: ROUGE-1
            type: rouge
            value: 21.1644
            verified: true
          - name: ROUGE-2
            type: rouge
            value: 4.0659
            verified: true
          - name: ROUGE-L
            type: rouge
            value: 13.9414
            verified: true
          - name: ROUGE-LSUM
            type: rouge
            value: 17.0718
            verified: true
          - name: loss
            type: loss
            value: 3.002755880355835
            verified: true
          - name: gen_len
            type: gen_len
            value: 71.2969
            verified: true
      - task:
          type: summarization
          name: Summarization
        dataset:
          name: cnn_dailymail
          type: cnn_dailymail
          config: 3.0.0
          split: test
        metrics:
          - name: ROUGE-1
            type: rouge
            value: 42.9764
            verified: true
          - name: ROUGE-2
            type: rouge
            value: 19.8711
            verified: true
          - name: ROUGE-L
            type: rouge
            value: 29.5196
            verified: true
          - name: ROUGE-LSUM
            type: rouge
            value: 39.959
            verified: true
          - name: loss
            type: loss
            value: 3.014679193496704
            verified: true
          - name: gen_len
            type: gen_len
            value: 81.956
            verified: true
---

# distilbart-cnn-12-6-samsum

This model was trained using Amazon SageMaker and the new Hugging Face Deep Learning Container.

For more information, see the [Hugging Face and Amazon SageMaker partnership blog post](https://huggingface.co./blog/the-partnership-amazon-sagemaker-and-hugging-face).
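For orientation, here is a minimal sketch of how such a job can be launched with the SageMaker Python SDK. The entry point script, script location, instance type, IAM role, and framework versions are illustrative assumptions, not the exact configuration used to train this model:

```python
# Sketch: launching a fine-tuning job with the SageMaker Python SDK.
# entry_point, source_dir, instance_type, role, and the framework versions
# are assumptions for illustration only.
from sagemaker.huggingface import HuggingFace

huggingface_estimator = HuggingFace(
    entry_point="run_summarization.py",      # assumed training script
    source_dir="./scripts",                  # assumed script location
    instance_type="ml.p3.2xlarge",           # assumed GPU instance
    instance_count=1,
    role="<your-sagemaker-execution-role>",  # placeholder IAM role
    transformers_version="4.6",              # assumed framework versions
    pytorch_version="1.7",
    py_version="py36",
    hyperparameters={
        "model_name_or_path": "sshleifer/distilbart-cnn-12-6",
        "dataset_name": "samsum",
        # full set of values in the Hyperparameters section below
    },
)

huggingface_estimator.fit()
```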

## Hyperparameters

```json
{
    "dataset_name": "samsum",
    "do_eval": true,
    "do_train": true,
    "fp16": true,
    "learning_rate": 5e-05,
    "model_name_or_path": "sshleifer/distilbart-cnn-12-6",
    "num_train_epochs": 3,
    "output_dir": "/opt/ml/model",
    "per_device_eval_batch_size": 8,
    "per_device_train_batch_size": 8,
    "seed": 7
}
```
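If you want to reproduce the run outside of SageMaker, these values map directly onto the Trainer API. A minimal sketch, assuming the standard 🤗 Transformers summarization example script (the exact script used is not included in this card):

```python
# Sketch: the hyperparameters above expressed as Seq2SeqTrainingArguments.
# The mapping is an assumption; only the values themselves come from the card.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="/opt/ml/model",        # SageMaker's default model output path
    do_train=True,
    do_eval=True,
    fp16=True,                         # mixed-precision training
    learning_rate=5e-5,
    num_train_epochs=3,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=7,
)
```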

## Train results

| key | value |
| --- | --- |
| epoch | 3.0 |
| init_mem_cpu_alloc_delta | 180338 |
| init_mem_cpu_peaked_delta | 18282 |
| init_mem_gpu_alloc_delta | 1222242816 |
| init_mem_gpu_peaked_delta | 0 |
| train_mem_cpu_alloc_delta | 6971403 |
| train_mem_cpu_peaked_delta | 640733 |
| train_mem_gpu_alloc_delta | 4910897664 |
| train_mem_gpu_peaked_delta | 23331969536 |
| train_runtime | 155.2034 |
| train_samples | 14732 |
| train_samples_per_second | 2.242 |

## Eval results

| key | value |
| --- | --- |
| epoch | 3.0 |
| eval_loss | 1.4209576845169067 |
| eval_mem_cpu_alloc_delta | 868003 |
| eval_mem_cpu_peaked_delta | 18250 |
| eval_mem_gpu_alloc_delta | 0 |
| eval_mem_gpu_peaked_delta | 328244736 |
| eval_runtime | 0.6088 |
| eval_samples | 818 |
| eval_samples_per_second | 1343.647 |
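The ROUGE numbers in the model index at the top of this card were produced by Hugging Face's automatic evaluation (hence `verified: true`). A minimal sketch of reproducing an approximate score on the samsum test split, assuming the 🤗 `evaluate` and `datasets` libraries and default generation settings:

```python
# Sketch: scoring the model on the samsum test split with ROUGE.
# Library choice and generation defaults are assumptions; exact numbers
# may differ slightly from the verified results above.
import evaluate
from datasets import load_dataset
from transformers import pipeline

summarizer = pipeline("summarization", model="philschmid/distilbart-cnn-12-6-samsum")
test = load_dataset("samsum", split="test")

predictions = [out["summary_text"] for out in summarizer(test["dialogue"])]
rouge = evaluate.load("rouge")
scores = rouge.compute(predictions=predictions, references=test["summary"])
print(scores)  # fractions in [0, 1]; multiply by 100 to compare with the model-index values
```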

## Usage

```python
from transformers import pipeline

summarizer = pipeline("summarization", model="philschmid/distilbart-cnn-12-6-samsum")

conversation = '''Jeff: Can I train a 🤗 Transformers model on Amazon SageMaker?
Philipp: Sure you can use the new Hugging Face Deep Learning Container.
Jeff: ok.
Jeff: and how can I get started?
Jeff: where can I find documentation?
Philipp: ok, ok you can find everything here. https://huggingface.co./blog/the-partnership-amazon-sagemaker-and-hugging-face
'''
summarizer(conversation)
```
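The pipeline returns a list with one dictionary per input, with the generated text under the `summary_text` key. A short usage note (the length arguments below are standard generation parameters, not values tuned for this model):

```python
# "summary_text" is the standard key in the summarization pipeline's output.
# max_length / min_length are generic generation arguments; the values here
# are illustrative, not model-specific recommendations.
result = summarizer(conversation, max_length=60, min_length=10)
print(result[0]["summary_text"])
```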