---
language:
- en
license: apache-2.0
tags:
- dialogue-summarization
datasets:
- icsi
model_index:
- name: hybrid_hbh_bart-base_icsi_sum
  results:
  - task:
      name: Summarization
      type: summarization
base_model: facebook/bart-base
---

## Paper

## [Domain Adapted Abstractive Summarization of Dialogue using Transfer Learning](https://dl.acm.org/doi/10.1145/3508546.3508640)
Authors: *Rohit Sroch*

## Abstract

Recently, the abstractive dialogue summarization task has been gaining a lot of attention from researchers. Also, unlike news articles and documents with well-structured text, dialogue differs in the sense that it often comes from two or more interlocutors, exchanging information with each other and having an inherent hierarchical structure based on the sequence of utterances by different speakers. This paper proposes a simple but effective hybrid approach that consists of two modules and uses transfer learning by leveraging pretrained language models (PLMs) to generate an abstractive summary. The first module highlights important utterances, capturing the utterance level relationship by adapting an auto-encoding model like BERT based on the unsupervised or supervised method. And then, the second module generates a concise abstractive summary by adapting encoder-decoder models like T5, BART, and PEGASUS. Experiment results on benchmark datasets show that our approach achieves a state-of-the-art performance by adapting to dialogue scenarios and can also be helpful in low-resource settings for domain adaptation.

*Rohit Sroch. 2021. Domain Adapted Abstractive Summarization of Dialogue using Transfer Learning. In 2021 4th International Conference on Algorithms, Computing and Artificial Intelligence (ACAI'21). Association for Computing Machinery, New York, NY, USA, Article 94, 1–6. https://doi.org/10.1145/3508546.3508640*
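
For orientation, the two-module flow described in the abstract can be sketched as follows. This is only an illustrative sketch, not the paper's implementation: the utterance-highlighting module is replaced here by a trivial length heuristic (the paper adapts a BERT-style model for that step), and `summarize_fn` stands in for any encoder-decoder summarizer such as this BART model.

```python
from typing import Callable, List

def highlight_utterances(utterances: List[str], top_k: int = 20) -> List[str]:
    # Stand-in for module 1 (the paper adapts a BERT-style model here):
    # keep the top_k longest utterances, preserving their original order.
    keep = set(
        sorted(range(len(utterances)), key=lambda i: len(utterances[i]), reverse=True)[:top_k]
    )
    return [u for i, u in enumerate(utterances) if i in keep]

def hybrid_summarize(utterances: List[str], summarize_fn: Callable[[str], str]) -> str:
    # Module 2: an encoder-decoder PLM (e.g., this fine-tuned BART model)
    # condenses the highlighted utterances into an abstractive summary.
    return summarize_fn(" ".join(highlight_utterances(utterances)))
```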

# hybrid_hbh_bart-base_icsi_sum

This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co./facebook/bart-base) on the ICSI dataset for the dialogue summarization task.
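
A minimal inference sketch with the Transformers `pipeline` API is shown below; the repo id and the example dialogue are placeholders, so adjust them to the actual Hub location of this model.

```python
from transformers import pipeline

# The repo id below is a placeholder; point it at the actual Hub id of this model.
summarizer = pipeline("summarization", model="rohitsroch/hybrid_hbh_bart-base_icsi_sum")

dialogue = (
    "PM: Let's review the agenda for the remote control project.\n"
    "ID: The user tests suggest the scroll wheel is the preferred input.\n"
    "ME: We still need to settle on the casing material."
)
result = summarizer(dialogue, max_length=512, truncation=True)
print(result[0]["summary_text"])
```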

## Model description

More information needed

### Training hyperparameters

The following hyperparameters were used during training (see the illustrative `Seq2SeqTrainingArguments` sketch after this list):
- learning_rate: 3e-4
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 100.0
- label_smoothing_factor: 0.1
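
The list above maps roughly onto the following `Seq2SeqTrainingArguments`; this is an illustrative sketch rather than the exact training script, and the output directory is a placeholder.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the hyperparameters listed above; output_dir is a placeholder.
training_args = Seq2SeqTrainingArguments(
    output_dir="./hybrid_hbh_bart-base_icsi_sum",
    learning_rate=3e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=100.0,
    label_smoothing_factor=0.1,
    predict_with_generate=True,
)
```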

### Results on Test Set

- predict_gen_len = 480.0
- predict_rouge1 = **46.8707**
- predict_rouge2 = **10.1337**
- predict_rougeL = **19.3386**
- predict_rougeLsum = **43.6989**
- predict_samples = 6
- predict_samples_per_second = 0.54
- predict_steps_per_second = 0.27

### Framework versions

- Transformers>=4.8.0
- PyTorch>=1.6.0
- Datasets>=1.10.2
- Tokenizers>=0.10.3

If you use this model, please cite the following paper:

```
@inproceedings{10.1145/3508546.3508640,
    author = {Sroch, Rohit},
    title = {Domain Adapted Abstractive Summarization of Dialogue Using Transfer Learning},
    year = {2021},
    isbn = {9781450385053},
    publisher = {Association for Computing Machinery},
    address = {New York, NY, USA},
    url = {https://doi.org/10.1145/3508546.3508640},
    doi = {10.1145/3508546.3508640},
    articleno = {94},
    numpages = {6},
    keywords = {encoder-decoder, T5, abstractive summary, PEGASUS, BART, dialogue summarization, PLMs, BERT},
    location = {Sanya, China},
    series = {ACAI'21}
} 
```