Generating Extractive and Abstractive Summaries in Parallel from Scientific Articles Incorporating Citing Statements

Sudipta Singha Roy, Robert E. Mercer


Abstract
Summarization of scientific articles often overlooks insights from citing papers, focusing solely on the document’s content. To incorporate citation contexts, we develop a model to summarize a scientific document using the information in the source and citing documents. It concurrently generates abstractive and extractive summaries, each enhancing the other. The extractive summarizer utilizes a blend of heterogeneous graph-based neural networks and graph attention networks, while the abstractive summarizer employs an autoregressive decoder. These modules exchange control signals through the loss function, ensuring the creation of high-quality summaries in both styles.
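To make the coupling described in the abstract concrete, below is a minimal PyTorch sketch (not the authors' implementation) of two summarization heads trained jointly through a shared, weighted loss. The module names, dimensions, and the plain GRU stand-ins for the heterogeneous-graph/GAT sentence encoder and the autoregressive decoder are illustrative assumptions; only the idea of an extractive head and an abstractive head exchanging signals via the loss function follows the paper's description.

```python
# Illustrative sketch only, not the paper's architecture: a GRU stands in for
# the heterogeneous-graph / GAT sentence encoder and for the autoregressive
# decoder. All names and hyperparameters here are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ParallelSummarizer(nn.Module):
    def __init__(self, hidden=256, vocab_size=32000):
        super().__init__()
        self.sent_encoder = nn.GRU(hidden, hidden, batch_first=True)  # encoder stand-in
        self.ext_scorer = nn.Linear(hidden, 1)                        # extractive head
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)       # decoder stand-in
        self.generator = nn.Linear(hidden, vocab_size)                # abstractive head

    def forward(self, sent_embs, dec_inputs):
        enc_out, _ = self.sent_encoder(sent_embs)          # (B, S, H) sentence states
        ext_logits = self.ext_scorer(enc_out).squeeze(-1)  # (B, S) selection scores
        h0 = enc_out[:, -1].unsqueeze(0)                   # (1, B, H) decoder init
        dec_out, _ = self.decoder(dec_inputs, h0)
        abs_logits = self.generator(dec_out)               # (B, T, V) token logits
        return ext_logits, abs_logits

def joint_loss(ext_logits, ext_labels, abs_logits, abs_targets, alpha=0.5):
    # Extractive loss: binary label per sentence (in/out of the summary).
    # Abstractive loss: token-level cross-entropy (pad id 0 assumed).
    # The weighted sum is one simple way the two modules can "exchange
    # control signals" during training.
    l_ext = F.binary_cross_entropy_with_logits(ext_logits, ext_labels.float())
    l_abs = F.cross_entropy(abs_logits.reshape(-1, abs_logits.size(-1)),
                            abs_targets.reshape(-1), ignore_index=0)
    return alpha * l_ext + (1 - alpha) * l_abs
```

In this sketch, gradients from both summary styles flow into the shared encoder, so improvements on one objective can inform the other; the actual model additionally incorporates citation contexts from citing documents, which is omitted here.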
Anthology ID:
2023.newsum-1.8
Volume:
Proceedings of the 4th New Frontiers in Summarization Workshop
Month:
December
Year:
2023
Address:
Singapore
Editors:
Yue Dong, Wen Xiao, Lu Wang, Fei Liu, Giuseppe Carenini
Venue:
NewSum
Publisher:
Association for Computational Linguistics
Pages:
75–86
URL:
https://aclanthology.org/2023.newsum-1.8
DOI:
10.18653/v1/2023.newsum-1.8
Cite (ACL):
Sudipta Singha Roy and Robert E. Mercer. 2023. Generating Extractive and Abstractive Summaries in Parallel from Scientific Articles Incorporating Citing Statements. In Proceedings of the 4th New Frontiers in Summarization Workshop, pages 75–86, Singapore. Association for Computational Linguistics.
Cite (Informal):
Generating Extractive and Abstractive Summaries in Parallel from Scientific Articles Incorporating Citing Statements (Singha Roy & Mercer, NewSum 2023)
PDF:
https://aclanthology.org/2023.newsum-1.8.pdf
Supplementary material:
2023.newsum-1.8.SupplementaryMaterial.zip
Supplementary material:
2023.newsum-1.8.SupplementaryMaterial.txt