Benchmarking the Generation of Fact Checking Explanations

Daniel Russo, Serra Sinem Tekiroğlu, Marco Guerini


Abstract
Fighting misinformation is a challenging, yet crucial, task. Despite the growing number of experts involved in manual fact-checking, this activity is time-consuming and cannot keep up with the ever-increasing amount of fake news produced daily. Hence, automating this process is necessary to help curb misinformation. Thus far, researchers have mainly focused on claim veracity classification. In this paper, instead, we address the generation of justifications (textual explanations of why a claim is classified as either true or false) and benchmark it with novel datasets and advanced baselines. In particular, we focus on summarization approaches over unstructured knowledge (i.e., news articles) and we experiment with several extractive and abstractive strategies. We employed two datasets with different styles and structures, in order to assess the generalizability of our findings. Results show that, in justification production, summarization benefits from the claim information and, in particular, that a claim-driven extractive step improves abstractive summarization performance. Finally, we show that although cross-dataset experiments suffer from performance degradation, a single model trained on a combination of the two datasets is able to retain style information in an efficient manner.
Anthology ID:
2023.tacl-1.71
Volume:
Transactions of the Association for Computational Linguistics, Volume 11
Year:
2023
Address:
Cambridge, MA
Venue:
TACL
Publisher:
MIT Press
Pages:
1250–1264
URL:
https://aclanthology.org/2023.tacl-1.71
DOI:
10.1162/tacl_a_00601
Cite (ACL):
Daniel Russo, Serra Sinem Tekiroğlu, and Marco Guerini. 2023. Benchmarking the Generation of Fact Checking Explanations. Transactions of the Association for Computational Linguistics, 11:1250–1264.
Cite (Informal):
Benchmarking the Generation of Fact Checking Explanations (Russo et al., TACL 2023)
PDF:
https://aclanthology.org/2023.tacl-1.71.pdf
Video:
https://aclanthology.org/2023.tacl-1.71.mp4