Knowledge Graph Compression Enhances Diverse Commonsense Generation

EunJeong Hwang, Veronika Thost, Vered Shwartz, Tengfei Ma


Abstract
Generating commonsense explanations requires reasoning about commonsense knowledge beyond what is explicitly mentioned in the context. Existing models use commonsense knowledge graphs such as ConceptNet to extract a subgraph of relevant knowledge pertaining to concepts in the input. However, due to the large coverage and, consequently, vast scale of ConceptNet, the extracted subgraphs may contain loosely related, redundant, and irrelevant information, which can introduce noise into the model. We propose to address this by applying a differentiable graph compression algorithm that focuses on the relevant knowledge for the task. The compressed subgraphs yield considerably more diverse outputs when incorporated into models for the tasks of generating commonsense and abductive explanations. Moreover, our model achieves a better quality-diversity tradeoff than a large language model with 100 times the number of parameters. Our generic approach can be applied to additional NLP tasks that can benefit from incorporating external knowledge.
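
The abstract describes the pipeline at a high level: extract a ConceptNet subgraph for the concepts in the input, then learn to compress it so that only task-relevant nodes inform generation. Below is a minimal, self-contained sketch of one way such differentiable compression could look, assuming node embeddings for the subgraph and a pooled context embedding. It is not the authors' implementation; all names here (SubgraphCompressor, score_mlp, keep_ratio) are illustrative.

```python
# Hedged sketch (NOT the paper's code): score each node of an extracted
# ConceptNet subgraph against the input context and keep a soft top-k,
# so the selection stays differentiable and trainable end to end.
import torch
import torch.nn as nn


class SubgraphCompressor(nn.Module):
    def __init__(self, dim: int, keep_ratio: float = 0.25, tau: float = 1.0):
        super().__init__()
        self.keep_ratio = keep_ratio  # fraction of nodes to retain
        self.tau = tau                # temperature of the soft gate
        # Scores each node jointly with the pooled context representation.
        self.score_mlp = nn.Sequential(
            nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 1)
        )

    def forward(self, node_emb: torch.Tensor, context_emb: torch.Tensor):
        # node_emb: (num_nodes, dim); context_emb: (dim,)
        num_nodes = node_emb.size(0)
        k = max(1, int(self.keep_ratio * num_nodes))
        ctx = context_emb.expand(num_nodes, -1)
        scores = self.score_mlp(torch.cat([node_emb, ctx], dim=-1)).squeeze(-1)
        # Sigmoid gate centered at the k-th largest score: a smooth stand-in
        # for a hard top-k cut-off, so gradients flow back to the scorer.
        threshold = scores.topk(k).values[-1].detach()
        mask = torch.sigmoid((scores - threshold) / self.tau)
        return node_emb * mask.unsqueeze(-1), mask


# Toy usage: a 50-node subgraph compressed against a sentence encoding.
compressor = SubgraphCompressor(dim=128)
nodes = torch.randn(50, 128)
context = torch.randn(128)
compressed_nodes, node_mask = compressor(nodes, context)
```

The soft gate is one common relaxation of discrete node selection; the paper's actual algorithm may differ in how relevance is scored and how the compressed graph is fed to the generator.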
Anthology ID: 2023.emnlp-main.37
Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 558–572
URL: https://aclanthology.org/2023.emnlp-main.37
DOI: 10.18653/v1/2023.emnlp-main.37
Cite (ACL): EunJeong Hwang, Veronika Thost, Vered Shwartz, and Tengfei Ma. 2023. Knowledge Graph Compression Enhances Diverse Commonsense Generation. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 558–572, Singapore. Association for Computational Linguistics.
Cite (Informal): Knowledge Graph Compression Enhances Diverse Commonsense Generation (Hwang et al., EMNLP 2023)
PDF: https://aclanthology.org/2023.emnlp-main.37.pdf
Video: https://aclanthology.org/2023.emnlp-main.37.mp4