Commonsense-augmented Memory Construction and Management in Long-term Conversations via Context-aware Persona Refinement

Hana Kim, Kai Ong, Seoyeon Kim, Dongha Lee, Jinyoung Yeo


Abstract
Memorizing and utilizing speakers’ personas is a common practice for response generation in long-term conversations. Yet, human-authored datasets often provide uninformative persona sentences that hinder response quality. This paper presents a novel framework that leverages commonsense-based persona expansion to address such issues in long-term conversations. While prior work focuses on avoiding the generation of personas that contradict others, we focus on transforming contradictory personas into sentences that contain rich speaker information, refining them against their contextual backgrounds with purpose-designed strategies. As the first approach to persona expansion in multi-session settings, our framework facilitates better response generation via human-like persona refinement. The supplementary video of our work is available at https://caffeine-15bbf.web.app/.
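The abstract summarizes the idea at a high level; the paper's actual prompts and refinement strategies are in the PDF and software package linked below. As a rough illustration only, here is a minimal sketch of the core step the abstract describes, turning a contradictory persona pair into one richer, context-grounded sentence. The function `refine_persona` and the generic `llm` callable are hypothetical names introduced for this sketch and are not the authors' code.

```python
from typing import Callable

def refine_persona(
    old_persona: str,
    new_persona: str,
    session_context: str,
    llm: Callable[[str], str],  # any text-in/text-out model; a stub below
) -> str:
    """Merge a contradictory persona pair into a single context-grounded
    persona sentence instead of discarding it (hypothetical sketch, not the
    paper's exact prompt or strategy set)."""
    prompt = (
        "Two persona sentences about the same speaker contradict each other.\n"
        f"Earlier persona: {old_persona}\n"
        f"Later persona: {new_persona}\n"
        f"Dialogue context: {session_context}\n"
        "Rewrite them as one persona sentence that resolves the contradiction "
        "using the context (e.g., a preference that changed over time) and "
        "keeps as much speaker information as possible."
    )
    return llm(prompt).strip()

if __name__ == "__main__":
    # Stub model so the sketch runs end-to-end without external services.
    stub = lambda _prompt: (
        "I used to dislike coffee, but I now drink it every morning "
        "since I started working night shifts."
    )
    print(refine_persona(
        "I hate coffee.",
        "I drink coffee every morning.",
        "The speaker mentioned starting night shifts last month.",
        stub,
    ))
```

The point of the sketch is the design choice the abstract highlights: rather than filtering out the contradicting sentence, the conflict plus its dialogue context is treated as a signal for producing a more informative persona.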
Anthology ID:
2024.eacl-short.11
Volume:
Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
104–123
URL:
https://aclanthology.org/2024.eacl-short.11
Cite (ACL):
Hana Kim, Kai Ong, Seoyeon Kim, Dongha Lee, and Jinyoung Yeo. 2024. Commonsense-augmented Memory Construction and Management in Long-term Conversations via Context-aware Persona Refinement. In Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers), pages 104–123, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
Commonsense-augmented Memory Construction and Management in Long-term Conversations via Context-aware Persona Refinement (Kim et al., EACL 2024)
PDF:
https://aclanthology.org/2024.eacl-short.11.pdf
Software:
2024.eacl-short.11.software.zip