Leveraging Prefix Transfer for Multi-Intent Text Revision

Ruining Chong, Cunliang Kong, Liu Wu, Zhenghao Liu, Ziye Jin, Liner Yang, Yange Fan, Hanghang Fan, Erhong Yang


Abstract
Text revision is a necessary process for improving text quality, during which writers repeatedly edit texts with different edit intentions. Identifying the edit intention behind a raw text is inherently ambiguous, and most previous work on revision systems focuses on editing texts according to a single, specific edit intention. In this work, we aim to build a multi-intent text revision system that can revise texts without explicit intent annotation. Our system is based on prefix-tuning: it first learns a prefix for each edit intention and then trains a prefix transfer module, enabling the system to selectively leverage knowledge from the various prefixes according to the input text. We conduct experiments on the IteraTeR dataset, and the results show that our system outperforms the baselines, improving the SARI score by more than 3% thanks to the learned edit-intention prefixes.
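
To make the idea concrete, below is a minimal PyTorch sketch of how a prefix transfer module might selectively blend per-intent prefixes conditioned on the input. This is an illustrative assumption based only on the abstract, not the authors' released code; all names (PrefixTransfer, num_intents, prefix_len, hidden_size) and the mean-pooled attention scheme are hypothetical.

```python
# Hypothetical sketch of prefix transfer: attention over learned
# per-intent prefixes, weighted by a pooled input representation.
import torch
import torch.nn as nn


class PrefixTransfer(nn.Module):
    """Mixes per-intent prefixes via attention over the input representation."""

    def __init__(self, num_intents: int, prefix_len: int, hidden_size: int):
        super().__init__()
        # One learned prefix per edit intent: (num_intents, prefix_len, hidden)
        self.intent_prefixes = nn.Parameter(
            torch.randn(num_intents, prefix_len, hidden_size) * 0.02
        )
        # Projects the pooled input into a query for scoring intent prefixes.
        self.query_proj = nn.Linear(hidden_size, hidden_size)

    def forward(self, input_states: torch.Tensor) -> torch.Tensor:
        # input_states: (batch, seq_len, hidden) from the underlying encoder.
        pooled = input_states.mean(dim=1)              # (batch, hidden)
        query = self.query_proj(pooled)                # (batch, hidden)
        # Summarize each intent prefix and score its relevance to the input.
        keys = self.intent_prefixes.mean(dim=1)        # (num_intents, hidden)
        scores = query @ keys.t()                      # (batch, num_intents)
        weights = scores.softmax(dim=-1)               # (batch, num_intents)
        # Weighted combination of intent prefixes, one mixed prefix per input.
        mixed = torch.einsum("bn,nlh->blh", weights, self.intent_prefixes)
        return mixed                                   # (batch, prefix_len, hidden)


# Example: blend 4 intent prefixes of length 10 for a batch of 2 inputs.
module = PrefixTransfer(num_intents=4, prefix_len=10, hidden_size=768)
states = torch.randn(2, 32, 768)
prefix = module(states)
print(prefix.shape)  # torch.Size([2, 10, 768])
```

In a full system, the mixed prefix would be prepended to the frozen language model's key/value states as in standard prefix-tuning; the sketch covers only the intent-selection step the abstract highlights.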
Anthology ID:
2023.acl-short.105
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1219–1228
URL:
https://aclanthology.org/2023.acl-short.105
DOI:
10.18653/v1/2023.acl-short.105
Bibkey:
Cite (ACL):
Ruining Chong, Cunliang Kong, Liu Wu, Zhenghao Liu, Ziye Jin, Liner Yang, Yange Fan, Hanghang Fan, and Erhong Yang. 2023. Leveraging Prefix Transfer for Multi-Intent Text Revision. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 1219–1228, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Leveraging Prefix Transfer for Multi-Intent Text Revision (Chong et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-short.105.pdf
Video:
https://aclanthology.org/2023.acl-short.105.mp4