Toward the Modular Training of Controlled Paraphrase Adapters

Teemu Vahtola, Mathias Creutz


Abstract
Controlled paraphrase generation often focuses on a single aspect of paraphrasing, for instance syntactically controlled paraphrase generation. However, such models lack modularity: adapting them to another aspect, such as lexical variation, requires retraining the full model each time. To make the training of controlled paraphrase models more flexible, we propose incrementally training a modularized system for controlled paraphrase generation in English. We first fine-tune a pretrained language model on the broad task of paraphrase generation, emphasizing meaning preservation and surface-form variation in general. We then train a specialized sub-task adapter on a limited amount of sub-task-specific training data. This adapter can subsequently guide the generation process toward outputs that exhibit the distinctive features of the sub-task training data. Preliminary results comparing the fine-tuned and adapted model against several competing systems indicate that the most effective way to acquire both general paraphrasing ability and task-specific expertise is a two-stage approach: first fine-tune a generic paraphrase model, then tailor it to the specific sub-task.
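
To make the two-stage recipe described above concrete, the sketch below shows one plausible implementation using Hugging Face Transformers and the PEFT library. It is an illustrative assumption, not the authors' code: the base model name, the choice of a LoRA adapter, and all hyperparameters are placeholders standing in for whatever base model and adapter architecture the paper actually uses.

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

# Stage 1: fine-tune a pretrained seq2seq LM on generic paraphrase pairs,
# targeting meaning preservation and surface-form variation in general.
base = AutoModelForSeq2SeqLM.from_pretrained("facebook/bart-base")  # placeholder base model
tokenizer = AutoTokenizer.from_pretrained("facebook/bart-base")
# ... standard fine-tuning loop over (source, paraphrase) pairs goes here ...

# Stage 2: freeze the generic paraphraser and train a small adapter on a
# limited amount of sub-task data (e.g., lexically varied paraphrases).
adapter_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,  # seq2seq generation task
    r=8,                              # placeholder adapter rank
    lora_alpha=16,
    lora_dropout=0.1,
)
model = get_peft_model(base, adapter_config)  # wraps the model; base weights stay frozen
model.print_trainable_parameters()            # only the adapter weights are trainable
# ... train only the adapter on the sub-task data ...

Because the adapter is a separate, small set of weights, supporting a new control aspect (say, syntactic control) would only require training another adapter against the same frozen paraphraser, rather than retraining the whole model.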
Anthology ID:
2024.moomin-1.1
Volume:
Proceedings of the 1st Workshop on Modular and Open Multilingual NLP (MOOMIN 2024)
Month:
March
Year:
2024
Address:
St. Julian's, Malta
Editors:
Raúl Vázquez, Timothee Mickus, Jörg Tiedemann, Ivan Vulić, Ahmet Üstün
Venues:
MOOMIN | WS
Publisher:
Association for Computational Linguistics
Pages:
1–6
URL:
https://aclanthology.org/2024.moomin-1.1
Cite (ACL):
Teemu Vahtola and Mathias Creutz. 2024. Toward the Modular Training of Controlled Paraphrase Adapters. In Proceedings of the 1st Workshop on Modular and Open Multilingual NLP (MOOMIN 2024), pages 1–6, St. Julian's, Malta. Association for Computational Linguistics.
Cite (Informal):
Toward the Modular Training of Controlled Paraphrase Adapters (Vahtola & Creutz, MOOMIN-WS 2024)
PDF:
https://aclanthology.org/2024.moomin-1.1.pdf