Pre-Trained Language-Meaning Models for Multilingual Parsing and Generation

Chunliu Wang, Huiyuan Lai, Malvina Nissim, Johan Bos


Abstract
Pre-trained language models (PLMs) have achieved great success in NLP and have recently been used for tasks in computational semantics. However, these tasks do not fully benefit from PLMs since meaning representations are not explicitly included. We introduce multilingual pre-trained language-meaning models based on Discourse Representation Structures (DRSs), which incorporate meaning representations alongside natural language texts in the same model, and design a new strategy to reduce the gap between the pre-training and fine-tuning objectives. Since DRSs are language neutral, cross-lingual transfer learning is adopted to further improve the performance of non-English tasks. Automatic evaluation results show that our approach achieves the best performance on both the multilingual DRS parsing and DRS-to-text generation tasks. Correlation analysis between automatic metrics and human judgements on the generation task further validates the effectiveness of our model. Human inspection reveals that out-of-vocabulary tokens are the main cause of erroneous results.
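As a rough illustration of the text-to-meaning framing described in the abstract, the sketch below fine-tunes a multilingual sequence-to-sequence PLM on a single text/linearized-DRS pair. It is a minimal sketch under stated assumptions: the facebook/mbart-large-50 checkpoint and the toy DRS string are illustrative choices, not the authors' released models or data format.

```python
# Minimal sketch (assumptions, not the authors' code): treating DRS parsing as
# sequence-to-sequence generation with a multilingual pre-trained model.
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_name = "facebook/mbart-large-50"  # assumed backbone; the paper's checkpoints may differ
tokenizer = MBart50TokenizerFast.from_pretrained(model_name, src_lang="en_XX", tgt_lang="en_XX")
model = MBartForConditionalGeneration.from_pretrained(model_name)

text = "Tom is afraid of spiders."
# Hypothetical linearized DRS target, shown only to illustrate the framing;
# it does not reproduce the paper's actual DRS notation.
drs = 'male.n.02 Name "tom" ; fear.v.01 Experiencer -1 Stimulus +1 ; spider.n.01'

batch = tokenizer(text, text_target=drs, return_tensors="pt")
loss = model(input_ids=batch["input_ids"],
             attention_mask=batch["attention_mask"],
             labels=batch["labels"]).loss  # standard cross-entropy fine-tuning objective

# Inference: generate a linearized DRS for the input text
# (DRS-to-text generation would swap source and target in the same setup).
pred_ids = model.generate(batch["input_ids"], max_length=64)
print(tokenizer.batch_decode(pred_ids, skip_special_tokens=True))
```

Under this framing, the cross-lingual transfer mentioned in the abstract would amount to fine-tuning the same model on DRS-paired data in one language and applying it to others, which the language-neutral DRS targets make feasible.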
Anthology ID:
2023.findings-acl.345
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5586–5600
URL:
https://aclanthology.org/2023.findings-acl.345
DOI:
10.18653/v1/2023.findings-acl.345
Cite (ACL):
Chunliu Wang, Huiyuan Lai, Malvina Nissim, and Johan Bos. 2023. Pre-Trained Language-Meaning Models for Multilingual Parsing and Generation. In Findings of the Association for Computational Linguistics: ACL 2023, pages 5586–5600, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Pre-Trained Language-Meaning Models for Multilingual Parsing and Generation (Wang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.345.pdf