Learning Dynamic Contextualised Word Embeddings via Template-based Temporal Adaptation

Xiaohang Tang, Yi Zhou, Danushka Bollegala


Abstract
Dynamic contextualised word embeddings (DCWEs) represent the temporal semantic variations of words. We propose a method for learning DCWEs by time-adapting a pretrained Masked Language Model (MLM) using time-sensitive templates. Given two snapshots C1 and C2 of a corpus taken respectively at two distinct timestamps T1 and T2, we first propose an unsupervised method to select (a) pivot terms related to both C1 and C2, and (b) anchor terms that are associated with a specific pivot term in each individual snapshot. We then generate prompts by filling manually compiled templates using the extracted pivot and anchor terms. Moreover, we propose an automatic method to learn time-sensitive templates from C1 and C2, without requiring any human supervision. Next, we use the generated prompts to adapt a pretrained MLM to T2 by fine-tuning it on those prompts. Multiple experiments show that our proposed method significantly reduces the perplexity of test sentences in C2, outperforming the current state-of-the-art.
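The pipeline the abstract describes (select pivot/anchor terms, fill templates into prompts, fine-tune the MLM on those prompts) can be illustrated with a minimal sketch. Everything below is hypothetical: the term triples, template strings, model name, and hyperparameters are placeholders, and the sketch assumes standard masked-language-modelling fine-tuning with the Hugging Face transformers library rather than the authors' exact implementation (in particular, it reproduces neither their unsupervised term selection nor their automatically learned templates).

```python
# A minimal sketch of the prompt-based adaptation step, assuming the
# Hugging Face "transformers" library. All templates, pivot/anchor terms,
# model names, and hyperparameters are illustrative placeholders.
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

# Hypothetical (pivot, anchor-in-C1, anchor-in-C2) triples. In the paper
# these are selected from the two corpus snapshots without supervision.
TERM_TRIPLES = [
    ("mask", "costume", "vaccine"),
    ("zoom", "lens", "meeting"),
]

# Hypothetical manually compiled, time-sensitive templates.
TEMPLATES = [
    "{pivot} used to be associated with {old}, but is now associated with {new}.",
    "Unlike {old}, {new} is what {pivot} relates to these days.",
]

def make_prompts():
    """Fill every template with every term triple to obtain prompts."""
    return [t.format(pivot=p, old=a1, new=a2)
            for t in TEMPLATES
            for (p, a1, a2) in TERM_TRIPLES]

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-cased")

# Tokenise the prompts and wrap them as a list of feature dicts, which
# Trainer accepts as a dataset.
enc = tokenizer(make_prompts(), truncation=True, padding=True)
train_set = [{"input_ids": i, "attention_mask": m}
             for i, m in zip(enc["input_ids"], enc["attention_mask"])]

# Standard masked-language-modelling fine-tuning: the collator randomly
# masks tokens, and minimising the MLM loss on the prompts adapts the
# pretrained model towards the later timestamp T2.
trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="mlm-adapted-t2",
                           num_train_epochs=3,
                           per_device_train_batch_size=8),
    train_dataset=train_set,
    data_collator=DataCollatorForLanguageModeling(tokenizer,
                                                  mlm_probability=0.15),
)
trainer.train()
model.save_pretrained("mlm-adapted-t2")
```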
Anthology ID:
2023.acl-long.520
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
9352–9369
URL:
https://aclanthology.org/2023.acl-long.520
DOI:
10.18653/v1/2023.acl-long.520
Cite (ACL):
Xiaohang Tang, Yi Zhou, and Danushka Bollegala. 2023. Learning Dynamic Contextualised Word Embeddings via Template-based Temporal Adaptation. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 9352–9369, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Learning Dynamic Contextualised Word Embeddings via Template-based Temporal Adaptation (Tang et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.520.pdf
Video:
https://aclanthology.org/2023.acl-long.520.mp4