Efficient Document Embeddings via Self-Contrastive Bregman Divergence Learning

Daniel Saggau, Mina Rezaei, Bernd Bischl, Ilias Chalkidis


Abstract
Learning high-quality document embeddings is a fundamental problem in natural language processing (NLP), information retrieval (IR), recommendation systems, and search engines. Despite recent advances in transformer-based models that produce sentence embeddings via self-contrastive learning, encoding long documents (thousands of words) remains challenging with respect to both efficiency and quality. We therefore train Longformer-based document encoders with a state-of-the-art unsupervised contrastive learning method (SimCSE). We further complement the baseline method (a siamese neural network) with additional convex neural networks based on functional Bregman divergence, aiming to enhance the quality of the output document representations. We show that, overall, the combination of a self-contrastive siamese network and our proposed neural Bregman network outperforms the baselines in two linear classification settings on three long-document topic classification tasks from the legal and biomedical domains.
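As a rough illustration of the training objective sketched in the abstract, the snippet below combines a SimCSE-style in-batch contrastive loss (two dropout-augmented views of each document serve as the positive pair) with a functional Bregman divergence term whose potential is a small neural network. This is a minimal sketch under stated assumptions, not the paper's implementation: the ConvexPotential module, the temperature of 0.05, the bregman_weight of 0.1, and the training_loss helper are illustrative choices not taken from the paper.

```python
import torch
import torch.nn.functional as F

def simcse_info_nce(z1, z2, temperature=0.05):
    """SimCSE-style in-batch InfoNCE loss.

    z1, z2: (batch, dim) embeddings of the same documents from two forward
    passes with different dropout masks (the positive pairs).
    """
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    sim = z1 @ z2.t() / temperature                # (batch, batch) cosine similarities
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(sim, labels)            # diagonal entries are the positives

class ConvexPotential(torch.nn.Module):
    """Illustrative potential phi for a functional Bregman divergence.

    NOTE: an assumption, not the paper's architecture; strict convexity would
    additionally require non-negative weights in the second layer, as in
    input-convex neural networks.
    """
    def __init__(self, dim, hidden=256):
        super().__init__()
        self.fc1 = torch.nn.Linear(dim, hidden)
        self.fc2 = torch.nn.Linear(hidden, 1)

    def forward(self, x):
        return self.fc2(F.softplus(self.fc1(x))).squeeze(-1)

def bregman_divergence(phi, p, q):
    """D_phi(p, q) = phi(p) - phi(q) - <grad phi(q), p - q>."""
    q = q.detach().requires_grad_(True)            # gradients reach the encoder via p and phi
    phi_q = phi(q)
    grad_q = torch.autograd.grad(phi_q.sum(), q, create_graph=True)[0]
    return phi(p) - phi_q - ((p - q) * grad_q).sum(dim=-1)

# Hypothetical training step: `encoder` maps a batch of documents to embeddings,
# e.g. a Longformer with a pooling head.
def training_loss(encoder, phi, batch, bregman_weight=0.1):
    z1 = encoder(batch)                            # first dropout-augmented view
    z2 = encoder(batch)                            # second dropout-augmented view
    loss = simcse_info_nce(z1, z2)
    loss = loss + bregman_weight * bregman_divergence(phi, z1, z2).mean()
    return loss
```

Detaching q in the divergence is a simplification so that the encoder is updated only through p and the potential network; how the paper actually couples the two networks is described in the full text.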
Anthology ID:
2023.findings-acl.771
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
12181–12190
URL:
https://aclanthology.org/2023.findings-acl.771
DOI:
10.18653/v1/2023.findings-acl.771
Cite (ACL):
Daniel Saggau, Mina Rezaei, Bernd Bischl, and Ilias Chalkidis. 2023. Efficient Document Embeddings via Self-Contrastive Bregman Divergence Learning. In Findings of the Association for Computational Linguistics: ACL 2023, pages 12181–12190, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Efficient Document Embeddings via Self-Contrastive Bregman Divergence Learning (Saggau et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.771.pdf
Video:
https://aclanthology.org/2023.findings-acl.771.mp4