Nut-cracking Sledgehammers: Prioritizing Target Language Data over Bigger Language Models for Cross-Lingual Metaphor Detection

Jakob Schuster, Katja Markert


Abstract
In this work, we investigate cross-lingual methods for metaphor detection of adjective-noun phrases in three languages (English, German and Polish). We explore the potential of minimalistic neural networks supported by static embeddings as a lightweight alternative to large transformer-based language models. We measure performance in zero-shot experiments without access to annotated target-language data and aim to improve it under low-resource conditions, focusing mainly on a k-shot paradigm. Incorporating even a small number of phrases from the target language is enough to bridge the gap in accuracy between our small networks and large transformer architectures. Lastly, we suggest that the k-shot paradigm can also be applied to models that use machine-translated training data.
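As a rough illustration of the lightweight setup described in the abstract, the following sketch shows a small feed-forward classifier over the concatenated static embeddings of an adjective-noun pair, plus a k-shot fine-tuning step on a handful of labelled target-language phrases. This is not the authors' implementation; the architecture, layer sizes, and helper names are assumptions for illustration only.

```python
# Hypothetical sketch of a lightweight metaphor classifier over static embeddings.
# Sizes, training details, and names are assumptions, not the paper's exact setup.
import torch
import torch.nn as nn

class PhraseClassifier(nn.Module):
    """Small feed-forward net over concatenated adjective/noun embeddings."""
    def __init__(self, emb_dim=300, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(2 * emb_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 2),  # two classes: literal vs. metaphorical
        )

    def forward(self, adj_vec, noun_vec):
        # Concatenate the two static word vectors and classify the phrase.
        return self.net(torch.cat([adj_vec, noun_vec], dim=-1))

def k_shot_finetune(model, shots, epochs=10, lr=1e-3):
    """Fine-tune on k labelled (adjective_vec, noun_vec, label) target-language examples."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for adj_vec, noun_vec, label in shots:
            opt.zero_grad()
            logits = model(adj_vec.unsqueeze(0), noun_vec.unsqueeze(0))
            loss = loss_fn(logits, label.unsqueeze(0))
            loss.backward()
            opt.step()
    return model
```

In a zero-shot setting the classifier trained on the source language would be applied to the target language directly (via cross-lingually aligned static embeddings); the k-shot step above adds only a few target-language phrases on top of that.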
Anthology ID:
2023.clasp-1.12
Volume:
Proceedings of the 2023 CLASP Conference on Learning with Small Data (LSD)
Month:
September
Year:
2023
Address:
Gothenburg, Sweden
Editors:
Ellen Breitholtz, Shalom Lappin, Sharid Loáiciga, Nikolai Ilinykh, Simon Dobnik
Venue:
CLASP
Publisher:
Association for Computational Linguistics
Pages:
98–106
URL:
https://aclanthology.org/2023.clasp-1.12
Cite (ACL):
Jakob Schuster and Katja Markert. 2023. Nut-cracking Sledgehammers: Prioritizing Target Language Data over Bigger Language Models for Cross-Lingual Metaphor Detection. In Proceedings of the 2023 CLASP Conference on Learning with Small Data (LSD), pages 98–106, Gothenburg, Sweden. Association for Computational Linguistics.
Cite (Informal):
Nut-cracking Sledgehammers: Prioritizing Target Language Data over Bigger Language Models for Cross-Lingual Metaphor Detection (Schuster & Markert, CLASP 2023)
PDF:
https://aclanthology.org/2023.clasp-1.12.pdf