Towards Distribution-shift Robust Text Classification of Emotional Content

Luana Bulla, Aldo Gangemi, Misael Mongiovì


Abstract
Supervised models based on Transformers have been shown to achieve impressive performance in many natural language processing tasks. However, besides requiring a large amount of costly manually annotated data, supervised models tend to adapt to the characteristics of the training dataset, which is usually created ad hoc and whose data distribution often differs from that of real applications, leading to significant performance degradation in real-world scenarios. We perform an extensive assessment of the out-of-distribution performance of supervised classification models on emotion and hate-speech detection tasks and show that NLI-based zero-shot models often outperform them, making task-specific annotation useless when the characteristics of the end-user data are not known in advance. To benefit from both supervised and zero-shot approaches, we propose to fine-tune an NLI-based model on the task-specific dataset. The resulting model often outperforms all available supervised models, both in distribution and out of distribution, with only a few thousand training samples.
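The NLI-based zero-shot setup described in the abstract can be illustrated with a minimal sketch, assuming the HuggingFace transformers zero-shot-classification pipeline and the facebook/bart-large-mnli checkpoint; the paper's actual models, datasets, and label sets are not given on this page, so these choices are illustrative assumptions:

```python
from transformers import pipeline

# Hypothetical setup: an NLI model (BART fine-tuned on MNLI) used as a
# zero-shot classifier. Each candidate label is turned into an entailment
# hypothesis and scored against the input text.
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

# Illustrative emotion labels; the paper's label inventory may differ.
labels = ["joy", "sadness", "anger", "fear"]

result = classifier(
    "I can't believe they cancelled the show, I'm furious!",
    candidate_labels=labels,
)
print(result["labels"][0])  # highest-scoring emotion label, e.g. "anger"
```

Under this formulation, the classifier needs no task-specific training data; the fine-tuning the paper proposes keeps the same entailment framing while adapting the NLI model to a few thousand task-specific examples.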
Anthology ID:
2023.findings-acl.524
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8256–8268
URL:
https://aclanthology.org/2023.findings-acl.524
DOI:
10.18653/v1/2023.findings-acl.524
Cite (ACL):
Luana Bulla, Aldo Gangemi, and Misael Mongiovì. 2023. Towards Distribution-shift Robust Text Classification of Emotional Content. In Findings of the Association for Computational Linguistics: ACL 2023, pages 8256–8268, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Towards Distribution-shift Robust Text Classification of Emotional Content (Bulla et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.524.pdf