Deep Model Compression Also Helps Models Capture Ambiguity

Hancheol Park, Jong Park


Abstract
Natural language understanding (NLU) tasks face a non-trivial number of ambiguous samples, whose labels are of debatable veracity among annotators. NLU models should thus account for such ambiguity, but they approximate human opinion distributions quite poorly and tend to produce over-confident predictions. To address this problem, we must consider how to exactly capture the degree of relationship between each sample and its candidate classes. In this work, we propose a novel method based on deep model compression and show how such relationships can be accounted for. We observe that more reasonably represented relationships can be discovered in the lower layers and that validation accuracies converge at these layers, which naturally leads to layer pruning. We also find that distilling the relationship knowledge from a lower layer helps models produce better distributions. Experimental results demonstrate that our method substantially improves the quantification of ambiguity without gold distribution labels. As positive side effects, our method significantly reduces model size and improves latency, both attractive aspects of NLU products.
Anthology ID:
2023.acl-long.381
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
6893–6905
URL:
https://aclanthology.org/2023.acl-long.381
DOI:
10.18653/v1/2023.acl-long.381
Cite (ACL):
Hancheol Park and Jong Park. 2023. Deep Model Compression Also Helps Models Capture Ambiguity. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 6893–6905, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Deep Model Compression Also Helps Models Capture Ambiguity (Park & Park, ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.381.pdf
Video:
https://aclanthology.org/2023.acl-long.381.mp4