Rethinking Loss Functions for Fact Verification

Yuta Mukobara, Yutaro Shigeto, Masashi Shimbo


Abstract
We explore loss functions for fact verification in the FEVER shared task. While the cross-entropy loss is a standard objective for training verdict predictors, it fails to capture the heterogeneity among the FEVER verdict classes. In this paper, we develop two task-specific objectives tailored to FEVER. Experimental results confirm that the proposed objective functions outperform the standard cross-entropy. Performance is further improved when these objectives are combined with simple class weighting, which effectively overcomes the imbalance in the training data. The source code is available at https://github.com/yuta-mukobara/RLF-KGAT.
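The "simple class weighting" mentioned in the abstract is a standard remedy for label imbalance: each example's loss is scaled by a weight attached to its gold class, so rare classes contribute more to the gradient. The sketch below is a generic NumPy illustration of class-weighted cross-entropy (the function name and the weight-normalization convention are assumptions for illustration, not the authors' implementation or their two task-specific objectives):

```python
import numpy as np

def weighted_cross_entropy(logits, labels, class_weights):
    """Class-weighted cross-entropy over a batch.

    logits        : (n, k) raw scores for n examples, k classes
    labels        : (n,)   gold class indices
    class_weights : (k,)   per-class weights (e.g. inverse class frequency)
    """
    # log-softmax with max subtraction for numerical stability
    z = logits - logits.max(axis=1, keepdims=True)
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    # negative log-likelihood of each example's gold class
    nll = -log_probs[np.arange(len(labels)), labels]
    # scale each example by its gold class's weight, then normalize
    # by the total weight (the convention PyTorch's CrossEntropyLoss uses)
    w = class_weights[labels]
    return (w * nll).sum() / w.sum()
```

With uniform weights this reduces to the plain cross-entropy mean; upweighting an underrepresented class makes errors on that class count for more, which is the effect the abstract credits for the additional performance gain.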
Anthology ID:
2024.eacl-short.38
Volume:
Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
432–442
URL:
https://aclanthology.org/2024.eacl-short.38
Cite (ACL):
Yuta Mukobara, Yutaro Shigeto, and Masashi Shimbo. 2024. Rethinking Loss Functions for Fact Verification. In Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers), pages 432–442, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
Rethinking Loss Functions for Fact Verification (Mukobara et al., EACL 2024)
PDF:
https://aclanthology.org/2024.eacl-short.38.pdf