Kai-Chou Yang


2019

Fill the GAP: Exploiting BERT for Pronoun Resolution
Kai-Chou Yang | Timothy Niven | Tzu Hsuan Chou | Hung-Yu Kao
Proceedings of the First Workshop on Gender Bias in Natural Language Processing

In this paper, we describe our entry in the gendered pronoun resolution competition, which achieved fourth place without data augmentation. Our method is an ensemble of BERT models that resolves coreference in an interaction space. We report four insights from our work: BERT’s representations involve significant redundancy; modeling interaction effects, as natural language inference models do, is useful for this task; there is an optimal BERT layer from which to extract representations for pronoun resolution; and the difference between the attention weights from the pronoun to the candidate entities is highly correlated with the correct label, with interesting implications for future work.
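The last insight, the attention gap between the pronoun and the two candidate entities, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the attention tensor here is synthetic, and the function name `attention_gap` and all indices are hypothetical. In practice the tensor would come from a BERT layer (e.g. via `output_attentions=True` in HuggingFace `transformers`).

```python
import numpy as np

def attention_gap(attn, pronoun_idx, span_a, span_b):
    """Mean attention from the pronoun token to each candidate span,
    averaged over heads; returns (gap, predicted label)."""
    # attn: [num_heads, seq_len, seq_len]; row i attends to columns.
    to_a = attn[:, pronoun_idx, span_a[0]:span_a[1]].mean()
    to_b = attn[:, pronoun_idx, span_b[0]:span_b[1]].mean()
    gap = float(to_a - to_b)
    return gap, "A" if gap > 0 else "B"

# Synthetic attention: 12 heads over a 16-token sequence.
rng = np.random.default_rng(0)
attn = rng.random((12, 16, 16))
attn /= attn.sum(axis=-1, keepdims=True)  # row-normalise like softmax
attn[:, 5, 2:4] += 0.3  # bias the pronoun (index 5) toward candidate A

gap, label = attention_gap(attn, pronoun_idx=5, span_a=(2, 4), span_b=(9, 11))
print(gap, label)
```

Under this toy setup, a positive gap points to candidate A; the paper's observation is that this simple signal correlates strongly with the gold antecedent.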