CAUnLP at NLP4IF 2019 Shared Task: Context-Dependent BERT for Sentence-Level Propaganda Detection

Wenjun Hou, Ying Chen


Abstract
The goal of fine-grained propaganda detection is to determine whether a given sentence uses propaganda techniques (sentence-level) or to recognize which techniques are used (fragment-level). This paper presents our system for the sentence-level subtask of the propaganda detection shared task. To better exploit document information, we construct context-dependent input pairs (sentence-title pairs and sentence-context pairs) to fine-tune the pretrained BERT, and we apply undersampling to tackle the problem of imbalanced data.
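The abstract describes two context-dependent input constructions for BERT plus an undersampling step. The sketch below is a minimal, illustrative reading of that setup using the HuggingFace transformers tokenizer; it is not the authors' code. The function names, the 128/256 length limits, and the choice of adjacent sentences as the "context" are assumptions for illustration (the paper defines the actual context window).

import random
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

def sentence_title_pair(sentence, title):
    # Pair the sentence with the article title:
    # [CLS] sentence [SEP] title [SEP]
    return tokenizer(sentence, title, truncation=True,
                     padding="max_length", max_length=128)

def sentence_context_pair(sentence, prev_sentence, next_sentence):
    # Pair the sentence with surrounding text; here we assume the
    # adjacent sentences serve as the context.
    context = " ".join(s for s in (prev_sentence, next_sentence) if s)
    return tokenizer(sentence, context, truncation=True,
                     padding="max_length", max_length=256)

def undersample(examples, labels, seed=42):
    # Randomly drop majority-class (non-propaganda) examples until the
    # classes are balanced -- one common form of undersampling.
    rng = random.Random(seed)
    pos = [i for i, y in enumerate(labels) if y == 1]
    neg = [i for i, y in enumerate(labels) if y == 0]
    if len(neg) > len(pos):
        neg = rng.sample(neg, len(pos))
    keep = sorted(pos + neg)
    return [examples[i] for i in keep], [labels[i] for i in keep]

The encoded pairs would then be fed to a standard BERT sequence classifier (e.g., BertForSequenceClassification) for fine-tuning; that step follows the usual transformers fine-tuning recipe and is omitted here.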
Anthology ID:
D19-5010
Volume:
Proceedings of the Second Workshop on Natural Language Processing for Internet Freedom: Censorship, Disinformation, and Propaganda
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Anna Feldman, Giovanni Da San Martino, Alberto Barrón-Cedeño, Chris Brew, Chris Leberknight, Preslav Nakov
Venue:
NLP4IF
Publisher:
Association for Computational Linguistics
Pages:
83–86
URL:
https://aclanthology.org/D19-5010
DOI:
10.18653/v1/D19-5010
Cite (ACL):
Wenjun Hou and Ying Chen. 2019. CAUnLP at NLP4IF 2019 Shared Task: Context-Dependent BERT for Sentence-Level Propaganda Detection. In Proceedings of the Second Workshop on Natural Language Processing for Internet Freedom: Censorship, Disinformation, and Propaganda, pages 83–86, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
CAUnLP at NLP4IF 2019 Shared Task: Context-Dependent BERT for Sentence-Level Propaganda Detection (Hou & Chen, NLP4IF 2019)
PDF:
https://aclanthology.org/D19-5010.pdf