Biomedical Relation Extraction with Entity Type Markers and Relation-specific Question Answering

Koshi Yamada, Makoto Miwa, Yutaka Sasaki


Abstract
Recently, several methods have tackled the relation extraction task with QA and have shown successful results. However, the effectiveness of existing methods in specific domains, such as the biomedical domain, is yet to be verified. When multiple entity pairs in a sentence share an entity, a QA-based relation extraction model that outputs only a single answer to a given question may fail to extract the desired relations. In addition, these methods employ QA models that are not tuned for relation extraction. To address these issues, we first extend and apply a span QA-based relation extraction method to drug-protein relation extraction by creating question templates and incorporating entity type markers. We further propose a binary QA-based method that directly uses the entity information available in the relation extraction task. The experimental results on the DrugProt dataset show that our QA-based methods, especially the proposed binary QA method, are effective for drug-protein relation extraction.
Anthology ID:
2023.bionlp-1.35
Volume:
The 22nd Workshop on Biomedical Natural Language Processing and BioNLP Shared Tasks
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Dina Demner-Fushman, Sophia Ananiadou, Kevin Cohen
Venue:
BioNLP
Publisher:
Association for Computational Linguistics
Pages:
377–384
URL:
https://aclanthology.org/2023.bionlp-1.35
DOI:
10.18653/v1/2023.bionlp-1.35
Cite (ACL):
Koshi Yamada, Makoto Miwa, and Yutaka Sasaki. 2023. Biomedical Relation Extraction with Entity Type Markers and Relation-specific Question Answering. In The 22nd Workshop on Biomedical Natural Language Processing and BioNLP Shared Tasks, pages 377–384, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Biomedical Relation Extraction with Entity Type Markers and Relation-specific Question Answering (Yamada et al., BioNLP 2023)
PDF:
https://aclanthology.org/2023.bionlp-1.35.pdf
Video:
https://aclanthology.org/2023.bionlp-1.35.mp4