Natural Language Processing for Clinical Text

Vlada Rozova, Jinghui Liu, Mike Conway


Abstract
Learning from real-world clinical data has the potential to improve the quality of care, increase the efficiency of healthcare systems, and support clinical research. Because a large proportion of clinical information is recorded only as unstructured free text, applying NLP to process and understand the vast amount of text generated during clinical encounters is essential. However, clinical text is known to be highly ambiguous: it contains complex professional terminology that requires clinical expertise to understand and annotate, and it is written in different clinical contexts with distinct purposes. Together, these factors make clinical NLP research both rewarding and challenging. In this tutorial, we will discuss the characteristics of clinical text and provide an overview of some of the tools and methods used to process it. We will also present a real-world example to show the effectiveness of different NLP methods in processing and understanding clinical text. Finally, we will discuss the strengths and limitations of large language models and their applications, evaluations, and extensions in clinical NLP.
Anthology ID:
2023.alta-1.23
Volume:
Proceedings of the 21st Annual Workshop of the Australasian Language Technology Association
Month:
November
Year:
2023
Address:
Melbourne, Australia
Editors:
Smaranda Muresan, Vivian Chen, Casey Kennington, David Vandyke, Nina Dethlefs, Koji Inoue, Erik Ekstedt, Stefan Ultes
Venue:
ALTA
Publisher:
Association for Computational Linguistics
Pages:
179–182
URL:
https://aclanthology.org/2023.alta-1.23
Cite (ACL):
Vlada Rozova, Jinghui Liu, and Mike Conway. 2023. Natural Language Processing for Clinical Text. In Proceedings of the 21st Annual Workshop of the Australasian Language Technology Association, pages 179–182, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Natural Language Processing for Clinical Text (Rozova et al., ALTA 2023)
PDF:
https://aclanthology.org/2023.alta-1.23.pdf