Christian Scheel


2021

HOTTER: Hierarchical Optimal Topic Transport with Explanatory Context Representations
Sabine Wehnert | Christian Scheel | Simona Szakács-Behling | Maret Nieländer | Patrick Mielke | Ernesto William De Luca
Findings of the Association for Computational Linguistics: EMNLP 2021

Natural language processing (NLP) is often the backbone of today's systems for user interactions, information retrieval, and other tasks. Many such NLP applications rely on specialized learned representations (e.g., neural word embeddings, topic models) that improve the ability to reason about the relationships between the documents of a corpus. Paired with the progress in learned representations, the similarity metrics used to compare representations of documents are also evolving, with numerous proposals differing in computation time or interpretability. In this paper, we propose an extension to an emerging hybrid document distance metric that combines topic models and word embeddings: Hierarchical Optimal Topic Transport (HOTT). Specifically, we extend HOTT by using context-enhanced word representations. We validate our approach on public datasets, using the BERT language model, for a document categorization task. Results indicate competitive performance of the extended HOTT metric. We furthermore apply the HOTT metric and its extension to support educational media research, in a retrieval task that matches topics in German curricula to educational textbook passages, along with offering an auxiliary explanatory document representing the dominant topic of the retrieved document. In a user study, our explanation method is preferred over regular topic keywords.
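For readers unfamiliar with the underlying metric, the following is a minimal, illustrative sketch of a HOTT-style document distance, not the authors' implementation: the distance between two documents is an optimal transport cost between their topic distributions, where the ground cost between topics is itself a word-mover-style transport cost over each topic's top words, computed from word vectors that could be contextual (e.g., BERT-derived) embeddings. The helper names `topic_distance` and `hott_distance`, the dictionary-based inputs, and the use of the POT library are assumptions made for this sketch.

```python
# Hedged sketch of a HOTT-style document distance (not the paper's code).
# Assumes: per-document topic distributions from an LDA-like model, and
# per-topic dictionaries of top words with probabilities, whose embeddings
# (e.g., averaged contextual BERT vectors) are available.
import numpy as np
import ot  # Python Optimal Transport (pip install pot)
from scipy.spatial.distance import cdist


def topic_distance(topic_a, topic_b, embeddings):
    """Word-mover-style distance between two topics.

    topic_a, topic_b: dicts {word: probability} over each topic's top words
    embeddings: dict {word: vector}, e.g., contextual word representations
    """
    words_a, words_b = list(topic_a), list(topic_b)
    wa = np.array([topic_a[w] for w in words_a], dtype=float); wa /= wa.sum()
    wb = np.array([topic_b[w] for w in words_b], dtype=float); wb /= wb.sum()
    cost = cdist([embeddings[w] for w in words_a],
                 [embeddings[w] for w in words_b], metric="euclidean")
    return ot.emd2(wa, wb, cost)  # exact optimal transport cost


def hott_distance(doc_topics_1, doc_topics_2, topic_dist_matrix):
    """Document distance: optimal transport between the two documents'
    topic distributions, with pairwise topic distances as ground cost."""
    p = np.asarray(doc_topics_1, dtype=float); p /= p.sum()
    q = np.asarray(doc_topics_2, dtype=float); q /= q.sum()
    return ot.emd2(p, q, topic_dist_matrix)
```

In practice, the K x K matrix of pairwise topic distances would be precomputed once with `topic_distance` and reused for all document comparisons; swapping static word vectors for contextual ones only changes the `embeddings` input.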

2011

Investigating the Applicability of current Machine-Learning based Subjectivity Detection Algorithms on German Texts
Malik Atalla | Christian Scheel | Ernesto William De Luca | Sahin Albayrak
Proceedings of Workshop on Robust Unsupervised and Semisupervised Methods in Natural Language Processing