Quynh Ngoc Thi Do


2018

A Flexible and Easy-to-use Semantic Role Labeling Framework for Different Languages
Quynh Ngoc Thi Do | Artuur Leeuwenberg | Geert Heyman | Marie-Francine Moens
Proceedings of the 27th International Conference on Computational Linguistics: System Demonstrations

This paper presents a flexible, open-source framework for deep semantic role labeling. We aim to facilitate easy exploration of model structures for multiple languages with different characteristics. The framework offers flexibility in model construction in terms of word representation, sequence representation, output modeling, and inference style, and comes with clear output visualization. It is available under the Apache 2.0 license.
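
The abstract describes a framework that is modular in its word representation, sequence representation, and output modeling. As a rough illustration only (this is not the released framework; names such as SRLConfig and ConfigurableSRL are invented for the sketch), the Python/PyTorch snippet below shows how such interchangeable parts might be wired into a token-level role tagger.

# Rough sketch (hypothetical names, not the released framework): a token-level
# SRL tagger assembled from interchangeable parts for word representation,
# sequence representation, and output scoring.
from dataclasses import dataclass
import torch
import torch.nn as nn

@dataclass
class SRLConfig:
    vocab_size: int = 10000
    num_roles: int = 20            # size of the semantic-role label set
    emb_dim: int = 100
    hidden_dim: int = 128
    seq_repr: str = "bilstm"       # swap the sequence encoder: "bilstm" or "gru"

class ConfigurableSRL(nn.Module):
    """Scores a semantic role for every token of a sentence."""
    def __init__(self, cfg: SRLConfig):
        super().__init__()
        self.embed = nn.Embedding(cfg.vocab_size, cfg.emb_dim)      # word representation
        rnn_cls = nn.LSTM if cfg.seq_repr == "bilstm" else nn.GRU   # sequence representation
        self.encoder = rnn_cls(cfg.emb_dim, cfg.hidden_dim,
                               batch_first=True, bidirectional=True)
        self.scorer = nn.Linear(2 * cfg.hidden_dim, cfg.num_roles)  # output modeling

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        states, _ = self.encoder(self.embed(token_ids))
        return self.scorer(states)   # (batch, seq_len, num_roles) role scores

# Usage on a toy batch of two 12-token sentences.
model = ConfigurableSRL(SRLConfig())
print(model(torch.randint(0, 10000, (2, 12))).shape)   # torch.Size([2, 12, 20])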

2017

Improving Implicit Semantic Role Labeling by Predicting Semantic Frame Arguments
Quynh Ngoc Thi Do | Steven Bethard | Marie-Francine Moens
Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 1: Long Papers)

Implicit semantic role labeling (iSRL) is the task of predicting the semantic roles of a predicate that do not appear as explicit arguments but are instead part of common-sense knowledge or mentioned earlier in the discourse. We introduce an approach to iSRL based on a predictive recurrent neural semantic frame model (PRNSFM) that uses a large unannotated corpus to learn the probability of a sequence of semantic arguments given a predicate. We leverage the sequence probabilities predicted by the PRNSFM to estimate selectional preferences for predicates and their arguments. On the NomBank iSRL test set, our approach improves on the state of the art in implicit semantic role labeling while relying less than prior work on manually constructed language resources.
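
As a hedged illustration of the underlying idea only (not the paper's PRNSFM implementation; the names PredictiveFrameModel and selectional_preference are hypothetical), the Python/PyTorch sketch below shows how a sequence model over predicate-argument tokens could assign next-token probabilities that are read off as selectional preferences for candidate argument fillers.

# Hedged sketch of the idea (hypothetical names, not the paper's PRNSFM):
# an LSTM language model over frame-token sequences (predicate first, then its
# arguments) whose next-token probabilities serve as selectional preferences.
import torch
import torch.nn as nn

class PredictiveFrameModel(nn.Module):
    def __init__(self, vocab_size: int, dim: int = 64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        self.lstm = nn.LSTM(dim, dim, batch_first=True)
        self.out = nn.Linear(dim, vocab_size)

    def next_token_logprobs(self, prefix_ids: torch.Tensor) -> torch.Tensor:
        # Log-probability distribution over the next frame token given the prefix.
        hidden, _ = self.lstm(self.embed(prefix_ids))
        return torch.log_softmax(self.out(hidden[:, -1]), dim=-1)

def selectional_preference(model, predicate_id, candidate_ids):
    """Score candidate argument fillers by their probability right after the predicate."""
    prefix = torch.tensor([[predicate_id]])
    logprobs = model.next_token_logprobs(prefix)[0]
    return [logprobs[c].item() for c in candidate_ids]

# Usage: rank two candidate fillers for an implicit argument of predicate id 7
# (the untrained model gives arbitrary scores; training on a large corpus is the point).
model = PredictiveFrameModel(vocab_size=500)
print(selectional_preference(model, predicate_id=7, candidate_ids=[42, 99]))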

2016

Visualizing the Content of a Children’s Story in a Virtual World: Lessons Learned
Quynh Ngoc Thi Do | Steven Bethard | Marie-Francine Moens
Proceedings of the Workshop on Uphill Battles in Language Processing: Scaling Early Achievements to Robust Methods

Facing the most difficult case of Semantic Role Labeling: A collaboration of word embeddings and co-training
Quynh Ngoc Thi Do | Steven Bethard | Marie-Francine Moens
Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers

We present a successful collaboration of word embeddings and co-training to tackle the most difficult test case of semantic role labeling: predicting out-of-domain and unseen semantic frames. Although co-training is a well-established semi-supervised method, it has rarely been applied to SRL, especially when a large amount of labeled data is already available. In this work, co-training is used together with word embeddings to improve the performance of a system trained on a large training dataset. We also introduce a semantic role labeling system with a simple learning architecture and effective inference that is easily adaptable to semi-supervised settings with new training data and/or new features. On the out-of-domain test set of the standard CoNLL 2009 benchmark, our simple approach achieves high performance and improves over state-of-the-art results.
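
For readers unfamiliar with co-training, the sketch below shows a generic two-view co-training loop in Python (scikit-learn). It is an assumed, simplified setup, not the paper's exact procedure, and the two feature views (for instance, discrete SRL features versus word-embedding features) are stand-ins.

# Generic co-training sketch (assumed setup, not the paper's exact procedure):
# two classifiers trained on different feature views label the unlabeled pool
# for each other; the most confident pseudo-labels are added to both views.
import numpy as np
from sklearn.linear_model import LogisticRegression

def co_train(Xa, Xb, y, Ua, Ub, rounds=5, add_per_round=10):
    """Xa/Xb: labeled data in view A/B; Ua/Ub: the same unlabeled pool in both views."""
    clf_a = LogisticRegression(max_iter=200)
    clf_b = LogisticRegression(max_iter=200)
    for _ in range(rounds):
        clf_a.fit(Xa, y)
        clf_b.fit(Xb, y)
        if len(Ua) == 0:
            break
        # Each view nominates the unlabeled examples it is most confident about.
        pick_a = np.argsort(-clf_a.predict_proba(Ua).max(axis=1))[:add_per_round]
        pick_b = np.argsort(-clf_b.predict_proba(Ub).max(axis=1))[:add_per_round]
        pseudo = {i: clf_a.predict(Ua[[i]])[0] for i in pick_a}
        pseudo.update({i: clf_b.predict(Ub[[i]])[0] for i in pick_b if i not in pseudo})
        idx = np.array(sorted(pseudo))
        Xa = np.vstack([Xa, Ua[idx]]); Xb = np.vstack([Xb, Ub[idx]])
        y = np.concatenate([y, [pseudo[i] for i in idx]])
        keep = np.setdiff1d(np.arange(len(Ua)), idx)
        Ua, Ub = Ua[keep], Ub[keep]
    return clf_a, clf_b

# Usage on toy random data: 20 labeled and 100 unlabeled examples, two 5-dim views.
rng = np.random.default_rng(0)
Xa, Xb = rng.normal(size=(20, 5)), rng.normal(size=(20, 5))
y = rng.integers(0, 2, size=20)
Ua, Ub = rng.normal(size=(100, 5)), rng.normal(size=(100, 5))
co_train(Xa, Xb, y, Ua, Ub)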

2015

Adapting Coreference Resolution for Narrative Processing
Quynh Ngoc Thi Do | Steven Bethard | Marie-Francine Moens
Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing