Daniel Peterson

Also published as: Daniel W. Peterson


2016

Leveraging VerbNet to build Corpus-Specific Verb Clusters
Daniel Peterson | Jordan Boyd-Graber | Martha Palmer | Daisuke Kawahara
Proceedings of the Fifth Joint Conference on Lexical and Computational Semantics

2014

Inducing Example-based Semantic Frames from a Massive Amount of Verb Uses
Daisuke Kawahara | Daniel Peterson | Octavian Popescu | Martha Palmer
Proceedings of the 14th Conference of the European Chapter of the Association for Computational Linguistics

Focusing Annotation for Semantic Role Labeling
Daniel Peterson | Martha Palmer | Shumin Wu
Proceedings of the Ninth International Conference on Language Resources and Evaluation (LREC'14)

Annotation of data is a time-consuming process, but necessary for many state-of-the-art solutions to NLP tasks, including semantic role labeling (SRL). In this paper, we show that language models may be used to select sentences that are more useful to annotate. We simulate a situation where only a portion of the available data can be annotated, and compare language-model-based selection against a more typical baseline of randomly selected data. The data is ordered using an off-the-shelf language modeling toolkit. We show that the least probable sentences provide dramatically improved system performance over the baseline, especially when only a small portion of the data is annotated. In fact, the lion’s share of the performance can be attained by annotating only 10-20% of the data. This result holds for training a model based on new annotation, as well as when adding domain-specific annotation to a general corpus for domain adaptation.
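
As a rough illustration of the selection strategy the abstract describes, the sketch below scores candidate sentences with a toy unigram language model (a stand-in for the off-the-shelf toolkit the paper mentions) and keeps only the least probable fraction for annotation. The unigram model, the length normalization, and the 20% budget are illustrative assumptions, not details taken from the paper.

```python
import math
from collections import Counter

def train_unigram_lm(corpus, smoothing=1.0):
    """Build a toy unigram language model with add-k smoothing.

    Stands in for the off-the-shelf language modeling toolkit; any model
    that assigns a probability to a sentence could be used the same way.
    """
    counts = Counter(tok for sent in corpus for tok in sent.split())
    total = sum(counts.values())
    vocab = len(counts) + 1  # +1 reserves mass for unseen tokens
    def log_prob(sentence):
        return sum(
            math.log((counts[tok] + smoothing) / (total + smoothing * vocab))
            for tok in sentence.split()
        )
    return log_prob

def select_for_annotation(candidates, log_prob, budget=0.2):
    """Return the least probable `budget` fraction of candidate sentences.

    Length-normalizing the log-probability keeps long sentences from
    dominating the ranking (an assumption, not specified in the abstract).
    """
    scored = sorted(candidates,
                    key=lambda s: log_prob(s) / max(len(s.split()), 1))
    k = max(1, int(len(candidates) * budget))
    return scored[:k]

# Usage: fit the model on text already available, then pick the 10-20% of
# the unannotated pool that the model finds least probable.
seed = ["the judge sentenced the defendant", "the company reported earnings"]
pool = ["the judge sentenced the man", "the company reported losses",
        "the quarterback threw a late touchdown", "colorless green ideas sleep furiously"]
lm = train_unigram_lm(seed)
print(select_for_annotation(pool, lm, budget=0.25))
```
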

A Step-wise Usage-based Method for Inducing Polysemy-aware Verb Classes
Daisuke Kawahara | Daniel W. Peterson | Martha Palmer
Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)