Ping Xue


2022

Section-Aware Commonsense Knowledge-Grounded Dialogue Generation with Pre-trained Language Model
Sixing Wu | Ying Li | Ping Xue | Dawei Zhang | Zhonghai Wu
Proceedings of the 29th International Conference on Computational Linguistics

In knowledge-grounded dialogue generation, pre-trained language models (PLMs) can be expected to deepen the fusion of dialogue context and knowledge because of their superior semantic understanding. Unlike plain-text knowledge, structured commonsense knowledge is difficult to leverage with PLMs because most PLMs can only operate on plain text. Linearizing commonsense knowledge facts into plain text is therefore a necessary step. However, a dialogue is usually aligned to many retrieved fact candidates; as a result, the linearized text is often lengthy, which significantly increases the burden of using PLMs. To address this issue, we propose a novel two-stage framework, SAKDP. In the first, pre-screening stage, we use a ranking network, PriorRanking, to estimate the relevance of each retrieved knowledge fact, so that facts can be clustered into three sections of different priorities: as priority decreases, relevance decreases and the number of included facts increases. In the second, dialogue generation stage, we use section-aware strategies to encode the linearized knowledge: the powerful but expensive PLM is applied only to the few facts in the higher-priority sections, striking a performance-efficiency balance. Both automatic and human evaluation demonstrate the superior performance of this work.
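The pre-screening idea described in the abstract can be sketched as follows. This is an illustrative sketch only, not the authors' implementation: the `linearize` and `partition_by_priority` helpers, the section sizes, and the toy scorer standing in for the PriorRanking network are all assumptions made for illustration.

```python
def linearize(fact):
    """Turn a (head, relation, tail) knowledge triple into plain text."""
    head, relation, tail = fact
    return f"{head} {relation} {tail}"

def partition_by_priority(facts, score_fn, hi_k=4, mid_k=12):
    """Sort facts by descending relevance and split into three sections.

    hi_k and mid_k are illustrative section sizes: the high-priority
    section is smallest, and section size grows as priority decreases,
    mirroring the abstract's description. Only the small "high" section
    would be fed to the expensive PLM encoder.
    """
    ranked = sorted(facts, key=score_fn, reverse=True)
    return {
        "high": ranked[:hi_k],
        "mid": ranked[hi_k:hi_k + mid_k],
        "low": ranked[hi_k + mid_k:],
    }

# Toy relevance scorer standing in for the learned PriorRanking network.
def toy_score(fact):
    return len(fact[0])

facts = [("coffee", "IsA", "drink"),
         ("mug", "UsedFor", "coffee"),
         ("espresso", "RelatedTo", "coffee")]
sections = partition_by_priority(facts, toy_score, hi_k=1, mid_k=1)
high_priority_text = " ; ".join(linearize(f) for f in sections["high"])
```

In a real system the scorer would be the trained PriorRanking network and the linearized high-priority text would be concatenated with the dialogue context before PLM encoding; the cheaper encoding of the lower-priority sections is left out of this sketch.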

2011

Extract Chinese Unknown Words from a Large-scale Corpus Using Morphological and Distributional Evidences
Kaixu Zhang | Ruining Wang | Ping Xue | Maosong Sun
Proceedings of 5th International Joint Conference on Natural Language Processing

2010

Fast-Champollion: A Fast and Robust Sentence Alignment Algorithm
Peng Li | Maosong Sun | Ping Xue
Coling 2010: Posters