Joseph Sanu


2017

Word Embeddings based on Fixed-Size Ordinally Forgetting Encoding
Joseph Sanu | Mingbin Xu | Hui Jiang | Quan Liu
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing

In this paper, we propose to learn word embeddings based on the recent fixed-size ordinally forgetting encoding (FOFE) method, which can almost uniquely encode any variable-length sequence into a fixed-size representation. We use FOFE to fully encode the left and right context of each word in a corpus to construct a novel word-context matrix, which is further weighted and factorized using truncated SVD to generate low-dimensional word embedding vectors. We evaluate this alternative method of encoding word-context statistics and show that FOFE has a notable effect on the resulting word embeddings. Experimental results on several popular word similarity tasks demonstrate that the proposed method outperforms other SVD models that use canonical count-based techniques to generate word-context matrices.
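A minimal NumPy sketch of the pipeline the abstract describes: FOFE-encode each word's left and right contexts, accumulate a word-context matrix, weight it, and factorize with truncated SVD. The forgetting factor `alpha`, the log weighting, the toy corpus, and all function names are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def fofe_encode(token_ids, vocab_size, alpha=0.7):
    """FOFE of a sequence: z_t = alpha * z_{t-1} + onehot(w_t).
    alpha in (0, 1) is the forgetting factor (0.7 is an assumed value)."""
    z = np.zeros(vocab_size)
    for tid in token_ids:
        z = alpha * z        # decay earlier tokens
        z[tid] += 1.0        # add the current token's one-hot
    return z

def fofe_word_context_matrix(corpus, vocab_size, alpha=0.7):
    """Accumulate, for each target word, the FOFE codes of its left and
    right contexts; the two codes are concatenated into a 2V-dim row."""
    M = np.zeros((vocab_size, 2 * vocab_size))
    for sent in corpus:
        for i, w in enumerate(sent):
            left = fofe_encode(sent[:i], vocab_size, alpha)
            # reverse the right context so the token nearest the
            # target word is encoded last (i.e., weighted most)
            right = fofe_encode(sent[i + 1:][::-1], vocab_size, alpha)
            M[w, :vocab_size] += left
            M[w, vocab_size:] += right
    return M

def embed(M, dim=50):
    """Weight the matrix (log scaling here, as an assumption) and take a
    rank-`dim` truncated SVD; rows of U * sqrt(S) are the embeddings."""
    W = np.log1p(M)
    U, S, _ = np.linalg.svd(W, full_matrices=False)
    return U[:, :dim] * np.sqrt(S[:dim])

# Toy usage: sentences as lists of integer word ids over a vocab of 6.
corpus = [[0, 1, 2, 3], [2, 4, 1, 5], [3, 0, 5, 2]]
vectors = embed(fofe_word_context_matrix(corpus, vocab_size=6), dim=3)
print(vectors.shape)  # (6, 3): one 3-dim vector per vocabulary word
```

The design choice to reverse the right context before encoding mirrors the left-context case, so context tokens closest to the target word always receive the largest FOFE weight.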