

(Goldberg & Elhadad, 2010) ⇒ Yoav Goldberg, and Michael Elhadad. (2010). “An Efficient Algorithm for Easy-first Non-directional Dependency Parsing.” In: Human Language Technologies: The 2010 Annual Conference of the North American Chapter of the Association for Computational Linguistics (NAACL HLT 2010).
(Goldberg & Levy, 2014) ⇒ Yoav Goldberg, and Omer Levy. (2014). “word2vec Explained: Deriving Mikolov et al.'s Negative-Sampling Word-Embedding Method.” In: arXiv:1402.3722.
(Levy & Goldberg, 2014) ⇒ Omer Levy, and Yoav Goldberg. (2014). “Neural Word Embedding As Implicit Matrix Factorization.” In: Advances in Neural Information Processing Systems (NIPS 2014), pages 2177-2185.

(Levy et al., 2015) ⇒ Omer Levy, Yoav Goldberg, and Ido Dagan. (2015). “Improving Distributional Similarity with Lessons Learned from Word Embeddings.” In: Transactions of the Association for Computational Linguistics, 3.
(Liang et al., 2016) ⇒ Dawen Liang, Jaan Altosaar, Laurent Charlin, and David M. Blei. (2016). “Factorization Meets the Item Embedding: Regularizing Matrix Factorization with Item Co-occurrence.” In: Proceedings of the 10th ACM Conference on Recommender Systems (RecSys 2016).

(Goldberg, 2016) ⇒ Yoav Goldberg. (2016). “A Primer on Neural Network Models for Natural Language Processing.” In: Journal of Artificial Intelligence Research, 57(1).
(Goldberg, 2017) ⇒ Yoav Goldberg. (2017). “Neural Network Methods for Natural Language Processing.” In: Synthesis Lectures on Human Language Technologies, 10(1).
(Weiss et al., 2018) ⇒ Gail Weiss, Yoav Goldberg, and Eran Yahav. (2018). “On the Practical Computational Power of Finite Precision RNNs for Language Recognition.” In: arXiv:1805.04908.

The SPPMI-SVD method factorizes the sparse SPPMI matrix with Singular Value Decomposition (SVD), rather than the gradient-descent training used by word2vec/GloVe, and uses the left singular vectors as the final word embeddings.

See: Word Embedding Algorithm, SGNS Algorithm.
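The SPPMI-SVD recipe above can be sketched in a few lines. This is a minimal illustration, not the authors' reference implementation: it assumes a small dense word-context co-occurrence count matrix as input, and the function name `sppmi_svd` and its parameter names are hypothetical choices for this example.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import svds

def sppmi_svd(cooc, shift=1, dim=2):
    """SPPMI-SVD sketch: build the shifted positive PMI matrix from
    word-context co-occurrence counts, factorize it with a truncated
    SVD, and return the left singular vectors as word embeddings."""
    cooc = np.asarray(cooc, dtype=float)
    total = cooc.sum()
    word_counts = cooc.sum(axis=1, keepdims=True)  # #(w): row marginals
    ctx_counts = cooc.sum(axis=0, keepdims=True)   # #(c): column marginals
    with np.errstate(divide="ignore", invalid="ignore"):
        pmi = np.log(cooc * total / (word_counts * ctx_counts))
    pmi[~np.isfinite(pmi)] = 0.0                   # zero counts -> PMI of 0
    sppmi = np.maximum(pmi - np.log(shift), 0.0)   # SPPMI = max(PMI - log k, 0)
    u, s, _vt = svds(csr_matrix(sppmi), k=dim)     # truncated sparse SVD
    return u                                       # rows are word vectors
```

With `shift=1` this reduces to plain positive PMI; Levy et al. (2015) also consider weighting the left singular vectors by (powers of) the singular values rather than using them unscaled.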
