Context Vectors Are Reflections of Word Vectors in Half the Dimensions
Authors
Assylbekov, Zhenisbek
Takhanov, Rustem
Publisher
AI Access Foundation
Abstract
This paper takes a step towards theoretical analysis of the relationship between word embeddings and context embeddings in models such as word2vec. We start from basic probabilistic assumptions on the nature of word vectors, context vectors, and text generation. These assumptions are supported either empirically or theoretically by the existing literature. Next, we show that under these assumptions the widely-used word-word PMI matrix is approximately a random symmetric Gaussian ensemble. This, in turn, implies that context vectors are reflections of word vectors in approximately half the dimensions. As a direct application of our result, we suggest a theoretically grounded way of tying weights in the SGNS model.
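To make the weight-tying suggestion concrete, below is a minimal PyTorch sketch (not taken from the paper) of an SGNS-style model in which each context vector is tied to its word vector by a fixed reflection that negates roughly half of the coordinates. Which half of the dimensions is negated is an illustrative assumption here; the paper's result only says the reflection acts in approximately half the dimensions.

import torch
import torch.nn as nn

class TiedSGNS(nn.Module):
    """SGNS with context vectors tied to word vectors by a fixed
    reflection in half the embedding dimensions (illustrative sketch).
    """

    def __init__(self, vocab_size: int, dim: int):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, dim)
        # Fixed reflection D = diag(+1, ..., +1, -1, ..., -1):
        # the last dim // 2 coordinates are negated (an assumption;
        # any fixed half of the coordinates would do for this sketch).
        sign = torch.ones(dim)
        sign[dim // 2:] = -1.0
        self.register_buffer("sign", sign)

    def context_vec(self, idx):
        # Context vector is the reflected word vector: c = D w.
        return self.word_emb(idx) * self.sign

    def forward(self, word_idx, ctx_idx, label):
        # Standard SGNS logistic loss on word-context dot products;
        # label is 1.0 for observed pairs, 0.0 for negative samples.
        w = self.word_emb(word_idx)
        c = self.context_vec(ctx_idx)
        score = (w * c).sum(dim=-1)
        return nn.functional.binary_cross_entropy_with_logits(score, label.float())

Because the reflection is fixed and orthogonal, only the word-embedding matrix is trained, so this tying halves the number of trainable embedding parameters relative to standard SGNS with separate word and context matrices.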
Description
https://arxiv.org/pdf/1902.09859.pdf
Citation
Assylbekov, Z., & Takhanov, R. (2019). Context Vectors are Reflections of Word Vectors in Half the Dimensions. Journal of Artificial Intelligence Research, 66, 225–242. https://doi.org/10.1613/jair.1.11368
Creative Commons license
Except where otherwise noted, this item's license is described as Attribution-NonCommercial-ShareAlike 3.0 United States
