From Frequency to Meaning: Vector Space Models of Semantics

Peter D. Turney and Patrick Pantel · Journal of Artificial Intelligence Research
Computers understand very little of the meaning of human language. This profoundly limits our ability to give instructions to computers, the ability of computers to explain their actions to us, and the ability of computers to analyze and process text. Vector space models (VSMs) of semantics are beginning to address these limits. This paper surveys the use of VSMs for semantic processing of text. We organize the literature on VSMs according to the structure of the matrix in a VSM. There are currently three broad classes of VSMs, based on term–document, word–context, and pair–pattern matrices, yielding three classes of applications.
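The survey's organizing object, the frequency matrix, can be illustrated with a minimal term–document sketch; the toy corpus, the matrix construction, and the cosine helper below are illustrative assumptions, not material from the paper:

```python
import math
from collections import Counter

# Toy corpus: each document is a list of tokens.
docs = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
    "stocks fell on the news".split(),
]

# Term-document matrix: one count vector (over the vocabulary) per document.
vocab = sorted({w for d in docs for w in d})
doc_vecs = [[Counter(d)[w] for w in vocab] for d in docs]

def cosine(u, v):
    """Cosine similarity between two count vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u) * sum(b * b for b in v))
    return dot / norm if norm else 0.0

# Documents with overlapping vocabulary get closer vectors.
sim_cats_dogs = cosine(doc_vecs[0], doc_vecs[1])
sim_cats_news = cosine(doc_vecs[0], doc_vecs[2])
```

Row vectors of the same matrix support term comparison; column vectors, as here, support document comparison — the two uses correspond to different application classes in the survey's taxonomy.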

Pattern-based methods for Improved Lexical Semantics and Word Embeddings
This dissertation shows that, despite their tremendous success, word embeddings suffer from several limitations, presents a set of pattern-based solutions to address these problems, and demonstrates that pattern-based methods can be superior.
Frege in Space: A Program for Composition Distributional Semantics
The lexicon of any natural language encodes a huge number of distinct word meanings. Just to understand this article, you will need to know what thousands of words mean.
Vector Space Models of Lexical Meaning
The meanings of words are represented as vectors in a high-dimensional "semantic space", using the mathematical framework of vector spaces and linear algebra.
Walking the Graph of Language : On a Framework for Meaning and Analogy
A computational framework is introduced for generating representations of linguistic concepts, automatically constructing from a corpus of language large graphs with words as vertices and conceptual connections as edges, and two main algorithms for the extraction of verbal analogs of word pairs are presented.
Semantic Spaces
The vector space model of semantics based on frequency matrices, as used in natural language processing, is considered, and the relation between vector space models and semantic spaces based on semic axes is formulated in terms of projectability of subvarieties in Grassmannians and projective spaces.
Distributional Semantics in the Real World: Building Word Vector Representations from a Truth-Theoretic Model
This paper inspects the properties of a distributional model built over a set-theoretic approximation of ‘the real world’ and concludes that, despite prior claims, truth-theoretic models are good candidates for building graded lexical representations of meaning.
Frege in Space: A Program of Compositional Distributional Semantics
The idea that word meaning can be approximated by the patterns of co-occurrence of words in corpora from statistical semantics and the idea that compositionality can be captured in terms of a syntax-driven calculus of function application from formal semantics are adopted.
Automatic Creation of a Semantic Network Encoding part_of Relations
The principles underlying the automatic creation of a semantic map to support navigation in a lexicon are described; the goal is to help authors (speakers/writers) overcome the tip-of-the-tongue problem (TOT) even in cases where other resources, including WordNet or Roget’s Thesaurus, would fail.
Composition in Distributional Models of Semantics
This article proposes a framework for representing the meaning of word combinations in vector space in terms of additive and multiplicative functions, and introduces a wide range of composition models that are evaluated empirically on a phrase similarity task.
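The additive and multiplicative composition functions named in the entry above can be sketched in a few lines; the word vectors below are hypothetical toy values, not data from the article:

```python
# Additive and multiplicative composition of word vectors
# (toy 3-dimensional vectors with hypothetical context counts).
def add_compose(u, v):
    """Sum composition: every context dimension of both words contributes."""
    return [a + b for a, b in zip(u, v)]

def mult_compose(u, v):
    """Elementwise product: emphasizes dimensions active in BOTH words."""
    return [a * b for a, b in zip(u, v)]

practical = [4.0, 1.0, 0.0]   # hypothetical context counts
difficulty = [3.0, 2.0, 1.0]

phrase_add = add_compose(practical, difficulty)
phrase_mult = mult_compose(practical, difficulty)
```

The multiplicative model zeroes out dimensions that either word lacks, which is one intuition for why it can act as a form of contextual disambiguation.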
A systematic evaluation of semantic representations in natural language processing
A method to create a representation for each sense of a polysemous word is provided, and it is observed that both approaches use context as the normalization factor.


Distributional Structure
This discussion describes how each language can be characterized in terms of a distributional structure, i.e. in terms of the occurrence of parts relative to other parts, and how this description is complete without intrusion of other features such as history or meaning.
Towards a Theory of Semantic Space
A theoretical framework for semantic space models is developed by synthesizing theoretical analyses from vector space information retrieval and categorical data analysis with new basic research.
The Descent of Hierarchy, and Selection in Relational Semantics
This paper explores the possibility of using an existing lexical hierarchy for the purpose of placing words from a noun compound into categories, and then using this category membership to determine the relation that holds between the nouns.
Corpus-based Learning of Analogies and Semantic Relations
An algorithm for learning from unlabeled text that can solve verbal analogy questions of the kind found in the SAT college entrance exam and is state-of-the-art for both verbal analogies and noun-modifier relations is presented.
Lexical chains as representations of context for the detection and correction of malapropisms
It is shown how lexical chains can be constructed by means of WordNet, and how they can be applied to one particular linguistic task: the detection and correction of malapropisms.
A Vector Model for Syntagmatic and Paradigmatic Relatedness
This paper introduces context digests, high-dimensional real-valued representations of the typical left and right contexts of a word, and applies them to identifying collocations, assessing the similarity of the arguments of different verbs, and clustering occurrences of adjectives and verbs according to the words they modify in context.
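Separate left- and right-context counts can be sketched as follows; this is a minimal illustration in the spirit of, but not identical to, the paper's context-digest construction, and the corpus is a toy assumption:

```python
from collections import Counter, defaultdict

# Toy corpus for collecting directional context counts.
corpus = "the cat sat on the mat and the dog sat on the rug".split()

# For each word, count its immediate left and right neighbors separately,
# so syntagmatic order is preserved (unlike a symmetric context window).
left = defaultdict(Counter)
right = defaultdict(Counter)
for i, w in enumerate(corpus):
    if i > 0:
        left[w][corpus[i - 1]] += 1
    if i < len(corpus) - 1:
        right[w][corpus[i + 1]] += 1

# "cat" and "dog" share the left context "the" and the right context "sat",
# which is the kind of evidence that makes them paradigmatically related.
```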
Using WordNet-based Context Vectors to Estimate the Semantic Relatedness of Concepts
A WordNet-based measure of semantic relatedness is introduced that combines the structure and content of WordNet with co-occurrence information derived from raw text, and that can make comparisons between any two concepts without regard to their part of speech.
Corpus-Based Statistical Sense Resolution
The three corpus-based statistical sense resolution methods studied here attempt to infer the correct sense of a polysemous word by using knowledge about patterns of word co-occurrences.
Dependency-Based Construction of Semantic Space Models
This article presents a novel framework for constructing semantic spaces that takes syntactic relations into account, and introduces a formalization for this class of models, which allows linguistic knowledge to guide the construction process.
Similarity-Based Models of Word Cooccurrence Probabilities
This work proposes a method for estimating the probability of previously unseen word combinations using available information on “most similar” words, describes probabilistic word association models based on distributional word similarity, and applies them to two tasks: language modeling and pseudo-word disambiguation.
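A similarity-weighted back-off of this general kind can be sketched as follows; the co-occurrence counts, neighbor sets, and similarity weights below are toy assumptions for illustration, not the paper's actual model or data:

```python
# Estimate P(w2 | w1) for a pair never seen together by averaging the
# conditional counts of w1's distributional neighbors, weighted by similarity.
def smoothed_prob(w1, w2, cooc, sims):
    """cooc[w][w2] -> co-occurrence count; sims[w1] -> {neighbor: weight}."""
    num = sum(wt * cooc[nbr].get(w2, 0) for nbr, wt in sims[w1].items())
    den = sum(wt * sum(cooc[nbr].values()) for nbr, wt in sims[w1].items())
    return num / den if den else 0.0

# Hypothetical counts and neighbors: "consume" itself has no observed pairs,
# so we back off to "eat" and "devour".
cooc = {"eat": {"bread": 3, "apples": 1}, "devour": {"bread": 1, "apples": 3}}
sims = {"consume": {"eat": 0.6, "devour": 0.4}}

p = smoothed_prob("consume", "bread", cooc, sims)
```

The weighted average lets frequent neighbors dominate the estimate in proportion to how distributionally similar they are to the unseen word.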