Publications
Representing word meaning and order information in a composite holographic lexicon.
TLDR
A computational model builds a holographic lexicon representing both word meaning and word order from unsupervised experience with natural language. A broad range of psychological data can be accounted for directly from the structure of lexical representations learned in this way, without building complexity into either the processing mechanisms or the representations.
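The binding operation at the heart of such holographic models is circular convolution, which compresses word-order information into the same fixed-width vector that stores meaning. A minimal sketch of that operation, assuming random Gaussian environment vectors and a simplified bigram binding (the published model uses a placeholder vector and n-gram bindings):

```python
import numpy as np

def cconv(x, y):
    # Circular convolution: the binding operation used to encode
    # word-order information in holographic memory models.
    # Computed efficiently in the frequency domain via the FFT.
    return np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(y)))

dim = 1024
rng = np.random.default_rng(0)
# Random Gaussian "environment" vector for each vocabulary word.
env = {w: rng.normal(0.0, 1.0 / np.sqrt(dim), dim) for w in ("the", "cat", "sat")}

# A word's composite memory trace sums context information
# (co-occurring words' vectors) with order information
# (convolved bindings of its neighbors).
trace_cat = env["the"] + env["sat"]          # context information
trace_cat += cconv(env["the"], env["sat"])   # simplified order binding
```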
Optimal foraging in semantic memory.
TLDR
Evidence is found for local structure in memory search and for patch depletion preceding dynamic local-to-global transitions between patches; dynamic models significantly outperform nondynamic models.
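The foraging framing treats memory search like an animal depleting food patches: search stays local within a semantic patch until returns diminish, then jumps globally to a new patch. A minimal sketch of a marginal-value-theorem-style leaving rule (the function and threshold logic are illustrative, not the paper's fitted model):

```python
def should_leave_patch(recent_retrieval_gains, global_mean_gain):
    # Marginal-value-theorem heuristic: abandon local (within-patch)
    # search once the local rate of return falls below the long-run
    # average rate of return across the whole search.
    local_rate = sum(recent_retrieval_gains) / len(recent_retrieval_gains)
    return local_rate < global_mean_gain
```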
More data trumps smarter algorithms: Comparing pointwise mutual information with latent semantic analysis
TLDR
This work evaluates a simple metric of pointwise mutual information and demonstrates that this metric benefits from training on extremely large amounts of data and correlates more closely with human semantic similarity ratings than do publicly available implementations of several more complex models.
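Pointwise mutual information measures how much more often two words co-occur than chance predicts: PMI(x, y) = log2( P(x, y) / (P(x) P(y)) ). A minimal sketch over sentence-level co-occurrence counts (the window definition and lack of smoothing are assumptions, not the paper's exact setup):

```python
import math
from collections import Counter
from itertools import combinations

def pmi(sentences, x, y):
    # PMI(x, y) = log2( P(x, y) / (P(x) * P(y)) ), with probabilities
    # estimated from sentence-level co-occurrence counts.
    n = len(sentences)
    word_counts, pair_counts = Counter(), Counter()
    for sent in sentences:
        words = set(sent)
        word_counts.update(words)
        pair_counts.update(frozenset(p) for p in combinations(words, 2))
    p_xy = pair_counts[frozenset((x, y))] / n
    if p_xy == 0:
        return float("-inf")
    return math.log2(p_xy / ((word_counts[x] / n) * (word_counts[y] / n)))
```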
The semantic richness of abstract concepts
TLDR
It is suggested that rich linguistic contexts (many semantic neighbors) facilitate early activation of abstract concepts, whereas concrete concepts benefit more from rich physical contexts (many associated objects and locations).
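One common way to operationalize "many semantic neighbors" is neighborhood density: the number of words whose vectors exceed a similarity threshold with the target. A minimal sketch, assuming precomputed word vectors (the threshold value is illustrative):

```python
import numpy as np

def semantic_neighbors(target, vectors, threshold=0.4):
    # Count words whose cosine similarity to the target exceeds a
    # threshold -- one simple index of semantic neighborhood density.
    t = vectors[target] / np.linalg.norm(vectors[target])
    count = 0
    for word, v in vectors.items():
        if word != target and t @ (v / np.linalg.norm(v)) > threshold:
            count += 1
    return count
```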
Redundancy in Perceptual and Linguistic Experience: Comparing Feature-Based and Distributional Models of Semantic Representation
TLDR
It is argued that the amount of perceptual and other semantic information that can be learned from purely distributional statistics has been underappreciated and that future focus should be on understanding the cognitive mechanisms humans use to integrate the two sources.
Perceptual Inference Through Global Lexical Similarity
TLDR
A model is proposed that uses the global structure of memory to exploit the redundancy between language and perception, generating inferred perceptual representations for words with which the model has no perceptual experience.
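The core mechanism can be sketched as similarity-weighted averaging: a word with no perceptual experience borrows the perceptual vectors of its linguistic neighbors, weighted by linguistic similarity. A minimal sketch under those assumptions (the names and weighting exponent are illustrative, not the paper's fitted parameters):

```python
import numpy as np

def infer_perceptual(word, ling_vecs, percep_vecs, power=3):
    # Infer a perceptual representation for `word` as the average of
    # known perceptual vectors, weighted by linguistic similarity
    # (raised to a power so that close neighbors dominate).
    t = ling_vecs[word] / np.linalg.norm(ling_vecs[word])
    num, denom = 0.0, 0.0
    for other, p in percep_vecs.items():
        v = ling_vecs[other]
        sim = max(t @ (v / np.linalg.norm(v)), 0.0)
        w = sim ** power
        num = num + w * p
        denom += w
    return num / denom
```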
The role of semantic diversity in lexical organization.
TLDR
This work demonstrates the importance of contextual redundancy in lexical access, suggesting that contextual repetitions in language only increase a word's memory strength if the repetitions are accompanied by a modulation in semantic context.
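Semantic diversity can be quantified as the variability of the contexts a word occurs in, for example the mean pairwise dissimilarity among its context vectors: repetitions in near-identical contexts add little, repetitions in varied contexts add more. A minimal sketch under that definition (the paper's exact measure may differ):

```python
import numpy as np
from itertools import combinations

def semantic_diversity(context_vectors):
    # Mean pairwise cosine *dissimilarity* among the contexts in which
    # a word appeared: higher values = more varied semantic contexts.
    unit = [v / np.linalg.norm(v) for v in context_vectors]
    dissims = [1.0 - (a @ b) for a, b in combinations(unit, 2)]
    return float(np.mean(dissims))
```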
Graph-Theoretic Properties of Networks Based on Word Association Norms: Implications for Models of Lexical Semantic Memory
TLDR
The results suggest that participants switch between a contextual representation and an associative network when generating free associations, and raise questions about the role that each of these representations may play in lexical semantic memory.
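Graph-theoretic analyses of association norms typically build a directed graph with cues and responses as nodes and production strengths as edge weights, then examine properties such as clustering and density. A minimal sketch with networkx (the toy norms are hypothetical):

```python
import networkx as nx

# Hypothetical free-association norms: cue -> list of (response, strength).
norms = {
    "cat": [("dog", 0.45), ("mouse", 0.12)],
    "dog": [("cat", 0.50), ("bone", 0.20)],
}

G = nx.DiGraph()
for cue, responses in norms.items():
    for response, strength in responses:
        G.add_edge(cue, response, weight=strength)

# Typical graph-theoretic properties examined for semantic networks:
print(nx.average_clustering(G.to_undirected()))
print(nx.density(G))
```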
Querying Word Embeddings for Similarity and Relatedness
TLDR
The usefulness of context embeddings is demonstrated in predicting asymmetric association between words from a recently published dataset of production norms, and it is suggested that humans respond with words closer to the cue in the context embedding space (rather than the word embedding space) when asked to generate thematically related words.
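Skip-gram-style training learns two vectors per word: a word (input) embedding and a context (output) embedding. The finding here is that word-word cosine tracks similarity, while word-context cosine better tracks thematic relatedness. A minimal sketch, assuming both matrices have been extracted from a trained model (the names W, C, and vocab are hypothetical):

```python
import numpy as np

def cosine(u, v):
    return u @ v / (np.linalg.norm(u) * np.linalg.norm(v))

# W and C are (vocab_size, dim) arrays of word and context embeddings;
# vocab maps each word to its row index.
def similarity(w1, w2, W, vocab):
    # Word-word cosine: captures similarity (substitutability).
    return cosine(W[vocab[w1]], W[vocab[w2]])

def relatedness(w1, w2, W, C, vocab):
    # Word-context cosine: captures thematic relatedness / association.
    return cosine(W[vocab[w1]], C[vocab[w2]])
```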
The Words Children Hear
TLDR
The text of picture books may be an important source of vocabulary for young children, and these findings suggest a mechanism that underlies the language benefits associated with reading to children.