The Latent Structure of Dictionaries

@article{VincentLamarre2016TheLS,
  title={The Latent Structure of Dictionaries},
  author={Philippe Vincent-Lamarre and Alexandre Blondin Mass{\'e} and Marcos Lopes and M{\'e}lanie Lord and Odile Marcotte and Stevan Harnad},
  journal={Topics in Cognitive Science},
  year={2016},
  volume={8},
  number={3},
  pages={625--659}
}

How many words, and which ones, are sufficient to define all other words? When dictionaries are analyzed as directed graphs with links from defining words to defined words, they reveal a latent structure. Recursively removing all words that are reachable by definition but that do not define any further words reduces the dictionary to a Kernel of about 10% of its size. This is still not the smallest number of words that can define all the rest. About 75% of the Kernel turns out to be its Core, a…
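
The pruning step the abstract describes is straightforward to sketch. Below is a minimal illustration, not the authors' code, on a hypothetical toy dictionary: edges run from each defining word to the word it helps define, words that are defined but define nothing further are removed repeatedly, and the largest strongly connected component of what remains stands in for the Core. The word list and the networkx-based implementation are assumptions made only for the example.

import networkx as nx

# Hypothetical toy dictionary: each word maps to the words used in its definition.
toy_dictionary = {
    "thing":  ["object"],
    "object": ["thing"],
    "living": ["thing", "alive"],
    "alive":  ["living"],
    "animal": ["living", "thing"],
    "dog":    ["animal"],
    "puppy":  ["dog"],
}

# Directed graph with links from defining words to defined words.
g = nx.DiGraph()
for word, definition in toy_dictionary.items():
    for defining_word in definition:
        g.add_edge(defining_word, word)

# Recursively remove words that are reachable by definition (incoming link)
# but do not define any further words (no outgoing link).
while True:
    dead_ends = [w for w in g if g.out_degree(w) == 0 and g.in_degree(w) > 0]
    if not dead_ends:
        break
    g.remove_nodes_from(dead_ends)

kernel = set(g)  # what remains plays the role of the Kernel
# The largest strongly connected component stands in for the Core.
core = max(nx.strongly_connected_components(g), key=len) if kernel else set()
print("Kernel:", kernel)  # {'thing', 'object', 'living', 'alive'}
print("Core:", core)      # one of the mutually defining pairs in this toy graph

On a real dictionary graph, the same loop is what reduces the vocabulary to roughly 10% of its size; the toy example only shows the mechanics.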
Citations

The Dictionary Game: Toward a Characterization of Lexical Primitives Using Graph Theory and Relational Concept Analysis
In language theory and cognition, the search for a minimal set of language primitives from which every other concept could be defined is an ever-recurring topic. In order to help identify such…
The Active Learner’s Construction-Combinatory Thesaurus: user-driven principles of compiling (a cognitive linguistic approach)
A cognitive profile of the dictionary's target user is proposed and made the departure point for elaborating the principles of compiling the ALCCT, an ideographic dictionary intended for adult learners of a second (foreign) language.
Attention-based Neural Networks Encode Aspects of Human Word Sense Knowledge
How humans understand variation in a word's senses is key to explaining the structure of the lexicon, but formal models categorizing word senses, like WordNet (Miller et al., 1990), do not capture this…
Evaluating models of robust word recognition with serial reproduction
A suite of probabilistic generative language models is evaluated against the resulting chains of utterances, and it is found that those models that make use of abstract representations of preceding linguistic context best predict the changes made by people in the course of serial reproduction.
Connecting concepts in the brain by mapping cortical representations of semantic relations
A predictive model of the voxel-wise response of the default mode network is developed and applied to thousands of new words, suggesting that both semantic categories and relations are represented by spatially overlapping cortical patterns, instead of anatomically segregated regions.
Discovering Psychological Principles by Mining Naturally Occurring Data Sets
The goal of this issue is to present exemplary case studies of mining naturally occurring data sets to reveal important principles and phenomena in cognitive science, and to discuss some of the underlying issues involved in conducting traditional experiments, analyses of naturally occurring data, computational modeling, and the synthesis of all three methods.
Connecting Concepts in the Brain by Mapping Cortical Representations of Semantic Relations
It is concluded that the human brain uses distributed networks to encode not only concepts but also relationships between concepts, and that the default mode network plays a central role in semantic processing for the abstraction of concepts.
When words are upside down: Language-space associations in children and adults.
It is concluded that sensorimotor experiences in the spatial domain seem to play a role during meaning processing from early childhood onward, a finding crucial for embodied models of language comprehension.
Meta-Topics in Behavioral Data and Its Analysis
OVERVIEW: This semester's Meta-Topics in Behavioral Data and Its Analysis is the third in its series. The first debuted in Fall 2013 and the second in Fall 2016. Like the first two, we will not…

References

Showing 1-10 of 97 references
Hidden Structure and Function in the Lexicon
Graph-theoretic analysis reveals that about 10% of a dictionary is a unique Kernel of words that define one another and all the rest, but this is not the smallest such subset; the Kernel contains many overlapping Minimal Grounding Sets.
The Large-Scale Structure of Semantic Networks: Statistical Analyses and a Model of Semantic Growth
A simple model for semantic growth is described, in which each new word or concept is connected to an existing network by differentiating the connectivity pattern of an existing node, which generates appropriate small-world statistics and power-law connectivity distributions.
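
As a rough sketch of how such a differentiation-based growth process could look (my own simplification, not the cited paper's exact model; the seed network, the uniform choice of node to differentiate, and the number of connections per new node are all assumptions):

import random
import networkx as nx

def grow_semantic_network(n_nodes, m_connections=3, seed=0):
    """Grow a network by repeatedly 'differentiating' an existing node:
    each new node attaches to a sample of a chosen node's neighborhood."""
    rng = random.Random(seed)
    g = nx.Graph()
    g.add_edges_from([(0, 1), (1, 2), (2, 0)])  # small seed network (assumed)
    for new_node in range(3, n_nodes):
        target = rng.choice(list(g.nodes()))                 # node to differentiate
        neighborhood = list(g.neighbors(target)) + [target]  # its connectivity pattern
        k = min(m_connections, len(neighborhood))
        for neighbor in rng.sample(neighborhood, k):
            g.add_edge(new_node, neighbor)                   # copy part of the pattern
    return g

g = grow_semantic_network(1000)
print(nx.average_clustering(g), max(d for _, d in g.degree()))

In this kind of copying process, attaching to a node's neighbors indirectly favors already well-connected words, which is the usual route to the heavy-tailed degree distributions the summary mentions.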
From senses to texts: An all-in-one graph-based approach for measuring semantic similarity
The method first leverages the structural properties of a semantic network in order to model arbitrary linguistic items through a unified probabilistic representation, and then compares the linguistic items in terms of their representations.
What can graph theory tell us about word learning and lexical retrieval?
  M. Vitevitch · Journal of Speech, Language, and Hearing Research (JSLHR), 2008
Graph theory and the new science of networks were applied to the mental lexicon to examine the organization of words in the lexicon and to explore how that structure might influence the acquisition and retrieval of phonological word-forms.
How Is Meaning Grounded in Dictionary Definitions?
The concept of a reachable set is introduced: a larger vocabulary whose meanings can be learned from a smaller vocabulary through definition alone, as long as the meanings of the smaller vocabulary are themselves already grounded.
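
As a rough illustration of that notion (my reading of the summary, not the cited paper's code), the sketch below grows a reachable set from a hypothetical grounded vocabulary: a word is added once every word in its definition is already grounded or reachable, and the process repeats until nothing more can be added. The toy dictionary and the choice of grounded words are assumptions made for the example.

# Hypothetical toy dictionary: each word maps to the words used in its definition.
toy_dictionary = {
    "thing":  ["object"],
    "object": ["thing"],
    "living": ["thing", "alive"],
    "alive":  ["living"],
    "animal": ["living", "thing"],
    "dog":    ["animal"],
    "puppy":  ["dog"],
}

def reachable_set(grounded, dictionary):
    """Expand a grounded vocabulary with every word whose definition
    uses only words that are already grounded or reachable."""
    reachable = set(grounded)
    changed = True
    while changed:
        changed = False
        for word, definition in dictionary.items():
            if word not in reachable and all(d in reachable for d in definition):
                reachable.add(word)
                changed = True
    return reachable

# Grounding {"thing", "alive"} makes every other word in this toy dictionary
# learnable through definitions alone.
print(reachable_set({"thing", "alive"}, toy_dictionary))

A smallest grounded vocabulary whose reachable set covers the whole dictionary is exactly the kind of Minimal Grounding Set mentioned in "Hidden Structure and Function in the Lexicon" above.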
Symbol Grounding and the Origin of Language: From Show to Tell
Organisms' adaptive success depends on being able to do the right thing with the right kind of thing. This is categorization. Most species can learn categories (1) by direct experience ("induction")…
Examining assortativity in the mental lexicon: Evidence from word associations
The results show that the extent to which these aspects exhibit assortativity varies considerably, with a high cue-response correspondence on valence, dominance, arousal, concreteness, and part of speech, indicating that these factors correspond to the words people deem as related.
Word associations: Network and semantic properties
A number of properties of word associations generated in a continuous task were investigated, including the correspondence of word class between association cues and responses, which indicated a dominant paradigmatic response style.
Frequency of word-use predicts rates of lexical evolution throughout Indo-European history
It is proposed that the frequency with which specific words are used in everyday language exerts a general and law-like influence on their rates of evolution, consistent with social models of word change that emphasize the role of selection, and suggesting that, owing to the ways humans use language, some words will evolve slowly and others rapidly across all languages.
The impact of word prevalence on lexical decision times: Evidence from the Dutch Lexicon Project 2.
It is argued that word prevalence is likely to be the most important new variable protecting researchers against experimenter bias in selecting stimulus materials, and that the unique variance it contributes to lexical decision times is higher than that of the other variables.