The Emergence of Quantifiers

@inproceedings{Pauw2012TheEO,
  title={The Emergence of Quantifiers},
  author={Simon Pauw and Joseph Hilferty},
  year={2012}
}
Human natural languages use quantifiers to designate the number of objects in a set. They include numerals, such as "three", and circumscriptions, such as "a few". The latter are not only underdetermined but also context dependent. We provide a cultural-evolution explanation for the emergence of such quantifiers, focusing in particular on the role of environmental constraints on strategy choices. Through a series of situated interaction experiments, we show how a community of robotic… 
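To make the context dependence concrete, the following is a minimal sketch (my own illustration, not the paper's model) in which a vague quantifier such as "a few" is interpreted relative to the size of the current scene, so the same word covers different counts in different contexts:

# Hypothetical sketch of a context-dependent reading of "a few": the upper bound
# scales with the number of objects currently in the scene, so the same word can
# cover different counts in different contexts.

def a_few(count: int, scene_size: int) -> bool:
    """True if `count` objects would plausibly be called "a few" in a scene
    containing `scene_size` objects (illustrative heuristic only)."""
    lower = 2                                 # "a few" usually means more than one
    upper = max(2, round(0.3 * scene_size))   # and well below "most" of the scene
    return lower <= count <= upper

print(a_few(3, scene_size=10))  # True: 3 out of 10 reads as "a few"
print(a_few(3, scene_size=4))   # False: 3 out of 4 is closer to "most"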
Emergence of the Split Goal Marking System in a Population of Simulated Agents
Languages generally prefer not to employ overt marking on motion endpoints (Goals), with this tendency being most significant for toponyms. An earlier attempt to explain this fact is incompatible
Environmental constraints in the evolution of scalar concepts: Road to most
TLDR
It is argued that such signals might have evolved as stable semantic units through adaptation to general communicative principles and distributional properties of the environment such as normality.
Models of language evolution and change.
TLDR
Computational simulations have been at the heart of the field of evolutionary linguistics for the past two decades, but these are now being extended and complemented in a number of directions, through formal mathematical models, language-ready robotic agents, and experimental simulations in the laboratory.
Co-Acquisition of Syntax and Semantics - An Investigation in Spatial Language
TLDR
This paper shows how a learner robot can learn to produce and interpret spatial utterances in guided-learning interactions with a tutor robot (equipped with a system for producing English spatial phrases), and shows promising results towards long-term, incremental acquisition of natural language in a process of co-development of syntax and semantics.
Monotone Quantifiers Emerge via Iterated Learning
TLDR
It is shown that quantifiers satisfying the monotonicity universal evolve reliably in an iterated learning paradigm with neural networks as agents.
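The monotonicity universal can be stated and tested mechanically; the sketch below (an illustration under my own set-theoretic encoding, not the paper's neural-network setup) checks upward monotonicity in the scope argument for a few generalized quantifiers over a small finite domain:

# Illustrative check (not the paper's code): a quantifier, seen as a relation
# between a restrictor set A and a scope set B over a small domain, is upward
# monotone in its scope if Q(A, B) and B being a subset of B' imply Q(A, B').

from itertools import combinations

DOMAIN = frozenset(range(4))

def powerset(s):
    s = list(s)
    return [frozenset(c) for r in range(len(s) + 1) for c in combinations(s, r)]

def every(A, B):       return A <= B
def some(A, B):        return bool(A & B)
def exactly_two(A, B): return len(A & B) == 2

def is_upward_monotone(Q):
    subsets = powerset(DOMAIN)
    return all(Q(A, B2)
               for A in subsets
               for B in subsets
               for B2 in subsets
               if B <= B2 and Q(A, B))

print(is_upward_monotone(every))        # True
print(is_upward_monotone(some))         # True
print(is_upward_monotone(exactly_two))  # False: enlarging B can break "exactly two"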
Language Emergence in a Population of Artificial Agents Equipped with the Autotelic Principle
TLDR
This article shows how agents provided with the autotelic principle, a system by which agents can regulate their own development, progressively develop an emerging language evolving from one word to multi-word utterances, increasing its discriminative power.
Language Grounding in Robots
TLDR
A Perceptual System for Language Game Experiments and Grounded Internal Body Models for Communication.
Learning the Semantics of Natural Language Quantifiers
TLDR
A solution to the problem is given, and a coordination mechanism capable of handling inconsistent samples of language use and various social influences between communicating speakers is defined and applied to learning the semantics of upward-monotone proportional quantifiers.
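As a toy rendering of the target class (a sketch under my own simplification, not the paper's coordination mechanism), an upward-monotone proportional quantifier such as "most" can be modeled as a threshold on the proportion |A ∩ B| / |A|, and that threshold can be estimated from noisy, partly inconsistent samples of use:

# Toy sketch (hypothetical, not the paper's mechanism): estimate the threshold of
# an upward-monotone proportional quantifier like "most" from noisy samples.
# Each sample is (observed proportion, whether the quantifier was used).

def estimate_threshold(samples, candidates=None):
    """Pick the candidate threshold that misclassifies the fewest samples,
    tolerating inconsistent uses by different speakers."""
    if candidates is None:
        candidates = [i / 100 for i in range(101)]
    def errors(t):
        return sum((p >= t) != used for p, used in samples)
    return min(candidates, key=errors)

samples = [
    (0.9, True), (0.75, True), (0.6, True),
    (0.55, False),                       # borderline, inconsistent speaker
    (0.4, False), (0.2, False), (0.1, False),
]
print(estimate_threshold(samples))       # 0.56: smallest threshold fitting all samples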
Open-ended Procedural Semantics
TLDR
This chapter introduces the computational infrastructure used to bridge the gap between results from sensorimotor processing and language; it contains mechanisms for finding networks, chunking subnetworks for more efficient later reuse, and completing partial networks.
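At a schematic level, procedural semantics treats a meaning as a small program of operations executed against the perceived scene; the sketch below is only an illustration of that idea (operation names and data are hypothetical, not the chapter's infrastructure):

# Schematic illustration (hypothetical names, not the chapter's infrastructure):
# the meaning of "the big red one" as a chain of cognitive operations applied to
# the perceived scene.

scene = [
    {"id": 1, "color": "red",  "size": 0.9},
    {"id": 2, "color": "red",  "size": 0.3},
    {"id": 3, "color": "blue", "size": 0.8},
]

def filter_by_color(objects, color):
    return [o for o in objects if o["color"] == color]

def pick_largest(objects):
    return max(objects, key=lambda o: o["size"])

meaning = [                      # a tiny "network" of chained operations
    (filter_by_color, {"color": "red"}),
    (pick_largest, {}),
]

result = scene
for operation, args in meaning:
    result = operation(result, **args)
print(result["id"])              # 1: the large red object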
From Continuous Observations to Symbolic Concepts: A Discrimination-Based Strategy for Grounded Concept Learning
TLDR
This paper introduces a novel methodology for grounded concept learning that allows for incremental learning, requires few data points, and yields concepts general enough to be applied to previously unseen objects and to be combined compositionally.
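A discrimination-based strategy can be reduced, for illustration, to finding a perceptual channel on which the topic object differs from everything else in the context; the sketch below is my own minimal reduction, not the paper's implementation:

# Minimal illustration (not the paper's implementation) of discrimination-based
# concept formation: find a channel and interval that separate the topic from
# every other object in the context.

def discriminating_channel(topic, context, channels, margin=0.1):
    """Return (channel, lower, upper) covering the topic's value but no other
    object's value, or None if no single channel discriminates."""
    for ch in channels:
        value = topic[ch]
        lower, upper = value - margin, value + margin
        others = [o[ch] for o in context if o is not topic]
        if all(not (lower <= v <= upper) for v in others):
            return ch, lower, upper
    return None

context = [
    {"hue": 0.10, "size": 0.50},   # topic
    {"hue": 0.12, "size": 0.90},
    {"hue": 0.80, "size": 0.75},
]
print(discriminating_channel(context[0], context, channels=["hue", "size"]))
# hue fails (0.12 is too close to 0.10), but size works: roughly ('size', 0.4, 0.6)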
...

References

Open-ended semantics co-evolving with spatial language
TLDR
A particular semantic modeling approach is introduced, together with the coupling of conceptual structures to the language system, and it is shown how these systems play together in the evolution of spatial language using humanoid robots.
Talking about quantities in space: Vague quantifiers, context and similarity
TLDR
It is shown that the number of other objects in a scene impacts upon quantifier judgements even when those objects are in a different category from the focus objects.
Embodied determiners
TLDR
This paper contrasts the traditional approach with a new approach, called Clustering Determination, which is heavily inspired by research on the grounding of sensorimotor categories, and shows that the new approach performs better in noisy, real-world referential communication.
PLANNING WHAT TO SAY: Second Order Semantics for Fluid Construction Grammars
Research in the origins and evolution of language has now reached a level where languages with grammatical structures are emerging in computer simulations and robotic experiments based on situated
CONSTRAINT BASED COMPOSITIONAL SEMANTICS
This paper presents a computational system that handles the grounding, the formation, the interpretation and the conceptualisation of rich, compositional meaning for use in grounded, multi-agent
Universals in semantics
This article surveys the state of the art in the field of semantic universals. We examine potential semantic universals in three areas: (i) the lexicon, (ii) semantic "glue" (functional
Basic objects in natural categories
Perspective alignment in spatial language
TLDR
A series of robotic experiments shows which cognitive mechanisms are necessary and sufficient to achieve successful spatial language, and why and how perspective alignment can take place, either implicitly or based on explicit marking.
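As a purely geometric illustration of what perspective alignment involves (a simplified sketch, not the experiments' actual processing), interpreting a phrase like "the ball to my left" requires re-expressing world coordinates in the speaker's egocentric frame, i.e. a translation plus a rotation:

# Simplified geometric sketch (not the experiments' code): re-express an object's
# world position in an agent's egocentric (forward, left) frame by translating to
# the agent's location and rotating by its heading.

import math

def to_egocentric(obj_xy, agent_xy, agent_heading):
    dx, dy = obj_xy[0] - agent_xy[0], obj_xy[1] - agent_xy[1]
    cos_h, sin_h = math.cos(agent_heading), math.sin(agent_heading)
    forward = dx * cos_h + dy * sin_h
    left = -dx * sin_h + dy * cos_h
    return forward, left

ball = (2.0, 0.0)
speaker_pos, speaker_heading = (0.0, 0.0), 0.0   # at the origin, facing +x
hearer_pos, hearer_heading = (2.0, -2.0), 0.0    # south of the ball, facing +x

print(to_egocentric(ball, speaker_pos, speaker_heading))  # (2.0, 0.0): straight ahead
print(to_egocentric(ball, hearer_pos, hearer_heading))    # (0.0, 2.0): directly to its left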
Open-ended Grounded Semantics
TLDR
Recent progress is presented in modeling open-ended, grounded semantics through a unified software system that addresses problems of uncertainty and ambiguity in transmission.
...