A Perceptual System for Language Game Experiments

  • Michael Spranger, Martin Loetzsch, Luc L. Steels
  • Language Grounding in Robots
This chapter describes a visual perception system that serves as a key component for language game experiments on physical robots. The vision system segments the continuous flow of incoming visual stimuli and computes a variety of features for each segment. It does so through a combination of bottom-up processing, which operates on the incoming signal, and top-down processing, which draws on expectations about what was seen before and on objects stored in memory.

Grounded Spatial Language — An Integrated AI Research Program

This paper details recent progress in modelling and understanding the processing, acquisition and evolution of grounded spatial language, and summarises key insights from the project.

A Practical Guide to Studying Emergent Communication through Grounded Language Games

A high-level robot interface that extends the Babel software system is introduced, presenting for the first time a toolkit that provides flexible modules for dealing with each subtask involved in running advanced grounded language game experiments.

Object Learning with Natural Language in a Distributed Intelligent System — A Case Study of Human-Robot Interaction

This paper describes a system to teach a robot, based on a dialogue in natural language about its real environment in real time, which integrates a fast object recognition method for the NAO humanoid robot and a hybrid ensemble learning mechanism.

Robust Natural Language Processing - Combining Reasoning, Cognitive Semantics, and Construction Grammar for Spatial Language

A system for generating and understanding dynamic and static spatial relations in robotic interaction setups is presented that robustly handles visual perception errors, language omissions and ungrammatical utterances.

Language Grounding in Robots

A Perceptual System for Language Game Experiments and Grounded Internal Body Models for Communication.

The Talking Heads experiment

The motivation, the cognitive mechanisms used by the agents, the various installations of the Talking Heads, the experimental results that were obtained, and the interaction with humans are described.

Dealing with Perceptual Deviation: Vague Semantics for Spatial Language and Quantification

Two case studies for spatial language and quantification are presented that show how cognitive operations – the building blocks of grounded procedural semantics – can be efficiently grounded in sensorimotor spaces.

Grounded lexicon acquisition — Case studies in spatial language

  • Michael Spranger
  • Linguistics
    2013 IEEE Third Joint International Conference on Development and Learning and Epigenetic Robotics (ICDL)
  • 2013
This paper identifies how various spatial language systems, such as projective, absolute and proximal systems, can be learned, and shows how multiple systems can be acquired at the same time.

The evolution of grounded spatial language

This book presents groundbreaking robotic experiments on how and why spatial language evolves, providing detailed explanations of the origins of spatial conceptualization strategies.

Incremental grounded language learning in robot-robot interactions — Examples from spatial language

  • Michael Spranger
  • Linguistics
    2015 Joint IEEE International Conference on Development and Learning and Epigenetic Robotics (ICDL-EpiRob)
  • 2015
This paper reports on models of the grounded co-acquisition of the syntax and semantics of locative spatial language in developmental robots, and on how a learner robot can learn to produce and interpret spatial utterances in guided-learning interactions with a tutor robot.

A feature-integration theory of attention

Visual indexes, preconceptual objects, and situated vision

Grounding language in perception

It is suggested that the notions of support, contact and attachment are crucial to specifying many simple spatial motion event types, and a logical notation for describing classes of events that incorporates these notions as primitives is presented.

Principles of Object Perception

Findings suggest that a general representation of object unity and boundaries is interposed between representations of surfaces and representations of objects of familiar kinds, related to processes of physical reasoning.

Self-Taught Visually-Guided Pointing for a Humanoid Robot

A system is presented which performs a fundamental visuomotor coordination task on the humanoid robot Cog; it requires subsystems for learning to saccade to visual targets, generating smooth arm trajectories, locating the arm in the visual field, and learning the map between gaze direction and the correct pointing configuration of the arm.

Deictic codes for the embodiment of cognition

Deictic computation provides a mechanism for representing the essential features that link external sensory data with internal cognitive programs and motor actions, and this target article focuses on how deictic bindings make it possible to perform natural tasks.

Learning to Interpret Pointing Gestures: Experiments with Four-Legged Autonomous Robots

  • V. Hafner, F. Kaplan
  • Psychology, Biology
    Biomimetic Neural Learning for Intelligent Robots
  • 2005
This chapter presents an experiment in which a robot learns to interpret the pointing gestures of another robot, and shows that simple feature-based neural learning techniques permit reliable discrimination between left and right pointing gestures.