Can Body Language Shape Body Image?

Luc L. Steels and Michael Spranger. In IEEE Symposium on Artificial Life.
One of the central themes in autonomous robot research concerns the question of how visual images of others' body movements can be interpreted and related to one's own body movements and to the language describing them. The discovery of mirror neurons has shown that there are brain circuits which become active both in the perception and the re-enactment of bodily gestures, although it remains unclear how these circuits form, i.e. how neurons become mirror neurons. We report…

The robot in the mirror

Robots play a situated embodied language game called the Action Game in which they ask each other to perform bodily actions and establish and continuously adapt networks linking perception, body representation, action and language.

Emergent mirror systems for body language

This chapter investigates how a vocabulary for talking about body actions can emerge in a population of grounded autonomous agents instantiated as humanoid robots. The agents play a Posture Game

Emergent Action Language on Real Robots

This chapter describes evolutionary language game experiments on real robots, exploring how these competences originate and can be carried out and acquired, using a whole-systems approach.

A Formal Approach to Social Learning: Exploring Language Acquisition Through Imitation

The topic of this thesis is learning through social interaction, consisting of experiments that focus on word acquisition through imitation, and a formalism aiming to provide stronger theoretical

Learning Words by Imitating

This chapter proposes a single imitation-learning algorithm capable of simultaneously learning linguistic as well as nonlinguistic tasks, without demonstrations being labeled. A human demonstrator

The Action Game: A computational model for learning repertoires of goals and vocabularies to express them in a population of agents

A computational model is introduced which illustrates how a population of agents can coordinate a vocabulary for goal-oriented behavior through repeated local interactions called "Action Games". The model shows that shared vocabularies can only be learned from multiple demonstrations consisting of exactly those actions that are strictly required to reach the goal.

The development of a robust symbolic communication system for robots via embodied iterated imitation

This research introduces multiple cycles of iterated imitation, during which the symbols of the language evolve and adapt to the uncertainties in the robots' sensors and actuators, and proposes a model that explains how structural syntax emerges and evolves in a symbolic communication system.

The Talking Heads experiment

This article describes the motivation, the cognitive mechanisms used by the agents, the various installations of the Talking Heads, the experimental results obtained, and the interaction with humans.

Computer vision, geometric reasoning and graphics

The aim of this paper is to present the most important vision-related topics that bear directly on robotics research.

General Language Evolution in General Game Playing

This paper extends GGP with language evolution to develop a general language generation technique that can be combined with GGP algorithms for incomplete-information games, helping players automatically generate a common language to solve cooperation problems.



How the body shapes the way we think - a new view on intelligence

In How the Body Shapes the Way We Think, Rolf Pfeifer and Josh Bongard demonstrate that thought is not independent of the body but is tightly constrained, and at the same time enabled, by it.

Adaptive body schema for robotic tool-use

It is argued that the temporal integration of multisensory information is a plausible candidate mechanism to explain how manipulated objects can become incorporated into the body schema. This suggests that synthetic modeling is not only a valid avenue toward a better grasp of results from neuropsychology and neurophysiology, but also a powerful approach to building advanced tool-using robots.

Emergence of Mirror Neurons in a Model of Gaze Following

A computational model is presented of how human infants or other agents may acquire gaze following through reinforcement learning, by learning to predict the locations of interesting sights from the looking behavior of other agents.

Embodied meaning in a neural theory of language

Neural Simulation of Action: A Unifying Mechanism for Motor Cognition

This work develops the hypothesis that the motor system is part of a simulation network activated under a variety of conditions in relation to action, whether self-intended or observed in other individuals.

Premotor cortex and the recognition of motor actions.

From Molecule to Metaphor - A Neural Theory of Language

Jerome Feldman proposes a theory of language and thought that treats language not as an abstract symbol system but as a human biological ability that can be studied as a function of the brain, as vision and motor control are studied.

Language within our grasp

Imitation: a means to enhance learning of a synthetic protolanguage in autonomous robots

The sharing of a similar perceptual context between imitator and imitatee creates a meaningful social context onto which language, that is, a common means of symbolic communication, can develop.