Language Grounding in Robots

@book{Steels2012LanguageGI,
  title={Language Grounding in Robots},
  editor={Luc L. Steels and Manfred Hild},
  publisher={Springer US},
  year={2012}
}
  • Grounding Language through Evolutionary Language Games
  • Myon, a New Humanoid
  • Neural Implementation of Behavior Control
  • Evolving Humanoid Behaviors for Language Games
  • A Perceptual System for Language Game Experiments
  • Posture Recognition Based on Slow Feature Analysis
  • Grounded Internal Body Models for Communication: Integration of Sensory and Motor Spaces for Mediating Conceptualization
  • Open-ended Procedural Semantics
  • Dealing with Perceptual Deviation: Vague Semantics for Spatial…

Evolving Humanoid Behaviors for Language Games

A modular approach to neural control is taken, supported by a corresponding evolutionary algorithm, such that complete neural control networks are composed of specific functional units, the so-called neuro-modules.

Crossmodal Language Grounding, Learning, and Teaching

This position paper presents vision- and action-embodied language learning research as part of a project investigating multi-modal learning. It embeds this approach of internally grounding embodied experience and externally teaching abstract experience into the developmental robotics paradigm by developing and employing a neurorobot capable of multisensory perception and interaction.

Fluid Construction Grammar on Real Robots

This chapter briefly introduces the framework and tools for lexical and grammatical processing that have been used in the evolutionary language game experiments reported in this book.

Figurative Language Grounding in Humanoid Robots

A method is presented for representing idioms (in particular, colour-related idioms) within the framework of Fluid Construction Grammar (FCG), to facilitate (figurative) language grounding in humanoid robots in the near future.

Grounding Language for Interactive Task Learning

This paper describes how language is grounded by a comprehension system called Lucia within a robotic agent called Rosie that can manipulate objects and navigate indoors and uses Embodied Construction Grammar (ECG) as a formalism for describing linguistic knowledge.

Sensorimotor input as a language generalisation tool: a neurorobotics model for generation and generalisation of noun-verb combinations with sensorimotor inputs

This study demonstrates its generalisation capability using a large data-set, with which the robot was able to generalise semantic representation of novel combinations of noun-verb sentences, and therefore produce the corresponding motor behaviours.

Dealing with Perceptual Deviation: Vague Semantics for Spatial Language and Quantification

Two case studies for spatial language and quantification are presented that show how cognitive operations – the building blocks of grounded procedural semantics – can be efficiently grounded in sensorimotor spaces.

Embodied Language Learning and Cognitive Bootstrapping: Methods and Design Principles

The purpose of this article is to bring together diverse but complementary accounts of research methods that jointly contribute to the understanding of cognitive development and in particular, language acquisition in robots.

Situated natural language interaction in uncertain and open worlds

Most natural language enabled robots rely on highly scripted interactions, keyword spotting, and shallow natural language processing techniques to achieve the desired behavior, which may be restricted to a small class of tasks.

Emergent Action Language on Real Robots

This chapter describes evolutionary language game experiments exploring how these competences originate and how they can be carried out and acquired by real robots, using a whole-systems approach.
...

References

Showing 1-10 of 165 references

Grounding Language through Evolutionary Language Games

  • L. Steels
  • Computer Science
    Language Grounding in Robots
  • 2012
This chapter introduces a new experimental paradigm for studying issues in the grounding of language in robots, and the integration of all aspects of intelligence into a single system.

The Origins of Syntax in Visually Grounded Robotic Agents

Emergent mirror systems for body language

This chapter investigates how a vocabulary for talking about body actions can emerge in a population of grounded autonomous agents instantiated as humanoid robots. The agents play a Posture Game…

The Emergence of Grammar in Communicating Autonomous Robotic Agents

Over the past five years, the topic of the origins of language has been gaining prominence as one of the big unresolved questions of cognitive science. Artificial Intelligence can make a major contribution…

Dealing with Perceptual Deviation: Vague Semantics for Spatial Language and Quantification

Two case studies for spatial language and quantification are presented that show how cognitive operations – the building blocks of grounded procedural semantics – can be efficiently grounded in sensorimotor spaces.

Open-ended semantics co-evolving with spatial language

A particular semantic modeling approach is introduced, along with the coupling of conceptual structures to the language system, and it is shown how these systems play together in the evolution of spatial language using humanoid robots.

The robot in the mirror

Robots play a situated embodied language game called the Action Game in which they ask each other to perform bodily actions and establish and continuously adapt networks linking perception, body representation, action and language.

Open-ended Grounded Semantics

Recent progress in modeling open-ended, grounded semantics through a unified software system that addresses problems of uncertainty and ambiguity in transmission is presented.

Learning to Interpret Pointing Gestures: Experiments with Four-Legged Autonomous Robots

  • V. Hafner, F. Kaplan
  • Psychology, Biology
    Biomimetic Neural Learning for Intelligent Robots
  • 2005
This chapter presents an experiment in which a robot learns to interpret pointing gestures of another robot, and shows that simple feature-based neural learning techniques permit reliable discrimination between left and right pointing gestures.

Imitation and mechanisms of joint attention: a developmental structure for building social skills on a humanoid robot

Adults are extremely adept at recognizing social cues, such as eye direction or pointing gestures, that establish the basis of joint attention. These skills serve as the developmental basis for more…
...