We present an integrated human-robot interaction system that enables a user to select and command a team of two Unmanned Aerial Vehicles (UAVs) using voice, touch, face engagement, and hand gestures. The system integrates multiple human multi-robot interaction interfaces with a navigation and mapping algorithm in a coherent semi-realistic scenario. The task of the UAVs is to explore and map a simulated Mars environment.
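Selecting a robot first and then issuing a command through any modality suggests a simple two-stage interaction flow. The sketch below illustrates that idea only; the class, method names, and select-then-command structure are assumptions for illustration, not the authors' implementation.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class InteractionManager:
    """Hypothetical fusion of modality events into UAV commands."""
    selected_uav: Optional[str] = None  # UAV chosen e.g. via face engagement or touch

    def on_selection(self, uav_id: str) -> None:
        # Stage 1: the user engages or taps a UAV to select it.
        self.selected_uav = uav_id

    def on_command(self, modality: str, command: str) -> Optional[Tuple[str, str, str]]:
        # Stage 2: a voice utterance or hand gesture is dispatched
        # to the currently selected UAV; ignored if none is selected.
        if self.selected_uav is None:
            return None
        return (self.selected_uav, modality, command)

mgr = InteractionManager()
print(mgr.on_command("voice", "explore"))   # no UAV selected yet → None
mgr.on_selection("uav_1")
print(mgr.on_command("gesture", "land"))    # → ('uav_1', 'gesture', 'land')
```

In a real system each modality would feed events asynchronously from its own recognizer; the dispatcher above only shows the selection-gating idea.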