Valiallah Monajjemi

We present a multimodal system for creating, modifying and commanding groups of robots from a population. Extending our previous work on selecting an individual robot from a population by face-engagement, we show that we can dynamically create groups of a desired number of robots by speaking the number we desire, e.g. “You three”, and looking at the robots…
Extending our previous work in real-time vision-based Human Robot Interaction (HRI) with multi-robot systems, we present the first example of creating, modifying and commanding teams of UAVs by an uninstrumented human. To create a team, the user focuses attention on an individual robot by simply looking at it, then adds or removes it from the current team…
We describe a system whereby multiple humans and mobile robots interact robustly using a combination of sensing and signalling modalities. Extending our previous work on selecting an individual robot from a population by face-engagement, we show that reaching toward a robot, a specialization of pointing, can be used to designate a particular robot for…
We present a multi-modal multi-robot interaction whereby a user can identify an individual or a group of robots using haptic stimuli, and name them using a voice command (e.g. "You two are green"). Subsequent commands can be addressed to the same robot(s) by name (e.g. "Green! Take off!"). We demonstrate this as part of a real-world integrated…
We present the first demonstration of establishing mutual attention between an outdoor UAV in autonomous normal flight and an uninstrumented human user. We use the familiar periodic waving gesture as a signal to attract the UAV's attention. The UAV can discriminate this gesture from human walking and running, which can appear similarly periodic. Once a signaling…
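The waving-versus-walking discrimination above hinges on where the motion's periodicity lives in frequency. As a hypothetical illustration only (not the vision pipeline used in the work), one could test whether the spectral power of a tracked point's horizontal displacement concentrates in a typical hand-waving band; the function name, band limits, and threshold below are all illustrative assumptions:

```python
import numpy as np

def is_waving(x, fps=30.0, f_lo=1.0, f_hi=3.0, power_ratio=0.5):
    """Heuristic periodicity test (illustrative, not the paper's method):
    does the horizontal displacement of a tracked point concentrate its
    spectral power in a typical waving band (roughly 1-3 Hz)?"""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()                        # remove DC offset
    # Detrend to suppress slow whole-body translation (e.g. walking
    # across the image), which would otherwise dominate low bins.
    t = np.arange(len(x))
    x = x - np.polyval(np.polyfit(t, x, 1), t)
    spec = np.abs(np.fft.rfft(x)) ** 2      # one-sided power spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    total = spec[1:].sum()                  # ignore residual DC bin
    if total == 0:
        return False
    return spec[band].sum() / total >= power_ratio
```

A 2 Hz oscillation (typical of waving) passes this test, while a much slower oscillation of the same form does not, which is the kind of separation the abstract describes.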
We present the first demonstration of end-to-end far-to-near situated interaction between an uninstrumented human user and an initially distant outdoor autonomous Unmanned Aerial Vehicle (UAV). The user uses an arm-waving gesture as a signal to attract the UAV's attention from a distance. Once this signal is detected, the UAV approaches the user using…
We present an integrated human-robot interaction system that enables a user to select and command a team of two Unmanned Aerial Vehicles (UAVs) using voice, touch, face engagement and hand gestures. This system integrates multiple human multi-robot interaction interfaces as well as a navigation and mapping algorithm in a coherent semi-realistic scenario.
We introduce Drums, a new tool for monitoring and debugging distributed robot systems, and a complement to robot middleware systems. Drums provides online time-series monitoring of the underlying resources that are partially abstracted away by middleware like the Robot Operating System (ROS). Interfacing with the middleware, Drums provides de-abstraction…
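The online time-series monitoring described above can be pictured as a periodic sampler feeding a bounded history per resource. A minimal stdlib sketch in that spirit, where the class name, constructor parameters, and API are illustrative assumptions rather than Drums' actual interface:

```python
import time
import threading
from collections import deque

class MetricMonitor:
    """Illustrative sketch of Drums-style online monitoring: periodically
    sample a metric callable (e.g. a node's CPU, memory, or socket usage)
    and keep a bounded rolling window of (timestamp, value) pairs.
    This is not the actual Drums API."""

    def __init__(self, sample_fn, period=1.0, window=600):
        self.sample_fn = sample_fn
        self.period = period
        self.samples = deque(maxlen=window)   # bounded time-series history
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def _run(self):
        while not self._stop.is_set():
            self.samples.append((time.time(), self.sample_fn()))
            self._stop.wait(self.period)      # interruptible sleep

    def start(self):
        self._thread.start()

    def stop(self):
        self._stop.set()
        self._thread.join()
```

In use, one such sampler per monitored process or socket yields the per-resource time series that a dashboard or debugger can then inspect.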