Motion Estimation and Hand Gesture Recognition-Based Human–UAV Interaction Approach in Real Time

@article{Yoo2022MotionEA,
  title={Motion Estimation and Hand Gesture Recognition-Based Human–UAV Interaction Approach in Real Time},
  author={Minjeong Yoo and Yuseung Na and Hamin Song and Gamin Kim and Junseong Yun and Sangho Kim and Changjoo Moon and Kichun Jo},
  journal={Sensors (Basel, Switzerland)},
  year={2022},
  volume={22},
  url={https://api.semanticscholar.org/CorpusID:247828227}
}
A hybrid hand gesture system that combines an inertial measurement unit (IMU)-based motion-capture system with a vision-based gesture system is proposed to improve real-time performance; it is shown to be a safer and more intuitive HUI design, with a 0.089 ms processing speed and an average lap time about 19 s shorter than with a joystick controller.
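The hybrid idea above can be sketched minimally as a confidence-gated fusion of the two channels. This is an illustrative assumption, not the paper's actual fusion logic: each subsystem is assumed to return a (label, confidence) pair.

```python
# Minimal sketch of a hybrid gesture decision, assuming each subsystem
# returns a (label, confidence) pair; names and threshold are illustrative.
def fuse_gesture(vision, imu, vision_threshold=0.8):
    """Prefer the vision-based result when it is confident,
    fall back to the IMU-based motion-capture result otherwise."""
    v_label, v_conf = vision
    i_label, i_conf = imu
    if v_conf >= vision_threshold:
        return v_label
    # When vision is uncertain (e.g. occlusion, poor lighting),
    # trust the wearable IMU channel.
    return i_label

fuse_gesture(("land", 0.92), ("hover", 0.70))  # vision wins
fuse_gesture(("land", 0.40), ("hover", 0.70))  # IMU fallback
```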

Vision-Based Gesture-Driven Drone Control in a Metaverse-Inspired 3D Simulation Environment

This work uses transfer-learning-based computer vision techniques to detect dynamic hand gestures in real time, outperforming state-of-the-art methods, and proposes a hybrid lightweight dynamic hand gesture recognition system together with a 3D-simulator-based drone-control environment for live simulation.

UAV Control with Vision-Based Hand Gesture Recognition over Edge-Computing

This research proposes a novel approach that leverages hand-landmark drawing and classification for gesture-based UAV control, together with an edge-computing framework that offloads the heavier computing tasks, thus achieving closed-loop real-time performance.
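A hand-landmark classifier of the kind described can be sketched as follows; this is a generic nearest-centroid illustration over normalized 2-D landmarks (the 21-point layout used by common hand-tracking toolkits such as MediaPipe), not the paper's actual pipeline, and the template names are invented.

```python
import math

def normalize(landmarks):
    """Translate so the wrist (landmark 0) is the origin and scale by the
    largest wrist distance, making features position- and size-invariant."""
    wx, wy = landmarks[0]
    shifted = [(x - wx, y - wy) for x, y in landmarks]
    scale = max(math.hypot(x, y) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]

def classify(landmarks, templates):
    """Nearest-centroid match against per-gesture landmark templates."""
    feat = normalize(landmarks)
    def dist(name):
        return sum(math.hypot(a - c, b - d)
                   for (a, b), (c, d) in zip(feat, normalize(templates[name])))
    return min(templates, key=dist)
```

On an edge-computing deployment, only the raw landmarks would be transmitted, keeping the per-frame payload small.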

Deep Learning-Based Human Body Posture Recognition and Tracking for Unmanned Aerial Vehicles

The empirical results show that the proposed drone surveillance system can effectively recognize the targeted human behaviors, is robust in the presence of uncertainty, and operates efficiently with high real-time performance.

An Extended Study of the Applications of Using Gesture Control to Pilot UAVs

A modeling approach for gesture-based control of UAVs that relies on physical movements or hand gestures eliminates the need for signal transmission susceptible to interception, so an additional layer of security can be established.

Hand-Guiding Gesture-Based Telemanipulation with the Gesture Mode Classification and State Estimation Using Wearable IMU Sensors

This study proposes a telemanipulation framework with two wearable IMU sensors that does not require human skeletal kinematics, and confirms that a subject could place the end effector (EEF) within an average of 83 mm and 2.56 degrees of the target pose using fewer than ten consecutive hand-guiding gestures and visual inspection on the first trial.
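State estimation from a wearable IMU typically involves fusing gyroscope and accelerometer readings into an orientation estimate. As a minimal sketch (not the paper's estimator), a single tilt angle can be tracked with a complementary filter; the gain and sample format here are illustrative assumptions.

```python
import math

# Hedged sketch: estimate one tilt angle (degrees) from IMU samples of the
# form (gyro_rate_deg_per_s, (accel_x, accel_z)); alpha is an assumed gain.
def complementary_filter(samples, dt=0.01, alpha=0.98):
    """Fuse gyro rate with accelerometer tilt: the gyro integral tracks
    fast motion, the accelerometer anchors the estimate against drift."""
    angle = 0.0
    for gyro_rate, (ax, az) in samples:
        accel_angle = math.degrees(math.atan2(ax, az))
        angle = alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
    return angle
```

With the sensor held still at a 90-degree tilt, the estimate converges toward 90 degrees at a rate set by `alpha`.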

A Real-Time Posture Detection Algorithm Based on Deep Learning

A real-time pose detection algorithm based on deep learning is proposed that can effectively track and detect single and multiple individuals in different indoor and outdoor environments and at different distances.

Intuitive Human-Robot Interface: A 3-Dimensional Action Recognition and UAV Collaboration Framework

A novel methodology is presented that classifies three-dimensional human actions and leverages them to coordinate on-field with a UAV; it includes mechanisms ensuring the robot perpetually keeps the human within its visual purview while adeptly tracking user movements.

Resource-Efficient GRU for Real-Time Gesture Recognition

This work introduces a dataset for gesture recognition tasks and develops a sequence-classification model optimized for resource constraints, demonstrating the feasibility of the approach through quantitative evaluation metrics.
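The core of such a resource-efficient model is the GRU cell itself. The sketch below shows a single-layer GRU forward pass in NumPy, with random stand-in weights and illustrative dimensions; it is not the paper's architecture.

```python
import numpy as np

# Minimal GRU forward pass for encoding a gesture feature sequence;
# weights are random stand-ins, dimensions are illustrative.
def gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh):
    z = 1 / (1 + np.exp(-(Wz @ x + Uz @ h)))   # update gate
    r = 1 / (1 + np.exp(-(Wr @ x + Ur @ h)))   # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h))   # candidate state
    return (1 - z) * h + z * h_tilde

def encode(seq, hidden=8, seed=0):
    """Run a (time, features) sequence through the GRU and return the
    final hidden state as a fixed-size embedding for classification."""
    rng = np.random.default_rng(seed)
    d = seq.shape[1]
    Wz, Wr, Wh = (0.1 * rng.standard_normal((hidden, d)) for _ in range(3))
    Uz, Ur, Uh = (0.1 * rng.standard_normal((hidden, hidden)) for _ in range(3))
    h = np.zeros(hidden)
    for x in seq:
        h = gru_step(x, h, Wz, Uz, Wr, Ur, Wh, Uh)
    return h
```

A small hidden size and a single recurrent layer are what make the approach attractive under tight memory and compute budgets.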

Transferable Convolutional Neural Networks for IMU-based Motion Gesture Recognition in Human-Machine Interaction

This study shows that the proposed CNN model, designed for efficient motion gesture recognition, maintains high performance with a lightweight structure, while also highlighting how a transfer-learning approach can address the challenges of data collection and set the stage for more intuitive, user-centric interaction systems.

Design and Evaluation of an Alternative Control for a Quad-Rotor Drone Using Hand-Gesture Recognition

The success of the developed HGR algorithm ensured that the proposed alternative-control system could facilitate intuitive, computationally inexpensive, and repeatable drone control without requiring specialised equipment.

Real-Time Human-UAV Interaction: New Dataset and Two Novel Gesture-Based Interacting Systems

Two novel gesture-based Human-UAV Interaction (HUI) systems are proposed to launch and control a UAV in real time using a monocular camera and a ground computer.

Real-Time Human Detection and Gesture Recognition for On-Board UAV Rescue

This work on real-time UAV human detection and recognition of body and hand rescue gestures, using body-featuring solutions to establish biometric communication, can achieve the expected UAV rescue purpose.

Recognition of arm gestures using multiple orientation sensors: gesture classification

A gesture recognition algorithm based on Euler angles acquired from multiple orientation sensors is presented, part of a system for controlling unmanned aerial vehicles (UAVs) in the presence of manned aircraft on an aircraft deck.
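Classifying arm gestures from Euler angles can be reduced, in the simplest case, to thresholding the angle components. The rules, thresholds, and command names below are invented for illustration and are not the paper's classifier.

```python
# Illustrative rule-based mapping from arm orientation (Euler angles in
# degrees) to deck-handling commands; thresholds and labels are assumed.
def classify_arm_gesture(roll, pitch, yaw):
    if pitch > 60:
        return "arms_up"      # arm raised overhead
    if pitch < -60:
        return "arms_down"    # arm pointed at the deck
    if abs(yaw) > 45:
        return "turn_left" if yaw > 0 else "turn_right"
    return "neutral"
```

A real system would smooth the angle streams over time before thresholding, so sensor jitter near a boundary does not flip the command.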

Human Control of UAVs using Face Pose Estimates and Hand Gestures

This approach enables human operators to command and control Parrot drones by giving them directions to move, using simple hand gestures, and provides UAVs/UGVs with a better perception of the environment around the human.

Hand gesture recognition with convolutional neural networks for the multimodal UAV control

This investigation presents hardware design, inertial recognition of arm movement, and the detailed structure of a convolutional neural network system used for real-time hand gesture recognition based on MMG signals, which achieved 94% accuracy for five gestures with simple calibration for each user.

3-D hand motion tracking and gesture recognition using a data glove

A rule-based algorithm is utilized to recognize simple hand gestures, namely scissors, rock, and paper, using the 3-D digital hand model and the KHU-1 data glove.
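A rule-based rock/paper/scissors classifier of this kind can be sketched from per-finger flexion angles derived from glove joints; the threshold and finger ordering (thumb, index, middle, ring, little) are assumptions for the sketch, not the paper's actual rules.

```python
# Sketch of a rule-based hand-shape classifier from data-glove readings:
# joint_flexion gives average flexion per finger in degrees,
# ordered (thumb, index, middle, ring, little). Threshold is illustrative.
def classify_rps(joint_flexion, bent_threshold=60.0):
    extended = [angle < bent_threshold for angle in joint_flexion]
    if not any(extended):
        return "rock"         # all fingers curled
    if all(extended):
        return "paper"        # open hand
    if extended[1] and extended[2] and not extended[0] and not any(extended[3:]):
        return "scissors"     # index + middle only
    return "unknown"
```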

Intelligent Human–UAV Interaction System with Joint Cross-Validation over Action–Gesture Recognition and Scene Understanding

We propose an intelligent human–unmanned aerial vehicle (UAV) interaction system in which, instead of using the conventional remote controller, UAV flight actions are controlled through action–gesture recognition.

Gesture Recognition for UAV-based Rescue Operation based on Deep Learning

An approach to accurately recognize various body gestures in the wild by using deep learning algorithms is presented in this work, which can not only recognize human rescue gestures but also detect, track, and count people.

An Intelligent Human–Unmanned Aerial Vehicle Interaction Approach in Real Time Based on Machine Learning Using Wearable Gloves

An intelligent real-time human–UAV interaction approach based on machine learning using wearable gloves is presented, offering scientific contributions such as a multi-mode command structure, machine-learning-based recognition, task-scheduling algorithms, real-time usage, robust and effective use, and high accuracy rates.

Hand gesture recognition based on accelerometer sensors

This paper presents a hand gesture recognition system built around an accelerometer sensor and OCR, which can be used in a more complex, human-friendly computer interface in which the commands are given by gestures.
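Accelerometer-based gesture recognition is often done by matching an incoming acceleration trace against stored gesture templates. As a generic illustration (not this paper's method), dynamic time warping handles the varying speed at which users perform the same gesture; the template data here is synthetic.

```python
# Hedged sketch: match a 1-D acceleration trace against stored gesture
# templates with dynamic time warping (DTW); traces here are synthetic.
def dtw(a, b):
    """Minimum cumulative alignment cost between two sequences."""
    inf = float("inf")
    n, m = len(a), len(b)
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def recognize(trace, templates):
    """Return the name of the template with the lowest DTW distance."""
    return min(templates, key=lambda name: dtw(trace, templates[name]))
```

DTW's tolerance to time-axis stretching is what makes template matching workable without per-user training data.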