Push the Limit of Acoustic Gesture Recognition

@article{Wang2020PushTL,
  title={Push the Limit of Acoustic Gesture Recognition},
  author={Yanwen Wang and Jiaxing Shen and Yuanqing Zheng},
  journal={IEEE INFOCOM 2020 - IEEE Conference on Computer Communications},
  year={2020},
  pages={566-575}
}
With the proliferation of smart devices and their applications, controlling devices with gestures has attracted increasing attention for ubiquitous sensing and interaction. Recent works use acoustic signals to track hand movement and recognize gestures. However, they suffer from low robustness due to frequency selective fading, interference, and insufficient training data. In this work, we propose RobuCIR, a robust contact-free gesture recognition system that can work under different usage…
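As a minimal illustration of the frequency selective fading the abstract identifies as a robustness issue, the sketch below (path lengths, reflection gain, and carrier frequencies are all assumed values, not taken from the paper) shows how a direct path and a single reflection can nearly cancel at some carrier frequencies while reinforcing at others.

```python
# Illustrative sketch with an assumed two-path geometry: the combined
# amplitude of an emitted tone at the microphone depends strongly on
# which carrier frequency is chosen.
import numpy as np

C = 343.0                          # speed of sound in air (m/s)
d_direct, d_reflect = 0.50, 0.62   # assumed path lengths (m)

for f in (17_000, 18_000, 19_000, 20_000):
    wavelength = C / f
    # Sum of a unit-amplitude direct path and a 0.8-amplitude reflection
    # with different propagation phases.
    amp = abs(np.exp(-2j * np.pi * d_direct / wavelength)
              + 0.8 * np.exp(-2j * np.pi * d_reflect / wavelength))
    print(f"{f} Hz -> combined amplitude {amp:.2f}")
```

With these assumed paths the combined amplitude swings roughly between 0.8 and 1.8 across the band, which is why a single fixed tone can be unreliable.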
Acoustic-based Upper Facial Action Recognition for Smart Eyewear
TLDR
This work proposes a novel acoustic-based upper facial action (UFA) recognition system that serves as a hands-free interaction mechanism for smart eyewear and designs a Convolutional Neural Network to extract high-level features from the time-frequency patterns and classify the features into six UFAs.
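To make the "CNN over time-frequency patterns" idea concrete, here is a small sketch of such a classifier; the input size, layer widths, and framework (PyTorch) are assumptions for illustration, not the authors' network.

```python
# Hedged sketch: a compact CNN mapping a 1 x 64 x 64 time-frequency
# patch (assumed input size) to six upper facial action classes.
import torch
import torch.nn as nn

class UFAClassifier(nn.Module):
    def __init__(self, num_classes: int = 6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x):                         # x: (batch, 1, 64, 64)
        return self.classifier(self.features(x).flatten(1))

logits = UFAClassifier()(torch.randn(8, 1, 64, 64))
print(logits.shape)                               # torch.Size([8, 6])
```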
Robust and Deployable Gesture Recognition for Smartwatches
TLDR
This work suggests that deployable and robust recognition is feasible but requires systematic efforts in data collection and network design to address known causes of gesture variability, and proposes convolution-based network variations for classifying raw sensor data.
LASense: Pushing the Limits of Fine-grained Activity Sensing Using Acoustic Signals
TLDR
A system called LASense is proposed, which can significantly increase the sensing range for fine-grained human activities using a single pair of speaker and microphone, via a virtual transceiver idea that purely leverages delicate signal processing techniques in software.
Fall Detection via Inaudible Acoustic Sensing
TLDR
A novel and lightweight fall detection system that relies solely on a home audio device and inaudible acoustic sensing to recognize fall occurrences for wide home deployment; it is robust to environment changes, i.e., transferable to other environments after training in one environment.
Hybrid Optimized GRU-ECNN Models for Gait Recognition with Wearable IOT Devices
With the advent of the Internet of Things (IoT), human-assistive technologies in healthcare services have reached the peak of their application in terms of diagnosis and treatment process. These…
Listen to Your Fingers
TLDR
A second-factor authentication method, TouchPrint, relies on the user's hand posture shape traits (dependent on the individual's posture type and unique hand geometry biometrics) when the user inputs a PIN or pattern; it is robust against the behavioral variability of inputting a passcode and places no restrictions on input manner.
Smartphone-based Handwritten Signature Verification using Acoustic Signals
TLDR
SonarSign is presented, an online handwritten signature verification system based on inaudible acoustic signals that achieves accurate and robust signature verification with an AUC of 98.02% and an EER of 5.79% for unseen users.
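For context on the reported metrics, this is a short sketch of how AUC and EER are conventionally computed from verification scores; the labels and scores below are made up for illustration.

```python
# Sketch: AUC from the ROC curve, EER as the point where the false
# accept rate equals the false reject rate. Data is synthetic.
import numpy as np
from sklearn.metrics import roc_curve, auc

labels = np.array([1, 1, 1, 0, 0, 0, 1, 0])   # 1 = genuine, 0 = forgery
scores = np.array([0.9, 0.8, 0.6, 0.4, 0.35, 0.2, 0.7, 0.55])

fpr, tpr, _ = roc_curve(labels, scores)
print("AUC:", auc(fpr, tpr))

fnr = 1 - tpr                                  # false reject rate
eer_idx = np.argmin(np.abs(fpr - fnr))
print("EER:", (fpr[eer_idx] + fnr[eer_idx]) / 2)
```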
Ubiquitous Acoustic Sensing on Commodity IoT Devices: A Survey
TLDR
This paper presents the first systematic survey of recent advances in active acoustic sensing using commodity hardware with a frequency range below 24 kHz and proposes a general framework that categorizes main building blocks of acoustic sensing systems.
Non-intrusive Continuous User Identification from Activity Acoustic Signatures
TLDR
This paper develops an automated machine learning-based framework, trained on basic acoustic features, that identifies users from the acoustic signatures generated by their activities, continuously distinguishing the individuals performing them.
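A hedged sketch of the general idea of identifying users from basic acoustic features follows; the feature choice (MFCC means) and classifier (random forest) are assumptions for illustration, not the paper's framework.

```python
# Sketch: per-clip MFCC summary features plus a standard classifier to
# identify who performed the recorded activity.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def clip_features(samples: np.ndarray, sr: int) -> np.ndarray:
    mfcc = librosa.feature.mfcc(y=samples, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)                  # one feature vector per clip

def train_identifier(clips, user_ids):
    # clips: list of (audio_array, sample_rate); user_ids: matching labels
    X = np.stack([clip_features(s, sr) for s, sr in clips])
    return RandomForestClassifier(n_estimators=200).fit(X, user_ids)
```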
RFID and camera fusion for recognition of human-object interactions
TLDR
An RF-Camera system that fuses RFID and computer vision techniques is proposed; it is the first work to recognize human gestural interactions with physical objects in multi-subject and multi-object scenarios.

References

Showing 1-10 of 52 references
Device-free gesture tracking using acoustic signals
TLDR
This paper proposes LLAP, a device-free gesture tracking scheme that can be deployed on existing mobile devices as software, without any hardware modification, and implements and evaluates LLAP using commercial off-the-shelf mobile phones.
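The general phase-based idea behind schemes like LLAP can be sketched as coherent I/Q demodulation of a continuous ultrasonic tone, mapping phase changes of the reflection to path-length changes; the sampling rate, tone frequency, and low-pass filter below are assumptions, not LLAP's actual parameters.

```python
# Minimal sketch of phase-based acoustic motion tracking under assumed
# parameters: mix the microphone signal down to baseband, low-pass
# filter, and convert unwrapped phase to path-length change.
import numpy as np
from scipy.signal import butter, filtfilt

FS, F0, C = 48_000, 20_000, 343.0     # sample rate, tone, speed of sound

def path_length_change(mic: np.ndarray) -> np.ndarray:
    t = np.arange(len(mic)) / FS
    i = mic * np.cos(2 * np.pi * F0 * t)          # in-phase mixing
    q = -mic * np.sin(2 * np.pi * F0 * t)         # quadrature mixing
    b, a = butter(4, 200 / (FS / 2))              # keep slow baseband motion
    i, q = filtfilt(b, a, i), filtfilt(b, a, q)
    phase = np.unwrap(np.angle(i + 1j * q))
    wavelength = C / F0
    # One full wavelength of extra propagation path shifts the phase by 2*pi.
    return -(phase - phase[0]) * wavelength / (2 * np.pi)

# Example: 0.2 s synthetic reflection whose path grows by 2 mm in total.
t = np.arange(int(0.2 * FS)) / FS
mic = np.cos(2 * np.pi * F0 * t - 2 * np.pi * (0.3 + 0.01 * t) / (C / F0))
print(path_length_change(mic)[-1])    # ~0.002 m of path change
```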
UltraGesture: Fine-Grained Gesture Sensing and Recognition
TLDR
This paper presents UltraGesture, a Channel Impulse Response (CIR) based ultrasonic finger motion perception and recognition system that runs on commercial speakers and microphones that already exist on most mobile devices without hardware modification.
Bringing Gesture Recognition to All Devices
TLDR
AllSee is introduced, the first gesture-recognition system that can operate on a range of computing devices including those with no batteries and achieves classification accuracies as high as 97% over a set of eight gestures.
WiGest: A ubiquitous WiFi-based gesture recognition system
TLDR
This work presents WiGest: a system that leverages changes in WiFi signal strength to sense in-air hand gestures around the user's mobile device, using standard WiFi equipment, with no modifications, and no training for gesture recognition.
AudioGest: enabling fine-grained hand gesture detection by decoding echo signal
TLDR
The results show that AudioGest can detect six hand gestures with an accuracy of up to 96%, and, by distinguishing gesture attributes, it can provide up to 162 control commands for various applications.
WiFinger: leveraging commodity WiFi for fine-grained finger gesture recognition
TLDR
This paper presents fine-grained finger gesture recognition using a single commodity WiFi device without requiring the user to wear any sensors, and proposes to capture the intrinsic gesture behavior to deal with individual diversity and gesture inconsistency.
SoundWave: using the doppler effect to sense gestures
TLDR
This work presents SoundWave, a technique that leverages the speaker and microphone already embedded in most commodity devices to sense in-air gestures around the device, and describes the phenomena and detection algorithm.
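The Doppler relation that such systems rely on is compact enough to sketch directly; the tone frequency and hand speed below are assumed values for illustration.

```python
# Back-of-envelope sketch: a hand moving at velocity v shifts a
# reflected tone of frequency F0 by roughly 2*v*F0/c (two-way path).
C = 343.0            # speed of sound in air (m/s)
F0 = 18_000.0        # assumed emitted pilot tone (Hz)

def doppler_shift(hand_velocity_mps: float) -> float:
    """Approximate two-way Doppler shift in Hz for a reflecting hand."""
    return 2.0 * hand_velocity_mps * F0 / C

print(doppler_shift(0.5))   # ~52 Hz shift for a 0.5 m/s hand movement
```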
Whole-home gesture recognition using wireless signals
TLDR
WiSee is presented, a novel gesture recognition system that leverages wireless signals (e.g., Wi-Fi) to enable whole-home sensing and recognition of human gestures and achieves this goal without requiring instrumentation of the human body with sensing devices.
VSkin: Sensing Touch Gestures on Surfaces of Mobile Devices Using Acoustic Signals
TLDR
VSkin is a system that supports fine-grained gesture sensing on the back of mobile devices based on acoustic signals; it utilizes both structure-borne sounds, i.e., sounds propagating through the structure of the device, and air-borne sounds to sense finger tapping and movements.
C-FMCW Based Contactless Respiration Detection Using Acoustic Signal
TLDR
This paper proposes a Correlation-based Frequency Modulated Continuous Wave (C-FMCW) method that achieves high ranging resolution and detects respiration in real environments with a median error lower than 0.35 breaths/min, outperforming the state of the art.
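As a hedged sketch of generic correlation-based chirp ranging (not the paper's C-FMCW algorithm), the example below cross-correlates the received audio with the transmitted chirp and converts the peak lag to a range; the sweep band, chirp duration, and simulated delay are assumptions.

```python
# Sketch: correlation ranging with a synthetic echo delayed by 140
# samples (~0.5 m one-way at 343 m/s and 48 kHz sampling).
import numpy as np
from scipy.signal import chirp, correlate

FS, C = 48_000, 343.0
t = np.arange(0, 0.01, 1 / FS)                    # 10 ms chirp
tx = chirp(t, f0=17_000, f1=20_000, t1=t[-1])     # assumed sweep band

rx = np.concatenate([np.zeros(140), tx]) + 0.01 * np.random.randn(len(tx) + 140)

corr = correlate(rx, tx, mode="valid")
lag = int(np.argmax(np.abs(corr)))                # round-trip delay in samples
print("delay:", lag / FS, "s -> range:", lag / FS * C / 2, "m")
```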