Touch & activate: adding interactivity to existing objects using active acoustic sensing

@inproceedings{Ono2013TouchA,
  title={Touch \& activate: adding interactivity to existing objects using active acoustic sensing},
  author={Makoto Ono and Buntarou Shizuki and Jiro Tanaka},
  booktitle={Proceedings of the 26th annual ACM symposium on User interface software and technology},
  year={2013}
}
  • M. Ono, B. Shizuki, J. Tanaka
  • Published 8 October 2013
  • Computer Science
  • Proceedings of the 26th annual ACM symposium on User interface software and technology
In this paper, we present a novel acoustic touch sensing technique called Touch & Activate. […] Key result: walk-up user recognition accuracies for the two applications were 97.8% and 71.2%, respectively. Since the results of our experiment showed promising accuracy for the recognition of touch gestures and hand postures, Touch & Activate should be feasible for prototyping interactive objects with touch input capability.
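
As a rough illustration of the active acoustic sensing pipeline summarized above (a piezo vibration speaker drives the object with a sweep signal, a contact microphone captures the object's frequency response, and a classifier maps that response to a touch gesture or hand posture), the minimal Python sketch below uses synthetic stand-in recordings, an FFT-based spectral feature, and a scikit-learn SVM. The sweep range, feature size, and gesture labels are illustrative assumptions, not parameters taken from the paper.

    import numpy as np
    from scipy.signal import chirp
    from sklearn.svm import SVC

    FS = 48_000                                      # sample rate in Hz (illustrative)
    t = np.linspace(0, 0.1, int(0.1 * FS), endpoint=False)
    SWEEP = chirp(t, f0=20_000, t1=0.1, f1=40_000)   # hypothetical ultrasonic sweep

    def frequency_response_features(recording, n_bins=128):
        """Pool the magnitude spectrum of a contact-mic recording into n_bins."""
        spectrum = np.abs(np.fft.rfft(recording))
        return np.array([b.mean() for b in np.array_split(spectrum, n_bins)])

    # Each recording would be the contact-mic signal captured while SWEEP plays
    # through the piezo speaker and the object is touched as in labels[i];
    # random arrays stand in for real captures here.
    rng = np.random.default_rng(0)
    recordings = [rng.normal(size=len(SWEEP)) for _ in range(40)]
    labels = ["pinch", "grasp", "tap", "no-touch"] * 10

    X = np.stack([frequency_response_features(r) for r in recordings])
    clf = SVC(kernel="linear").fit(X, labels)        # SVM over spectral features
    print(clf.predict(X[:1]))                        # predicted touch gesture / posture
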
VersaTouch: A Versatile Plug-and-Play System that Enables Touch Interactions on Everyday Passive Surfaces
We present VersaTouch, a portable, plug-and-play system that uses active acoustic sensing to track fine-grained touch locations as well as touch force of multiple fingers on everyday surfaces without …
VibEye: Vibration-Mediated Object Recognition for Tangible Interactive Applications
VibEye is particularly effective for recognizing objects made of different materials, which are difficult to distinguish by other means such as light and sound.
Identifying Contact Fingers on Touch Sensitive Surfaces by Ring-Based Vibratory Communication
A system is introduced that identifies contact fingers using vibration as a communication modality; with the aid of a touch screen, it achieves over 91% accuracy in identifying seven contact states from three fingers while the user wears only two actuator rings.
Ohmic-Touch: Extending Touch Interaction by Indirect Touch through Resistive Objects
Ohmic-Touch extends the touch input modality by sensing resistance: by exploiting the electrical resistance of an interposed object, a touch surface can sense the touch position on that object, identify individual objects, and sense light, force, or temperature by using resistors and sensors.
VibSense: Sensing Touches on Ubiquitous Surfaces through Vibration
VibSense pushes the limits of vibration-based sensing to determine the location of a touch on extended surface areas, as well as to identify the object touching the surface, using only a single sensor.
AudioTouch: Minimally Invasive Sensing of Micro-Gestures via Active Bio-Acoustic Sensing
We present AudioTouch, a minimally invasive approach for sensing micro-gestures using active bio-acoustic sensing. It only requires attaching two piezo-electric elements, acting as a surface mounted …
SmartGrip: grip sensing system for commodity mobile devices through sound signals
SmartGrip is a novel grip sensing system that allows a mobile device to detect different hand postures without any additional hardware or a screen touch event: it emits carefully designed sound signals and differentiates the propagated signals as they are distorted by different user grips.
Sensing Touch Force using Active Acoustic Sensing
A lightweight technique with which creators can prototype force-sensitive objects by attaching a pair of piezoelectric elements, one a vibration speaker and one a contact microphone; creators supply both the training data and the labels needed to train a machine-learning model that handles continuous-valued output, such as SVR.
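
A similar sketch for the continuous-valued case described above: the same kind of spectral feature vectors are paired with force labels supplied by the creator and fed to scikit-learn's SVR. The feature dimensionality and the gram-valued force labels are hypothetical stand-ins, not values from the paper.

    import numpy as np
    from sklearn.svm import SVR

    # X: spectral feature vectors captured from the piezo pair while pressing the
    # object; y: the force labels the creator supplies for training (hypothetical
    # values in grams). Random data stands in for real captures here.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(60, 128))
    y = rng.uniform(0.0, 500.0, size=60)

    reg = SVR(kernel="rbf", C=10.0).fit(X, y)     # SVR handles continuous output
    print(reg.predict(X[:3]))                     # estimated touch force
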
Touchable Wall: Easy-to-Install Touch-Operated Large-Screen Projection System
This study proposes a new touch-operated large-screen projection system using acoustic vibration sensing and a projector-camera system that can detect "true" touch without requiring any devices such as pens or pointers.
Sensing on ubiquitous surfaces via vibration signals: poster
Vibration-based sensing supports a broad array of applications through either passive or active sensing with only a single sensor, and can differentiate personal objects in contact with any surface.

References

Showing 1–10 of 61 references
Touché: enhancing touch interaction on humans, screens, liquids, and everyday objects
The rich capabilities of Touché are demonstrated with five example setups from different application domains and experimental studies that show gesture classification accuracies of 99% are achievable with the technology.
TapSense: enhancing finger interaction on touch surfaces
TapSense is an enhancement to touch interaction that allows conventional surfaces to identify the type of object being used for input by segmenting and classifying the sounds resulting from an object's impact; a comprehensive investigation of classification accuracy and training implications is included.
Grips and gestures on a multi-touch pen
This study shows that both swipe and double-tap gestures are comparable in performance to commonly employed barrel buttons without their disadvantages, and can be performed successfully and comfortably regardless of the rotation of the pen or how the user grips it.
Augmenting touch interaction through acoustic sensing
This work proposes to expand the expressiveness of touch interfaces by augmenting touch with acoustic sensing, and includes a vision in which users naturally express different actions by touching the surface with different body parts that are not always distinguishable by touch technologies but can be recognized by acoustic sensing.
HandSense: discriminating different ways of grasping and holding a tangible user interface
The HandSense prototype employs capacitive sensors for detecting when it is touched or held against a body part, and is able to correctly classify over 80 percent of all touches, discriminating six different ways of touching the device.
The sound of one hand: a wrist-mounted bio-acoustic fingertip gesture interface
It is discovered that gentle fingertip gestures such as tapping, rubbing, and flicking make quiet sounds that travel by bone conduction throughout the hand, and this phenomenon is harnessed to build a wristband-mounted bio-acoustic fingertip gesture interface.
GripSense: using built-in sensors to detect hand posture and pressure on commodity mobile phones
We introduce GripSense, a system that leverages mobile device touchscreens and their built-in inertial sensors and vibration motor to infer hand postures including one- or two-handed interaction, use …
Midas: fabricating custom capacitive touch sensors to prototype interactive objects
This work introduces Midas, a software and hardware toolkit to support the design, fabrication, and programming of flexible capacitive touch sensors for interactive objects, and demonstrates how Midas can be used to create a number of touch-sensitive interfaces.
FlexAura: a flexible near-surface range sensor
A new optical range sensor design based on high-power infrared LEDs and photo-transistors is presented; it can be fabricated on a flexible PCB and wrapped around a wide variety of graspable objects, including pens, mice, smartphones, and slates.
OmniTouch: wearable multitouch interaction everywhere
OmniTouch is a wearable depth-sensing and projection system that enables interactive multitouch applications on everyday surfaces; it is conceivable that anything one can do on today's mobile devices could be done in the palm of one's hand.