RetroDepth: 3D silhouette sensing for high-precision input on and above physical surfaces

@inproceedings{Kim2014RetroDepth3S,
  title={RetroDepth: 3D silhouette sensing for high-precision input on and above physical surfaces},
  author={David Kim and Shahram Izadi and Jakub Dostal and Christoph Rhemann and Cem Keskin and Christopher Zach and Jamie Shotton and Timothy A. Large and Steven Bathiche and Matthias Nie{\ss}ner and Alex Butler and S. Fanello and Vivek Pradeep},
  booktitle={Proceedings of the SIGCHI Conference on Human Factors in Computing Systems},
  year={2014}
}
  • David Kim, S. Izadi, V. Pradeep
  • Published 26 April 2014
  • Computer Science
  • Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
We present RetroDepth, a new vision-based system for accurately sensing the 3D silhouettes of hands, styluses, and other objects as they interact on and above physical surfaces. Our setup is simple, cheap, and easily reproducible, comprising two infrared cameras, diffuse infrared LEDs, and any off-the-shelf retro-reflective material. The retro-reflector aids image segmentation, creating a strong contrast between the surface and any object in proximity. A new highly efficient stereo matching…
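
The segmentation idea in the abstract is straightforward to illustrate: under diffuse IR illumination the retro-reflective material returns light directly to the cameras, so the background appears uniformly bright and any hand, stylus, or object occluding it appears as a dark silhouette. The sketch below shows that thresholding step and the extraction of a silhouette contour that a stereo matcher could operate on; the function names, the fixed threshold, and the 4-neighbour boundary test are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np

def silhouette_mask(ir_frame: np.ndarray, threshold: int = 128) -> np.ndarray:
    """Segment objects occluding a retro-reflective background.

    Background pixels are bright because the retro-reflector returns the
    diffuse IR illumination to the camera; occluding objects (hands,
    styluses) are dark. `threshold` is a hypothetical value that would be
    calibrated per setup.
    """
    return ir_frame < threshold  # True where an object blocks the reflector


def silhouette_boundary(mask: np.ndarray) -> np.ndarray:
    """Mark pixels that lie on the silhouette contour.

    A pixel is on the boundary if it is inside the mask but at least one of
    its 4-neighbours is not; a stereo matcher could then operate on these
    contour pixels (the matching itself is omitted here).
    """
    padded = np.pad(mask, 1, mode="constant", constant_values=False)
    interior = (
        padded[:-2, 1:-1] & padded[2:, 1:-1] &   # up and down neighbours
        padded[1:-1, :-2] & padded[1:-1, 2:]     # left and right neighbours
    )
    return mask & ~interior
```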

Citations

Augmenting Indirect Multi-Touch Interaction with 3D Hand Contours and Skeletons
TLDR
This work in progress uses depth-sensing cameras to track the user's hands above the surface and to recognize the point of interaction with a plain horizontal surface at a predefined height, providing 3D visualizations of the hands and fingers so the user can continuously see their positions before an interaction occurs.
Reconstructing Hand Poses Using Visible Light
TLDR
Aili, a table lamp that reconstructs a 3D hand skeleton in real time while requiring neither cameras nor on-body sensing devices, is presented, and various interaction applications that Aili enables are demonstrated.
Integrating optical finger motion tracking with surface touch events
TLDR
This paper highlights a method of fusing the data from two contrasting sensor systems for studying human interaction with a mechanical system, using piano performance as the case study, including temporal and spatial alignment, segmentation into notes and automatic fingering annotation.
SymmetriSense: Enabling Near-Surface Interactivity on Glossy Surfaces using a Single Commodity Smartphone
TLDR
SymmetriSense addresses the localization challenges of using a single regular camera with a novel technique that exploits the principle of reflection symmetry and the fingertip's natural reflection cast upon glossy surfaces such as mirrors, granite countertops, or televisions.
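
A small geometric sketch of the reflection-symmetry idea described above: the fingertip and its mirror image are roughly symmetric about the glossy surface, so their midpoint approximates the contact location and their separation shrinks to zero at touch. The function name, calibration factor, and touch threshold below are hypothetical and not taken from the SymmetriSense paper.

```python
from typing import Tuple

def reflection_hover_estimate(
    finger_tip: Tuple[float, float],
    reflection_tip: Tuple[float, float],
    mm_per_pixel: float = 0.5,
    touch_gap_px: float = 2.0,
) -> Tuple[Tuple[float, float], float, bool]:
    """Estimate contact point, hover height, and touch state from a fingertip
    and its mirror reflection on a glossy surface.

    The two points are roughly symmetric about the surface, so their midpoint
    approximates the eventual contact location, and half the gap between them
    (scaled by a calibration factor) approximates the hover height. Both
    `mm_per_pixel` and `touch_gap_px` are hypothetical calibration values.
    """
    fx, fy = finger_tip
    rx, ry = reflection_tip
    contact = ((fx + rx) / 2.0, (fy + ry) / 2.0)
    gap_px = ((fx - rx) ** 2 + (fy - ry) ** 2) ** 0.5
    height_mm = 0.5 * gap_px * mm_per_pixel
    return contact, height_mm, gap_px <= touch_gap_px
```

With these hypothetical defaults, a 16-pixel gap between fingertip and reflection maps to a 4 mm hover height and no touch event.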
A low-cost transparent electric field sensor for 3d interaction on mobile devices
TLDR
A thin, transparent, and low-cost design for electric field sensing is presented, allowing 3D finger and hand tracking and gestures on mobile devices, together with a machine learning algorithm for mapping signal measurements at the receivers to 3D positions.
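
The mapping step named in this entry (receiver signal measurements to 3D positions) can be illustrated with a simple calibration-time regression. The summary does not specify the paper's actual model, so the least-squares fit below is only a hedged stand-in; the function names and array shapes are assumptions.

```python
import numpy as np

def fit_position_regressor(signals: np.ndarray, positions: np.ndarray) -> np.ndarray:
    """Fit a linear map from receiver signal measurements to 3D finger positions.

    `signals` is (n_samples, n_receivers) and `positions` is (n_samples, 3),
    collected during a calibration phase. A plain least-squares fit with a bias
    term stands in for the paper's (unspecified) machine learning model.
    """
    X = np.hstack([signals, np.ones((signals.shape[0], 1))])  # add bias column
    W, *_ = np.linalg.lstsq(X, positions, rcond=None)
    return W  # shape: (n_receivers + 1, 3)

def predict_position(W: np.ndarray, signal: np.ndarray) -> np.ndarray:
    """Map one receiver measurement vector to an (x, y, z) estimate."""
    return np.append(signal, 1.0) @ W
```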
Tailored Controls: Creating Personalized Tangible User Interfaces from Paper
TLDR
This work presents a method that allows users to quickly and inexpensively create personalized interfaces from plain paper via a simple configuration interface, based on markerless tracking of the user's fingers and the paper shapes on a surface using an RGBD camera mounted above the interaction space.
Touch Detection System for Various Surfaces Using Shadow of Finger
TLDR
A new touch detection technique that utilizes the shadows of a finger is presented, along with a prototype system using an infrared (IR) camera and two IR lights that improves the accuracy of the estimated touch position.
Touch detection method for non-display surface using multiple shadows of finger
TLDR
A touch detection method that utilizes the shadows of a finger is proposed for use with a system featuring an infrared (IR) camera and two IR lights, enabling users to interact with surrounding surfaces through touch.
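
The two shadow-based entries above rely on the same geometric observation: with two offset IR lights, a hovering finger casts two separated shadows whose tips converge toward the fingertip as it approaches the surface and meet at the contact point on touch. A minimal sketch of that convergence test follows; the tip-finding heuristic and pixel threshold are assumptions for illustration, not details from the cited papers.

```python
import numpy as np

def shadow_tip(shadow_mask: np.ndarray) -> np.ndarray:
    """Return the (row, col) of the shadow pixel nearest the top of the frame.

    Assumes the finger enters from the top of the image, so the shadow tip is
    the shadow pixel with the smallest row index (an illustrative heuristic).
    """
    rows, cols = np.nonzero(shadow_mask)
    i = int(np.argmin(rows))
    return np.array([rows[i], cols[i]], dtype=float)


def is_touch(shadow_a: np.ndarray, shadow_b: np.ndarray, max_gap_px: float = 5.0) -> bool:
    """Declare a touch when the tips of the two shadows (one per IR light) meet.

    A hovering finger casts two clearly separated shadows; at contact both
    shadow tips converge on the fingertip. `max_gap_px` is a hypothetical
    threshold that would need per-setup calibration.
    """
    gap = np.linalg.norm(shadow_tip(shadow_a) - shadow_tip(shadow_b))
    return bool(gap <= max_gap_px)


def touch_point(shadow_a: np.ndarray, shadow_b: np.ndarray) -> np.ndarray:
    """Estimate the touch position as the midpoint of the two shadow tips."""
    return (shadow_tip(shadow_a) + shadow_tip(shadow_b)) / 2.0
```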
Back-Pointer — Fitts' law analysis of natural mobile camera based interactions
TLDR
Back-Pointer presents an alternative input mechanism for mobile devices in which the back-facing camera captures the motion of the user's index finger as it is moved; the analysis provides specifications for a macro camera dedicated to back-of-device camera-based interactions.
Touch Detection Method for Non-Display Surfaces Using Shadow of Finger
TLDR
A new touch detection technique that utilizes the shadows of a finger is proposed, and a system with an infrared (IR) camera and two IR lights that can detect touch is developed.

References

SHOWING 1-10 OF 48 REFERENCES
HoloDesk: direct 3d interactions with a situated see-through display
TLDR
A new technique for interpreting raw Kinect data is introduced to approximate and track rigid and non-rigid physical objects and support a variety of physics-inspired interactions between virtual and real objects.
C-Slate: A Multi-Touch and Object Recognition System for Remote Collaboration using Horizontal Surfaces
TLDR
C-Slate is introduced, a new vision-based system, which utilizes stereo cameras above a commercially available tablet technology to support remote collaboration, and provides a new way to collaborate remotely, complementing existing channels such as audio and video conferencing.
BiDi screen: a thin, depth-sensing LCD for 3D interaction using light fields
TLDR
This work transforms an LCD into a display that supports both 2D multi-touch and unencumbered 3D gestures, and exploits the spatial light modulation capability of LCDs to allow lensless imaging without interfering with display functionality.
Z-touch: an infrastructure for 3d gesture interaction in the proximity of tabletop surfaces
TLDR
Z-touch, a multi-touch table that can sense the approximate postures of fingers or hands in proximity to the tabletop's surface, is introduced, along with its applications (e.g., drawing, a map zooming viewer, and Bézier curve control).
High Precision Multi-touch Sensing on Surfaces using Overhead Cameras
TLDR
A novel computer vision algorithm that can robustly identify finger tips and detect touch with a precision of a few millimetres above the surface is described, which relies on machine learning methods and a geometric finger model to achieve the required precision.
Visual touchpad: a two-handed gestural input device
TLDR
By segmenting the hand regions from the video images and then augmenting them transparently into a graphical interface, the Visual Touchpad provides a compelling direct manipulation experience without the need for more expensive tabletop displays or touch-screens, and with significantly less self-occlusion.
Depth-Sensing Video Cameras for 3D Tangible Tabletop Interaction
  • Andrew D. Wilson
  • Computer Science
    Second Annual IEEE International Workshop on Horizontal Interactive Human-Computer Systems (TABLETOP'07)
  • 2007
TLDR
An interactive tabletop system that uses a depth-sensing camera to build a height map of the objects on the table surface; the height map is used in a driving simulation game that allows players to drive a virtual car over real objects placed on the table.
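
A height map of the kind described in this entry can be sketched with simple background subtraction: with the depth camera looking down at the table, objects on the surface are closer to the camera than the empty tabletop, so subtracting the live depth frame from a reference frame of the empty table yields per-pixel heights. The reference-frame scheme and the noise floor below are assumptions for illustration, not necessarily the cited system's method.

```python
import numpy as np

def table_height_map(depth_frame: np.ndarray, empty_table_depth: np.ndarray,
                     noise_floor_mm: float = 5.0) -> np.ndarray:
    """Convert a downward-looking depth image into per-pixel heights above the table.

    `empty_table_depth` is a reference depth frame captured once with nothing
    on the table. Objects are closer to the overhead camera than the tabletop,
    so height = reference depth - current depth. `noise_floor_mm` suppresses
    sensor noise; both the reference frame and the threshold are assumptions
    for this sketch.
    """
    heights = empty_table_depth.astype(np.float32) - depth_frame.astype(np.float32)
    heights[heights < noise_floor_mm] = 0.0
    return heights
```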
Interactions in the air: adding further depth to interactive tabletops
TLDR
The goal is to design a technique that closely resembles the way we manipulate physical objects in the real world: conceptually, allowing virtual objects to be 'picked up' off the tabletop surface in order to manipulate their three-dimensional position or orientation.
MirageTable: freehand interaction on a projected augmented reality tabletop
Instrumented with a single depth camera, a stereoscopic projector, and a curved screen, MirageTable is an interactive system designed to merge real and virtual worlds into a single spatially registered experience on top of a table.
Detecting interaction above digital tabletops using a single depth camera
TLDR
This paper proposes an approach to unobtrusively segment and detect interaction above a digital surface using a depth-sensing camera, together with a novel algorithm to merge segments and a comparison to the original segmentation algorithm.