Corpus ID: 14123471

Infrared vs. Ultrasonic Finger Detection on a Virtual Piano Keyboard

@inproceedings{Pra2014InfraredVU,
  title={Infrared vs. Ultrasonic Finger Detection on a Virtual Piano Keyboard},
  author={Yuri De Pra and Federico Fontana and Linmi Tao},
  booktitle={ICMC},
  year={2014}
}
An immaterial digital keyboard is presented, aiming at testing possibilities to substitute physical piano keys with augmented ones during performance. Multimodal feedback has been realized by filming the hands’ action with the rear camera of a consumer tablet PC and projecting this action on its screen; this projection has been layered over the image of a piano keyboard reacting to the hands’ action. Especially in connection with a Leap Motion system in charge of doing the…


Digital Beethoven — An Android-based virtual piano

TLDR
An Android application that allows users to play a virtual piano drawn on a piece of paper by pointing the camera of a hand-held device towards the keyboard and processing the image of the paper keyboard in real time.

A virtual reality keyboard with realistic haptic feedback in a fully immersive virtual environment

TLDR
The results show that the haptic VR keyboard system can be used to create complex vibrations that simulate measured vibrations from a real keyboard and enhance keyboard interaction in a fully immersive VR environment.

Augmented songbook: an augmented reality educational application for raising music awareness

TLDR
An Augmented Reality mobile application which aims at introducing young children to abstract concepts of music, together with an evaluation of alternatives to SIFT local descriptors in terms of result quality and computational efficiency, both for document model identification and perspective transform estimation.

An Interactive Augmented Reality Furniture Customization System

TLDR
An augmented reality furniture customization system is presented, which gives users the ability to view and change the materials and dimensions of three-dimensional virtual furniture within the context of real environments.

Parallel digital signal processing for efficient piano synthesis

TLDR
The resulting model faithfully reproduces the acoustic piano’s physical behaviour and can also be used as an engine for novel instruments that need to provide advanced multimodal output on a low-cost embedded platform.

Specialized CNT-based Sensor Framework for Advanced Motion Tracking

TLDR
The framework presented in this paper encompasses the continuing research and development of more advanced CNT-based sensors and the implementation of novel high-fidelity motion tracking products based on them.

A Survey of Augmented Piano Prototypes: Has Augmentation Improved Learning Experiences?

TLDR
Recommendations on augmenting piano systems towards enriching the piano learning experience as well as on possible directions to expand knowledge in the area are presented.

References

Showing 1–10 of 20 references

Key detection for a virtual piano teacher

  • Adam Goodwin, R. Green
  • Computer Science
    2013 28th International Conference on Image and Vision Computing New Zealand (IVCNZ 2013)
  • 2013
TLDR
Using the proposed method, the keys of a piano keyboard were successfully identified from webcam video footage, with tolerance to camera movement and occluded keys, demonstrating the potential of the piano teacher program as a learning tool.

Piano AR: A Markerless Augmented Reality Based Piano Teaching System

This paper presents a novel markerless augmented reality based piano teaching system, which naturally tracks the real keyboard of a piano. Following the virtual hands augmented on the keyboard,…

Feedback is... late: measuring multimodal delays in mobile device touchscreen interaction

TLDR
An easy-to-implement multimodal latency measurement tool for touchscreen interaction is introduced that uses off-the-shelf components and free software and can accurately measure latencies between different interaction events in different modalities.

Näprä: affordable fingertip tracking with ultrasound

TLDR
The motivation for building the device was the need to track users' fingertips in an immersive free-hand drawing environment, and out of the numerous tracking methods ultrasound was chosen because of its affordability and low computational requirements.

Visual tracking of bare fingers for interactive surfaces

TLDR
The design is based on the modeling of two classes of algorithms that are key to the tracker, Image Differencing Segmentation (IDS) and Fast Rejection Filters (FRF), together with a new chromatic distance for IDS and an FRF that is invariant to finger rotation.

A Preliminary Evaluation of the Leap Motion Sensor as Controller of New Digital Musical Instruments

The introduction of new gesture interfaces has been expanding the possibilities for creating new Digital Musical Instruments (DMIs). The Leap Motion Controller was recently launched, promising fine-grained…

Finger Tracking as an Input Device for Augmented Reality

TLDR
This paper describes experiments in the use of cross-correlation as a means of tracking pointing devices for a digital desk using a reference template and a method to detect when to initiate tracking as well as when tracking has failed.

Latency Tolerance for Gesture Controlled Continuous Sound Instrument without Tactile Feedback

TLDR
The results suggest that younger subjects detect latencies more accurately than older subjects, and that the subjects’ activity with music and musical background did not seem to have an effect.

Noncontact Tactile Display Based on Radiation Pressure of Airborne Ultrasound

TLDR
A tactile display is described which provides unrestricted tactile feedback in air without any mechanical contact; it controls ultrasound to produce a stress field in 3D space based on a nonlinear phenomenon of ultrasound, acoustic radiation pressure.

UltraHaptics: multi-point mid-air haptic feedback for touch surfaces

TLDR
This work investigates the desirable properties of an acoustically transparent display and demonstrates that the system is capable of creating multiple localised points of feedback in mid-air, and shows that feedback points with different tactile properties can be identified at smaller separations.