During the past decade, multi-touch surfaces have emerged as valuable tools for collaboration, display, interaction, and musical expression. Unfortunately, they tend to be costly and often suffer from two drawbacks for music performance: (1) relatively high latency owing to their sensing mechanism, and (2) lack of haptic feedback. We analyze the latency …
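The excerpt cuts off before describing how the latency analysis is carried out. As a rough illustration only, the sketch below shows one common way touch-to-sound latency is summarized: pairing touch timestamps with the corresponding audio-onset timestamps and reporting simple statistics. The paired timestamps and all values are hypothetical, not the paper's data or method.

```python
# Rough sketch of one way touch-to-sound latency is often estimated.
# Assumption: paired event timestamps are already available; the paper's
# actual measurement procedure is not described in this excerpt.
import statistics

def summarize_latency(touch_times_ms, audio_onset_times_ms):
    """Pair each touch with its audio onset and report latency statistics."""
    latencies = [a - t for t, a in zip(touch_times_ms, audio_onset_times_ms)]
    return {
        "mean_ms": statistics.mean(latencies),
        "stdev_ms": statistics.stdev(latencies),
        "max_ms": max(latencies),
    }

# Hypothetical example: touches registered, sounds heard ~30-40 ms later.
print(summarize_latency([0.0, 500.0, 1000.0], [32.1, 538.4, 1029.7]))
```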
The FrankenPipe project is an attempt to convert a traditional Highland Bagpipe into a controller capable of driving both real-time synthesis on a laptop and a radio-controlled (RC) car. Doing so engages musical creativity while enabling novel, often humorous, performance art. The chanter is outfitted with photoresistors (CdS photoconductive cells) …
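The excerpt stops before explaining how the CdS readings are interpreted. As a minimal sketch, assuming a covered hole reads darker (lower) than an open one and that readings arrive as a list of analog values, the snippet below thresholds eight sensor values into a fingering pattern and looks up a note. The threshold, fingering chart, and MIDI mapping are all hypothetical, not the project's actual calibration.

```python
# Minimal sketch of hole detection on a photoresistor-equipped chanter.
# Assumptions (not from the paper): darker reading => covered hole,
# a fixed threshold suffices, and a simple fingering-to-note table.
COVER_THRESHOLD = 300            # hypothetical ADC value separating covered/open
FINGERING_TO_NOTE = {            # hypothetical partial fingering chart (MIDI notes)
    (1, 1, 1, 1, 1, 1, 1, 1): 67,   # all holes covered
    (1, 1, 1, 1, 1, 1, 1, 0): 69,   # bottom hole open
    (1, 1, 1, 1, 1, 1, 0, 0): 71,   # bottom two holes open
}

def read_fingering(adc_values):
    """Convert eight CdS readings into a covered(1)/open(0) pattern."""
    return tuple(1 if v < COVER_THRESHOLD else 0 for v in adc_values)

def fingering_to_midi(adc_values):
    return FINGERING_TO_NOTE.get(read_fingering(adc_values))

# Example: every hole dark except the bottom one, which is open.
print(fingering_to_midi([120, 90, 150, 110, 100, 130, 95, 800]))  # -> 69
```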
A system is presented for detecting common gestures, musical intentions, and emotions of pianists in real time using kinesthetic data retrieved by wireless motion sensors. The algorithm can detect six performer-intended emotions, such as cheerful, mournful, and vigorous, based solely on low-sample-rate motion sensor data. The algorithm can be …
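The excerpt does not describe the detection algorithm itself, so the sketch below is only a generic stand-in for this kind of pipeline, not the authors' method: window the low-rate accelerometer stream, extract simple kinesthetic features, and label each window with the nearest pre-trained class centroid. The window size, features, emotion subset, and centroids are all assumptions.

```python
# Generic sketch of labeling performer emotion from low-rate motion data.
# NOT the paper's algorithm (which this excerpt does not describe); it only
# illustrates the overall shape: window, extract features, nearest centroid.
import numpy as np

EMOTIONS = ["cheerful", "mournful", "vigorous"]  # three of the six mentioned

def window_features(accel, win=16):
    """Mean movement magnitude and 'jerkiness' per window of 3-axis samples."""
    feats = []
    for i in range(0, len(accel) - win + 1, win):
        w = accel[i:i + win]
        mag = np.linalg.norm(w, axis=1)
        feats.append([mag.mean(), np.abs(np.diff(mag)).mean()])
    return np.array(feats)

def classify(feats, centroids):
    """centroids: shape (n_emotions, n_features), assumed trained elsewhere."""
    dists = np.linalg.norm(feats[:, None, :] - centroids[None, :, :], axis=2)
    return [EMOTIONS[k] for k in dists.argmin(axis=1)]

# Fake data just to show the call pattern.
rng = np.random.default_rng(0)
fake_accel = rng.normal(size=(64, 3))                 # ~4 windows of fake motion
fake_centroids = rng.normal(size=(len(EMOTIONS), 2))  # stand-in "trained" centroids
print(classify(window_features(fake_accel), fake_centroids))
```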
Motivated by previous work aimed at developing mathematical models to describe expressive timing in music, and specifically the final ritardandi, using measured kinematic data, we further investigate the linkage of locomotion and timing in music. The natural running behavior of four subjects is measured with a wearable sensor prototype and analyzed to …
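For readers unfamiliar with the prior work alluded to here (runner-deceleration models of the final ritardando), the sketch below shows the general form of that family of kinematic tempo curves: normalized tempo as a power-law function of normalized score position. The parameter values are illustrative, not results from this paper.

```python
# Sketch of the kind of kinematic tempo model referenced as prior work
# (runner-deceleration models of final ritardandi). Parameters are
# illustrative, not taken from this paper.
import numpy as np

def ritardando_curve(x, w=0.5, q=3.0):
    """Normalized tempo v(x) for normalized score position x in [0, 1].

    w: final tempo as a fraction of the initial tempo.
    q: curvature; q = 2 corresponds to constant braking force,
       q = 3 to constant braking power (the runner-like case).
    """
    return (1.0 + (w**q - 1.0) * np.asarray(x)) ** (1.0 / q)

x = np.linspace(0.0, 1.0, 5)
print(ritardando_curve(x))   # tempo falling smoothly from 1.0 toward w
```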
Few formal methods exist for evaluating digital musical instruments (DMIs). We propose a novel method of DMI evaluation using crowd-sourced tagging. Tagging is already used to classify websites and musical genres, which, like DMIs, do not lend themselves to simple categorization or parameterization. Using the social tagging method, participating individuals …
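The excerpt cuts off before the tagging procedure is described, so the sketch below is only an illustrative way of working with crowd-sourced tags, not the paper's analysis: aggregate each DMI's tags into count vectors and compare instruments by cosine similarity. The tags and instrument names are made up.

```python
# Illustrative sketch (not the paper's procedure): aggregate crowd-sourced
# tags per DMI into count vectors and compare DMIs by cosine similarity.
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical tag data for two instruments.
tags = {
    "DMI-A": Counter(["expressive", "latency", "gestural", "expressive"]),
    "DMI-B": Counter(["gestural", "expressive", "tactile"]),
}
print(cosine(tags["DMI-A"], tags["DMI-B"]))
```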