Nicholas Edward Gillian

This paper presents Soli, a new, robust, high-resolution, low-power, miniature gesture sensing technology for human-computer interaction based on millimeter-wave radar. We describe a new approach to developing a radar-based sensor optimized for human-computer interaction, building the sensor architecture from the ground up with the inclusion of radar…
This paper presents a novel algorithm that has been specifically designed for the recognition of multivariate temporal musical gestures. The algorithm is based on Dynamic Time Warping and has been extended to classify any N-dimensional signal, automatically compute a classification threshold to reject any data that is not a valid gesture and be quickly…
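The core technique, multivariate DTW classification with automatic null rejection, can be sketched as follows. This is a minimal illustration of the general idea, not the published implementation; the path-length normalisation and the per-class threshold heuristic here are assumptions.

```cpp
// Minimal sketch: multivariate DTW with a null-rejection threshold.
// Illustrative only; not the paper's code.
#include <vector>
#include <cmath>
#include <algorithm>
#include <limits>

using TimeSeries = std::vector<std::vector<double>>; // [time][dimension]

// Euclidean distance between two N-dimensional frames.
static double frameDist(const std::vector<double>& a, const std::vector<double>& b) {
    double sum = 0.0;
    for (size_t d = 0; d < a.size(); ++d) {
        const double diff = a[d] - b[d];
        sum += diff * diff;
    }
    return std::sqrt(sum);
}

// Classic O(M*N) DTW cost between two multivariate time series.
double dtwDistance(const TimeSeries& x, const TimeSeries& y) {
    const double INF = std::numeric_limits<double>::infinity();
    if (x.empty() || y.empty()) return INF;
    const size_t M = x.size(), N = y.size();
    std::vector<std::vector<double>> D(M + 1, std::vector<double>(N + 1, INF));
    D[0][0] = 0.0;
    for (size_t i = 1; i <= M; ++i)
        for (size_t j = 1; j <= N; ++j)
            D[i][j] = frameDist(x[i - 1], y[j - 1]) +
                      std::min({ D[i - 1][j], D[i][j - 1], D[i - 1][j - 1] });
    return D[M][N] / double(M + N); // normalise by a warping-path length bound
}

// Pick the nearest template; reject the input as "not a gesture" if its
// distance exceeds that class's threshold (e.g. learned from training data).
bool classify(const TimeSeries& input,
              const std::vector<TimeSeries>& templates,
              const std::vector<double>& thresholds,
              int& predictedClass) {
    double best = std::numeric_limits<double>::infinity();
    predictedClass = -1;
    for (size_t c = 0; c < templates.size(); ++c) {
        const double d = dtwDistance(input, templates[c]);
        if (d < best) { best = d; predictedClass = int(c); }
    }
    return predictedClass >= 0 && best <= thresholds[predictedClass]; // null rejection
}
```

The rejection threshold is what lets the recogniser ignore arbitrary movement between gestures, which matters in a live musical performance where most input is not a command.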
WristQue combines environmental and inertial sensing with precise indoor localization into a wristband wearable device that serves as the user's personal control interface to networked infrastructure. WristQue enables users to take control of devices around them by pointing to select and gesturing to control. At the same time, it uniquely identifies and…
We present a multimodal on-surface and near-surface sensing technique for planar, curved and flexible surfaces. Our technique leverages temporal multiplexing of signals coming from a universal interdigitated electrode design, which is printed as a single conductive layer on a flexible substrate. It supports sensing of touch and proximity input, and moreover…
The Gesture Recognition Toolkit is a cross-platform open-source C++ library designed to make real-time machine learning and gesture recognition more accessible for non-specialists. Emphasis is placed on ease of use, with a consistent, minimalist design that promotes accessibility while supporting flexibility and customization for advanced users. The toolkit…
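As a rough usage sketch, the toolkit's documented pipeline pattern looks like the following. Class and method names follow the GRT documentation but may differ between versions, and "TrainingData.grt" is a hypothetical file name.

```cpp
// Sketch of the GRT pipeline pattern, after the toolkit's documentation.
#include <GRT/GRT.h>
#include <cstdlib>
using namespace GRT;

int main() {
    // Load a labelled dataset (file name is hypothetical).
    ClassificationData trainingData;
    if (!trainingData.load("TrainingData.grt")) return EXIT_FAILURE;

    // A pipeline chains preprocessing, feature extraction and a classifier;
    // here we set the toolkit's adaptive naive Bayes classifier (ANBC).
    GestureRecognitionPipeline pipeline;
    pipeline.setClassifier(ANBC());

    if (!pipeline.train(trainingData)) return EXIT_FAILURE;

    // Real-time use: feed each new sample through the trained pipeline.
    VectorFloat sample(trainingData.getNumDimensions());
    if (pipeline.predict(sample)) {
        UINT predictedLabel = pipeline.getPredictedClassLabel();
        (void)predictedLabel; // e.g. trigger the mapped interaction here
    }
    return EXIT_SUCCESS;
}
```

The same pipeline object accepts any of the toolkit's classifiers, which is what keeps the API consistent for non-specialists while letting advanced users swap components.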
This paper presents the SARC EyesWeb Catalog (SEC), a machine learning toolbox that has been specifically developed for musician-computer interaction. The SEC features a large number of machine learning algorithms that can be used in real time to recognise static postures, perform regression and classify multivariate temporal gestures. The algorithms…
This paper presents Digito, a gesturally controlled virtual musical instrument. Digito is controlled through a number of intricate hand gestures, providing both discrete and continuous control of Digito’s sound engine. The fine-grained hand gestures are captured by a 3D depth sensor and recognized using computer vision and machine learning algorithms. We…
This paper presents “Scratch-Off”, a new musical multiplayer DJ game that has been designed for a mobile phone. We describe how the game is used as a test platform for experimenting with various types of multimodal feedback. The game uses movement gestures made by the players to scratch a record and control crossfades between tracks, with the objective of…
Gestures Everywhere is a dynamic framework for multimodal sensor fusion, pervasive analytics and gesture recognition. Our framework aggregates the real-time data from approximately 100 sensors that include RFID readers, depth cameras and RGB cameras distributed across 30 interactive displays that are located in key public areas of the MIT Media Lab.
We present a wearable system that uses ambient electromagnetic interference (EMI) as a signature to identify electronic devices and support proxemic interaction. We designed a low cost tool, called EMI Spy, and a software environment for rapid deployment and evaluation of ambient EMI-based interactive infrastructure. EMI Spy captures electromagnetic…
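A minimal sketch of the general EMI-fingerprinting idea: reduce a window of sampled EMI to a magnitude spectrum and match it against stored device signatures. All names here are hypothetical, and the spectrum-plus-cosine-similarity matcher is an assumption, not EMI Spy's implementation.

```cpp
// Illustrative EMI fingerprinting: spectral signature + nearest match.
#include <vector>
#include <cmath>
#include <string>

// Naive DFT magnitude spectrum of one window of EMI samples.
std::vector<double> magnitudeSpectrum(const std::vector<double>& window) {
    const double PI = std::acos(-1.0);
    const size_t N = window.size();
    std::vector<double> mags(N / 2);
    for (size_t k = 0; k < N / 2; ++k) {
        double re = 0.0, im = 0.0;
        for (size_t n = 0; n < N; ++n) {
            const double angle = 2.0 * PI * k * n / N;
            re += window[n] * std::cos(angle);
            im -= window[n] * std::sin(angle);
        }
        mags[k] = std::sqrt(re * re + im * im);
    }
    return mags;
}

// Cosine similarity between a live spectrum and a stored signature.
double cosineSimilarity(const std::vector<double>& a, const std::vector<double>& b) {
    double dot = 0.0, na = 0.0, nb = 0.0;
    for (size_t i = 0; i < a.size(); ++i) {
        dot += a[i] * b[i];
        na  += a[i] * a[i];
        nb  += b[i] * b[i];
    }
    return dot / (std::sqrt(na) * std::sqrt(nb) + 1e-12);
}

struct DeviceSignature { std::string name; std::vector<double> spectrum; };

// Return the best-matching device, or "" if nothing is similar enough.
std::string identifyDevice(const std::vector<double>& window,
                           const std::vector<DeviceSignature>& signatures,
                           double minSimilarity = 0.9) {
    const std::vector<double> live = magnitudeSpectrum(window);
    std::string best;
    double bestSim = minSimilarity;
    for (const auto& sig : signatures) {
        const double s = cosineSimilarity(live, sig.spectrum);
        if (s > bestSim) { bestSim = s; best = sig.name; }
    }
    return best;
}
```

The similarity floor plays the same role as the rejection threshold in gesture recognition: ambient EMI that matches no known device is reported as unrecognised rather than forced into the nearest class.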