• Publications
Human model evaluation in interactive supervised learning
TLDR
This work studies the evaluation practices of end users interactively building supervised learning systems for real-world gesture analysis problems, observing that users employed evaluation techniques not only to make relevant judgments of algorithms' performance and to interactively improve the trained models, but also to learn to provide more effective training data.
A Meta-Instrument for Interactive, On-the-Fly Machine Learning
TLDR
This work proposes a method for harnessing machine learning algorithms within a radically interactive paradigm, in which the designer may repeatedly generate examples, train a learner, evaluate outcomes, and modify parameters in real-time within a single software environment.
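The on-the-fly workflow summarized above (repeatedly generate examples, train a learner, evaluate, and refine within one environment) can be sketched roughly as follows. This is a minimal illustration using scikit-learn rather than the paper's own system; the feature vectors and gesture labels are invented for the example.

# Minimal sketch of an interactive, on-the-fly training loop (illustrative only).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

examples, labels = [], []          # training set built up incrementally by the designer

def add_example(features, label):
    """The designer demonstrates one input/output pair."""
    examples.append(features)
    labels.append(label)

def train():
    """Retrain from scratch on the current examples; fast enough for small sets."""
    model = KNeighborsClassifier(n_neighbors=1)
    model.fit(np.array(examples), np.array(labels))
    return model

# Iterative loop: demonstrate, train, evaluate, refine.
add_example([0.1, 0.2], "gesture_a")
add_example([0.9, 0.8], "gesture_b")
model = train()
print(model.predict([[0.15, 0.25]]))   # evaluate by trying new inputs
add_example([0.5, 0.5], "gesture_a")   # add more examples where the model fails
model = train()                        # and retrain immediately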
Towards a dimension space for musical devices
TLDR
One possible application of the dimension space to existing performance and interaction systems is illustrated, revealing its usefulness both in exposing patterns across existing musical devices and in aiding the design of new ones.
Real-time human interaction with supervised learning algorithms for music composition and performance
TLDR
This thesis presents a general-purpose software system for applying standard supervised learning algorithms in music and other real-time problem domains, called the Wekinator, which supports human interaction throughout the entire supervised learning process, including the generation of training examples and the application of trained models to real-time inputs.
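As a rough illustration of applying a trained model to real-time inputs in the way this summary describes, the sketch below trains a small regressor on demonstrated input/parameter pairs and runs it over a simulated stream of feature frames. It uses scikit-learn rather than the Wekinator's own implementation, and the feature dimensions, parameter values, and on_input_frame helper are hypothetical.

# Minimal sketch of mapping streaming features to control parameters (illustrative only).
import numpy as np
from sklearn.neural_network import MLPRegressor

# Demonstrated examples: controller/gesture features -> synthesis parameters.
X_train = np.array([[0.0, 0.1], [0.5, 0.4], [1.0, 0.9]])
y_train = np.array([[0.0, 0.2], [0.5, 0.5], [1.0, 0.8]])

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X_train, y_train)

def on_input_frame(features):
    """Called for each incoming feature frame; returns continuous control parameters."""
    return model.predict([features])[0]

# Simulated real-time stream of feature frames.
for frame in [[0.2, 0.2], [0.7, 0.6]]:
    params = on_input_frame(frame)
    print("control parameters:", params)   # would drive a synthesizer or other output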
Don't forget the laptop: using native input capabilities for expressive musical control
TLDR
It is argued that instruments designed using these built-in inputs offer benefits over custom standalone controllers, particularly in certain group performance settings, and a new toolkit for rapidly experimenting with these capabilities is described.
Human-Centred Machine Learning
TLDR
A human-centred understanding of machine learning in human contexts can lead not only to more usable machine learning tools, but also to new ways of framing learning computationally.
Using Interactive Machine Learning to Support Interface Development Through Workshops with Disabled People
TLDR
This work has led to a better understanding of challenges in end-user training of learning models, of how people develop personalised interaction strategies with different types of pre-trained interfaces, and of how properties of control spaces and input devices influence people's customisation strategies and engagement with instruments.
ACE: A Framework for Optimizing Music Classification
TLDR
This work discusses ways in which existing general-purpose classification software can be adapted to meet the needs of music researchers, and shows how these ideas have been implemented in ACE.
Grab-and-Play Mapping: Creative Machine Learning Approaches for Musical Inclusion and Exploration
We present the first implementation of a new tool for prototyping digital musical instruments, which allows a user to literally grab a controller and turn it into a new, playable musical instrument.
Mixed-Initiative Creative Interfaces
TLDR
This workshop convenes CHI and game researchers to advance mixed-initiative approaches to creativity support.
...
...