This paper reviews models of the ways in which performer instrumental actions can be linked to sound synthesis parameters. We analyse the available literature on both acoustical instrument simulation and the mapping of input devices to sound synthesis in general human-computer interaction. We further demonstrate why a more complex mapping strategy is required to …
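To make the distinction concrete: a one-to-one mapping assigns each control input to a single synthesis parameter, whereas a more complex (many-to-many) mapping combines several inputs into each parameter, closer to how acoustic instruments behave. The sketch below is a minimal illustration in Python; the input names, parameter names, and weights are invented for the example and are not taken from the paper.

    # Minimal sketch of one-to-one vs. many-to-many mapping between
    # performer inputs and synthesis parameters. All names and weights
    # here are illustrative assumptions, not the paper's actual model.

    # Normalised performer inputs in the range 0.0..1.0.
    inputs = {"breath": 0.8, "lip_pressure": 0.4, "finger_position": 0.6}

    # One-to-one: each input drives exactly one synthesis parameter.
    one_to_one = {
        "amplitude": inputs["breath"],
        "pitch_bend": inputs["lip_pressure"],
        "filter_cutoff": inputs["finger_position"],
    }

    # Many-to-many: each synthesis parameter is a weighted combination of
    # several inputs (e.g. breath affects both loudness and brightness).
    weights = {
        "amplitude":  {"breath": 0.9, "lip_pressure": 0.1},
        "brightness": {"breath": 0.5, "lip_pressure": 0.3, "finger_position": 0.2},
        "pitch_bend": {"lip_pressure": 0.7, "finger_position": 0.3},
    }

    many_to_many = {
        param: sum(w * inputs[name] for name, w in contribs.items())
        for param, contribs in weights.items()
    }

    print(one_to_one)
    print(many_to_many)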
This paper describes recent work which challenges the predominance of the WIMP (Windows, Icons, Menus, Pointers) computer interface for use in real-time situations. The results of the work have implications for the design of user interfaces for real-time control tasks (of which musical performance and experimentation are clear examples). This paper describes …
… interfaces (and the supporting technological infrastructure) to create audiovisual instruments for use in music therapy. In considering how the multidimensional nature of sound requires multidimensional input control, we propose a model to help designers manage the complex mapping between input devices and multiple media software. We also itemize a research …
The paper describes the use of the MIDIGRID and MIDICREATOR systems with a range of transducers and interface devices in music therapy. This opens up new possibilities, as well as new challenges, in the way in which such technology may be used in therapy. The paper includes a discussion of the role of music therapy, together with some case histories …
MidiGrid is a computer-based musical instrument, primarily controlled with the computer mouse, which allows live performance of MIDI-based musical material by mapping 2-dimensional position onto musical events. Since its invention in 1987, it has gained a small, but enthusiastic, band of users, and has become the primary instrument for several people with …
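The core idea of mapping a 2-dimensional pointer position onto musical events can be sketched as below. The grid size, note assignments, and raw note-on message construction are assumptions chosen for illustration and are not taken from MidiGrid itself.

    # Illustrative sketch (not MidiGrid's actual implementation): divide the
    # screen into a grid of cells, each holding a MIDI note, and trigger the
    # note for whichever cell the mouse position falls in.

    GRID_COLS, GRID_ROWS = 4, 4      # assumed grid size
    SCREEN_W, SCREEN_H = 800, 600    # assumed window size in pixels

    # Assign an ascending run of MIDI note numbers (from middle C, 60) to the cells.
    notes = [[60 + row * GRID_COLS + col for col in range(GRID_COLS)]
             for row in range(GRID_ROWS)]

    def cell_for_position(x, y):
        """Map a mouse position in pixels to a (row, col) grid cell."""
        col = min(int(x / SCREEN_W * GRID_COLS), GRID_COLS - 1)
        row = min(int(y / SCREEN_H * GRID_ROWS), GRID_ROWS - 1)
        return row, col

    def note_on_bytes(note, velocity=100, channel=0):
        """Raw MIDI note-on message: status byte 0x90 | channel, note, velocity."""
        return bytes([0x90 | channel, note, velocity])

    # Example: a mouse click at (310, 455) selects a cell and produces a note-on.
    row, col = cell_for_position(310, 455)
    print("cell:", (row, col), "MIDI message:", note_on_bytes(notes[row][col]).hex())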